Unbridled Learning’s Writing Program Reviews still inflated
One of the major credibility issues with Kentucky’s Unbridled Learning school accountability system has been the obvious inflation in the Program Reviews. Program Reviews are self-scored by the staff in each school, an arrangement that practically invites score inflation.
So far, that seems to be exactly what is happening, especially in the case of the Writing Program Review.
Most of the Program Reviews are in non-tested areas like Arts and Humanities where there are no practical test scores for comparisons. However, in the case of writing, we can compare the Writing Program Review classifications each school receives to the percentages of students in each school that score at or above Proficient on the state’s writing on demand tests.
I should note that this discussion is particularly pertinent because the Kentucky Department of Education is currently working to keep Program Reviews in the new state accountability system that will replace Unbridled Learning this year. At present, Kentucky law requires three of the five Program Reviews, so the department would have to ask for legislative relief to drop them.
Unfortunately, given the obvious conflict with basic human nature, I just don’t see Program Reviews ever working out well. Self-awarded scores are always going to be subject to inflation, and that could corrupt the new assessment and accountability system even before the first report comes out next year.
Now would be a good time to act, because the new Unbridled Learning results show that major problems with at least the Writing Program Review continued this year. Click the “Read more” link to see some disturbing examples of Writing Program Review scores that just don’t pass muster.
I just made a comparison for the 676 Kentucky elementary schools that had both a 2015-16 Program Review score and a writing on demand score in the “Data Sets” section of the Kentucky School Report Cards website. This table shows the schools with the very highest and very lowest proficiency rates on the writing on demand student assessment, along with the classification that teachers self-awarded to each school’s writing program.
Interestingly, most of the schools at the very top for actual writing on demand proficiency didn’t award themselves the maximum score for their Writing Program Review. Only one of the ten, the Johnson Elementary School, gave itself the top mark of “Distinguished,” though in this case it seems richly deserved.
The real problems show up at the other end of the table. This is where the schools with the very lowest student writing on demand proficiency rates are found. The glaringly worst case is the Silver Grove School, which self-awarded a Writing Program Review classification of “Proficient” even though none of its students scored proficient or more for writing on demand. That just isn’t credible.
Silver Grove is far from alone in this Program Review inflation, either. The majority of the bottom 10 schools for actual writing on demand proficiency rates – seven of them, shown with pink backgrounds – also self-awarded Writing Program Review scores of “Proficient.”
Just above the bottom 10 in our full Excel spreadsheet, the Cordia School in Knott County awarded itself a “Distinguished” score for its Writing Program Review even though only 6.7 percent of its students scored proficient or above in writing on demand in 2015-16.

Overall, a total of 45 schools self-awarded the top classification of “Distinguished” for their Writing Program Review. Sadly, almost half of those schools, 21 of the 45, had writing on demand proficiency rates below 50 percent. In eight of the 45, fewer than one in three students was proficient in writing on demand. And literally hundreds of elementary schools, 361 of them, self-awarded “Proficient” scores for their Writing Program Review even though their students’ writing proficiency rates were also below 50 percent.
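For readers who want to check these kinds of counts against the Report Cards data themselves, the comparison boils down to a simple filter-and-count. Here is a minimal sketch of that logic in Python. The school names and numbers below are hypothetical placeholders for illustration only, not rows from the actual “Data Sets” download, and the field layout is an assumption.

```python
# Each tuple: (school name, self-awarded Writing Program Review
# classification, percent of students scoring Proficient or above
# on the writing on demand test).
# These rows are hypothetical examples, NOT real Report Cards data.
schools = [
    ("School A", "Distinguished", 72.4),
    ("School B", "Distinguished", 41.0),
    ("School C", "Proficient", 33.5),
    ("School D", "Proficient", 0.0),
]

# Schools that awarded themselves the top classification...
distinguished = [s for s in schools if s[1] == "Distinguished"]

# ...but whose actual writing proficiency rate is below 50 percent.
inflated = [s for s in distinguished if s[2] < 50.0]

print(len(distinguished), len(inflated))
```

With the real spreadsheet loaded in place of the sample rows, the same two list comprehensions reproduce the counts discussed above (45 “Distinguished” schools, 21 of them below 50 percent proficiency).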