Unbridled Learning’s ‘Program Reviews’ need reining in
Thanks to a flood of way-out-of-line, clearly inflated scores for the new Program Review element in the 2014 Kentucky Unbridled Learning school accountability program, the accountability program’s overall scores are coming into question.
From school superintendents to private citizens, I am seeing and hearing compelling evidence that the Program Reviews have a major credibility problem.
So, in honor of the football season, I am throwing a flag on the Program Reviews. In fact, the Program Review scores look so outrageous that I think it is probably time to eject them from the school accountability game altogether.
Program reviews were created when the Kentucky Legislature finally realized that the old KIRIS/CATS Writing Portfolio program was never going to work as an assessment. The portfolio program was audited many times, but the inevitable, human-nature problem of teachers awarding inflated scores to their own students – scores that would later be used for school accountability – was never solved.
Legislators also figured out that using tests to evaluate the education of students on things like the arts, the humanities and whether or not students were getting good programs to prepare them for practical living and careers was not working, either.
So, instead of directly grading student performances in things like Writing Portfolios and Arts and Humanities, the idea that went into Unbridled Learning was to create an assessment element where reviews of each school’s program quality in those areas would be conducted.
It was a noble, but impractical, concept.
A key problem is that the cost of hiring outside auditors to conduct Program Reviews is prohibitive. So, it was decided that each school would grade its own programs, and those grades would then be added into the state’s Unbridled Learning rating for the school. It was a wishful plan, but it didn’t mesh with the reality of human nature. In fact, well-established evidence from the old KIRIS/CATS Writing Portfolio program showed the scores would become inflated in ways no affordable amount of auditing could cure.
I started sounding alarms about the Unbridled Learning Program Review methodology a long time ago in “Board of Education adopts fuzzier weighting for new state assessment program.” That blog item came out way back in June 2011, when the Kentucky Board of Education decided on a plan to have teachers score their own programs and to give Program Reviews significant weight in the state’s Unbridled Learning school accountability program. I noted two problems.
First, the importance of math, reading, writing, science and social studies would be reduced in the accountability system once Program Reviews were added. Very poor performance in one or more of those key academic subjects could be hidden if teachers self-awarded high Program Review scores.
Second, the plan for staff in each school to do their own Program Review scoring obviously opened the door for the Program Review scores to become hopelessly inflated.
But, wishful thinking continued in Frankfort. Last year, after the first Program Review scores were publicly released, I wrote an additional series of blogs, starting on November 1, 2013 with “Program review: Will this approach do?” Those 2013 blogs provided evidence that, even in their trial year, the Program Reviews already showed disturbing signs of inflation.
Those initial Program Review results didn’t actually count in Unbridled Learning 2013, however. They were only used as baseline information to develop a scoring scale for the 2014 accountability program. The true seriousness of the problem didn’t become evident until the 2014 Unbridled Learning results arrived. This year, Program Reviews accounted for almost a quarter of the entire scoring system, and the results look really questionable.
Furthermore, I am far from the only person who thinks Program Reviews have serious problems. On the day the Unbridled Learning results became public, October 3, 2014, Fayette County Schools superintendent Tom Shelton openly complained to the Lexington Herald-Leader that his schools suffered unfairly in the 2014 accountability scoring, implying his schools played fair with the Program Reviews while others did not. Said Shelton:
“…across the state, 64 percent of schools gave themselves perfect markings in program reviews. In Fayette County, only 25 percent of schools rated themselves high enough to receive the full 23 points.”
A few days later, on October 7, 2014, the Kentucky Board of Education began a discussion about possible changes needed to make Unbridled Learning a better program. One of the concerns raised is discussed in the board’s Agenda Item X attachment. (Note: To see the full listing of those Agenda Item X Unbridled Learning concerns, click here and then, under Item X, click on “Accountability Changes Attachment.”) Specifically, the Board’s “Accountability Changes Attachment” says in Item Number 20:
“Program Reviews* (Arts/Humanities, PL/CS, Writing, K-3, World Language) - Due to the subjective nature of the program review ratings, the program reviews should have reduced weight in the accountability model.”
Of special note, a survey of Kentucky’s school superintendents included in the Board’s “Accountability Changes Attachment” shows nearly half (48%) agree that Program Reviews are a concern. Superintendent Shelton has lots of company.
The Kentucky Board of Education was so concerned about this situation that they directed the Kentucky Department of Education to do a special analysis of how schools would perform if the Program Reviews were not included. That analysis is to be presented at the Board’s December 2014 meeting.
Now, this already is a lot of disturbing evidence to chew on, but a call from BIPPS reader Jeff Weghorst added more. Jeff had seen Shelton’s comment that 64 percent of Kentucky’s schools self-awarded top points for their program reviews. Curious, he examined an Unbridled Learning 2014 Excel spreadsheet, “ACCOUNTABILITY_SUMMARY,” from the Kentucky School Report Card 2013-2014 data sets, which lists each school’s Program Review score. He said Shelton was right on target.
I rechecked Jeff’s findings. Sure enough, among the 1,280 separate Kentucky school units that got Unbridled Learning scores in 2014, 820 – 64 percent – did indeed claim a perfect score for their Program Reviews.
Even more disturbing, an astonishing 1,064 schools – 83 percent – self-awarded at least 90 percent of the maximum 23 points that could be awarded for Program Reviews. In a sense, that amounts to 83 percent of Kentucky’s schools awarding themselves Program Review scores on a par with having their students score 90 percent or higher proficiency in a key academic subject like math or reading (high performances virtually no school in the state is currently posting).
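The percentages above follow directly from the spreadsheet counts. For readers who want to double-check the arithmetic, a few lines of Python will do it (the counts are the ones reported in this article; no spreadsheet parsing is shown here):

```python
# Sanity-checking the percentages quoted above, using the 2014 counts
# reported in this article.
total_schools = 1280    # Kentucky school units that got 2014 scores
perfect_scores = 820    # schools claiming all 23 Program Review points
near_perfect = 1064     # schools at 90 percent or more of the 23 points

near_perfect_cutoff = 0.9 * 23   # = 20.7 of the 23 possible points

print(round(100 * perfect_scores / total_schools))  # 64 (percent)
print(round(100 * near_perfect / total_schools))    # 83 (percent)
```

Both figures round to exactly the 64 percent and 83 percent cited above.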
I finally looked at one more piece of evidence. I compared the 2014 Writing Portfolio Program Review scores for each school that had full data to the proficiency rates that school achieved on the KPREP On-Demand Writing Tests. To put it mildly, I found very little correspondence between the Writing Program Review scores and the actual writing performance of students in Kentucky’s schools.
Note: For the technical types, here are the correlation results:
These are low correlations, indicating that little relationship exists between actual student writing performance and the obviously inflated scoring from the Writing Portfolio Program Reviews.
To sum up, there is very compelling evidence – not at all unexpected – that the Program Review scores must be viewed with considerable doubt. Furthermore, with Program Reviews counting for 23 percent of each school’s total Unbridled Learning score, the entire 2014 Unbridled Learning school accountability scoring must be treated with great caution.
So, the penalty flag is down. The best outcome for this would be for the education coaches in Frankfort to realize the obvious: Program Reviews are a wonderful sounding idea, but they simply are not practical for a school accountability program. If Kentucky’s education coaches don’t face up to this obvious problem, the entire Unbridled Learning game could lose its fans very quickly.