Program review: Will this approach do?

Yesterday the Kentucky Department of Education released results from the first round of Program Reviews in “Arts and Humanities,” “Practical Living & Career Studies” and “Writing.” These reviews are going to be used in place of tests in these areas in the new Unbridled Learning school accountability program.

But there seems to be a problem. Just like the writing portfolios in our old CATS program, schools grade their own program reviews. That obviously opens the door, once again, to the score inflation that was a constant headache with the CATS writing portfolios: every time those portfolio scores were audited, a significant number of schools turned out to have inflated their scores.

I ran an analysis comparing the high school Program Review for Writing results to the combined percentage of students who scored “Proficient” or “Distinguished” on the 2012-2013 KPREP “Language Mechanics” test. While the process of writing certainly includes more than language mechanics, I would think a program that has low scores for the mechanics of writing would not be worthy of a high Writing Program Review score.

Sadly, just as I expected from our history with CATS, there are problems.

The table below shows the data for standard (A1) Kentucky high schools with the highest and lowest KPREP Language Mechanics scores in 2013. The Program Review data come from the Open House link found in the new press release about the program reviews.

The KPREP Language Mechanics data come from the “Level” KPREP Excel spreadsheet found in the Kentucky School Report Card’s “Data Sets.”
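For readers who want to reproduce the comparison, the sketch below shows one way it might be done in Python with pandas. It is only an illustration: the file names and column headers are placeholders I made up, not the actual layouts of the KPREP “Level” spreadsheet or the Open House data, so they would need to be adjusted to match the real files.

```python
# Rough sketch of the comparison; file and column names are hypothetical placeholders.
import pandas as pd

# KPREP "Level" spreadsheet: one row per school, with percentages by performance level.
kprep = pd.read_excel("kprep_level_2013.xlsx")
kprep["Pct_P_plus_D"] = kprep["Pct_Proficient"] + kprep["Pct_Distinguished"]

# Writing Program Review scores (0-to-12 scale) from the Open House data.
reviews = pd.read_excel("writing_program_review_2013.xlsx")

# Keep standard (A1) high schools and join the two data sets by school name.
merged = kprep[kprep["SchoolType"] == "A1"].merge(reviews, on="SchoolName")

# Rank schools by combined Proficient + Distinguished percentage on Language Mechanics.
ranked = merged.sort_values("Pct_P_plus_D", ascending=False)
print(ranked.head(5)[["SchoolName", "Pct_P_plus_D", "WritingPRScore"]])  # highest five
print(ranked.tail(5)[["SchoolName", "Pct_P_plus_D", "WritingPRScore"]])  # lowest five
```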

To make it a little easier to compare the results, I converted the original zero to 12-point scores from the Program Reviews into a zero to 100-point scale.
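The conversion itself is just a rescaling. Assuming a simple linear mapping (the original post does not spell out the formula), it amounts to something like this:

```python
def rescale_0_12_to_0_100(score: float) -> float:
    """Convert a 0-to-12 Program Review score to a 0-to-100 scale (linear rescaling)."""
    return round(score / 12.0 * 100.0, 1)

# Example: a raw Program Review score of 9 out of 12 becomes 75.0 on the 100-point scale.
print(rescale_0_12_to_0_100(9))   # 75.0
```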

Table: Writing Program Review vs. KPREP 2013

There are some disturbing surprises in this data.

Note that Beechwood High School’s staff graded itself very rigorously on its writing program review. As a result, the school received a classification of “Needs Improvement” for its writing program. Nevertheless, Beechwood’s students achieved the highest proficiency rate in the state (86.0 percent) on the KPREP Language Mechanics test.

Right behind Beechwood come other top Kentucky high schools, including the selective Dupont Manual and the Brown School in Louisville. Staff in those schools also graded their own writing programs low enough to earn a “Needs Improvement” flag.

In fact, only one of the top five schools even scored “Proficient” on the Writing Program Review. None of these top schools got a “Distinguished” score.

By the way, while I didn’t set up a separate spreadsheet to look at this in detail, most of the schools listed among the top five for KPREP Language Mechanics in the table above also scored at the very top for Grade 11 ACT English Benchmark performance in 2013 (find the ACT data in the “ACT” section of the Kentucky School Report Card’s “Data Sets”).

The lone exception, Paintsville High, still came in well above average, ranking 42nd out of the 231 high schools with reported ACT English results.

Now drop down the table to the five lowest-performing schools on the KPREP Language Mechanics tests.

Staff members in these schools obviously think very highly of their writing programs. Three of the five scored themselves “Proficient” or even “Distinguished.” The fly in the ointment is that these same schools posted the very lowest scores on the KPREP Language Mechanics assessment. Their Writing Program Review results just don’t hold up.

And the schools shown at the bottom of the table also landed at the bottom on the ACT English tests in 2013.

So, even my brief examination shows there is good reason for concern about very uneven scoring in the new Writing Program Reviews. Strangely, we can even see huge differences within a single school district, Jefferson County.

My initial take is that, just as with the old CATS writing portfolios, teachers in some Kentucky schools will ALWAYS inflate their scores on anything used for assessment. It’s human nature. So, if the program reviews are going to work properly, the state must come up with a way to have independent and impartial reviewers conduct them. Otherwise, this program will inflate overall Unbridled Learning accountability scores in too many schools.

Some serious closing thoughts: the press release says the five planned program reviews will eventually count for 23 percent of the overall school score in Unbridled Learning. That heavy weight makes the potential for serious score inflation in this area far too important to ignore. Also, the Department of Education plans to use the results from this first round of program reviews to set baseline scoring for next year. I would suggest it do some serious auditing of the results first.