No, most Kentucky KPREP scores are not that close to NAEP scores
The Kentucky Department of Education sent out News Release 15-051 yesterday, which contains some inaccurate information about how the state’s KPREP test compares to the National Assessment of Educational Progress (NAEP). While the release claims “Kentucky Among Handful of States with Reliable Test Scores,” that might be stretching things.
To begin, the term “reliable” has a specific meaning when we are talking about tests. This gets technical, but you can read about the formal meaning of test reliability here if you want. In any event – unless I missed something – the news release does not seem to refer to any formal determination of test reliability for the Kentucky Performance Rating for Educational Progress (KPREP) tests. In fact, I don’t know whether any formal reliability studies have been completed for KPREP. Perhaps the media staff at the department just made an unfortunate choice of terms, but the news release needs a correction.
There are more problems, because several claims in this release are math-challenged.
For example, someone got the school years wrong. The news release talks about a comparison of NAEP and KPREP scores for the 2013-14 school term. That’s incorrect. The most recently available state NAEP math and reading results are from the mid-to-late winter administration in early 2013 – during the 2012-13 school year. If someone compared 2013-14 KPREP scores to 2013 NAEP scores, that was an apples-to-oranges comparison, because different students would have taken the two tests. Furthermore, such a cross-year comparison is unnecessary, because we have 2013 KPREP results for the same Kentucky fourth and eighth grade students who took the NAEP in 2013.
There is another, more involved math error, as well. It has to do with this quote from Kentucky Commissioner of Education Terry Holliday in the news release:
“The report verifies the increased rigor of our assessments; statistically, we are well within NAEP standard error of measurement.”
Let’s explore the problem with the commissioner’s statement.
The NAEP is a sampled assessment, so its scores carry plus-and-minus errors, just like the sampled election polls we hear about all the time. However, the people who create the NAEP publish the information needed to calculate those errors. I used that information to find out whether the commissioner was right in claiming Kentucky’s 2013 KPREP math and reading scores fell within the plus-and-minus error range of the NAEP.
This simplified table shows what I found.
Table 1
In almost every case, even with the maximum likely error correction added to the NAEP scores, the difference between the proficiency rates from that federal test and KPREP is on the order of 10 percentage points. To be sure, that is much better than we had with the old CATS tests, but parents still need to be wary if their student’s score is only a little above the KPREP proficiency minimum. That could indicate their child is going to have problems in college.
By the way, parents of high school students probably have things a bit better. Under KPREP, high school students currently are tested with products developed by ACT, Inc. So, those results should be on target with the actual ACT test. But the NAEP indicates this is small consolation for parents of younger students, some of whom may not really be on track even though they are scoring Proficient on KPREP.
To be clear, Kentucky’s KPREP is certainly more rigorous than the old CATS assessments. But, comparisons to the NAEP show KPREP may not be rigorous enough. It may not be safe to assume your child is on target just because he or she gets a proficient score in KPREP. And, most KPREP tests are not as rigorous as the NAEP.
For those who like details, here are the data I used to compute the highest probable NAEP score based on the published Standard Errors in the scores.
Table 2
The National Center for Education Statistics tells us:
“…one can conclude with approximately a 95 percent level of confidence that the average performance of the entire population of interest (e.g., all fourth-grade students in public schools in a jurisdiction) is within plus or minus two standard errors of the sample average.”
Let’s decode that with an example. The published percentage of students at or above Proficient on NAEP Grade 4 reading is 36 percent, and the standard error in that score is 1.7. So we can be reasonably certain that the true percentage for all students won’t be more than 36 + (2 × 1.7), or 39.4 percent.
But, the KPREP proficiency rate reported for this same group of fourth graders in 2013 was notably higher at 48.8 percent. Even after we allow for the maximum likely scoring error in NAEP, the commissioner’s assertion that the KPREP score falls within that error is clearly incorrect.
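The arithmetic above is simple enough to sketch in a few lines of code. This is just a minimal illustration of the check described in the text, using the two published Grade 4 reading figures cited above (36 percent proficient on NAEP with a standard error of 1.7, versus a 48.8 percent KPREP proficiency rate); the function name is my own invention, not anything from NCES.

```python
def naep_upper_bound(pct_proficient: float, std_error: float) -> float:
    """Highest probable true proficiency rate at roughly 95 percent
    confidence: the sample estimate plus two standard errors."""
    return pct_proficient + 2 * std_error

# NAEP Grade 4 reading, 2013: 36 percent proficient, standard error 1.7
upper = naep_upper_bound(36.0, 1.7)
print(f"NAEP upper bound: {upper:.1f}")            # 39.4

# KPREP proficiency rate for the same fourth graders: 48.8 percent
kprep = 48.8
print(f"Within NAEP error band: {kprep <= upper}")
```

Because 48.8 sits well above the 39.4 upper bound, the KPREP rate is outside the NAEP error band for this subject and grade.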
In fact, only in the case of fourth grade math is the commissioner’s statement essentially correct. For all the other subject/grade combinations in the table, he is not correct.
Some final thoughts
While Kentucky did take the lead in developing new, more rigorous state tests after Common Core came along, most other states have now adopted new tests as well. It is possible that some of those states now have even more rigorous tests than Kentucky. The 2013 results are a snapshot of a changing testing landscape, and Kentucky’s apparent lead could be short-lived. It is just too early to tell.
Moving forward, the NAEP math and reading assessments were administered again in early 2015, during the current 2014-15 school term. I expect to see results sometime around the end of the calendar year. By then, we will also have another year of KPREP scores to examine for trends, and that is when we will really start to see what KPREP is made of.
Data Sources:
All NAEP data come from the online NAEP Data Explorer tool.
The KPREP scores come from the Kentucky School Report Cards.