Are there issues with the 2017 NAEP’s validity?
As I discussed earlier today, some folks have been raising concerns about the accuracy of the 2017 National Assessment of Educational Progress (NAEP), for which scores were just released to the public today.
The concerns gained added weight several weeks ago when the Louisiana superintendent of education wrote a letter contending that NAEP’s switch to digital testing in 2017 might have affected different states in different ways.
Kentucky’s commissioner of education, Stephen Pruitt, weighed in today with similar concerns in the Kentucky Department of Education’s news release about the new NAEP scores, saying:
“Our students, and those in a handful of other states that still give paper and pencil state tests seemed to be at a disadvantage with the new online NAEP assessments,” Pruitt said. “It is an entirely different experience taking a test on a tablet than with the paper and pencil our kids are used to. Going digital seemed to have an impact on results especially in reading.”

The validity of the NAEP is an important matter, but it will be some time before independent experts have a long enough look at the 2017 NAEP data to determine whether the switch to digital testing caused problems.
However, I can show you now the 2015 to 2017 trend lines from NAEP and Kentucky’s state school assessments called KPREP.
This table tells that tale.
To be honest, based on Kentucky’s own testing, the NAEP might or might not have issues, and it could vary by grade and subject tested.
Note in the green section of the table that the change in proficiency rates on NAEP and KPREP match about as closely as possible for Grade 4 reading. The match is also remarkably close for Grade 4 math, shown in the gray section of the table.
But, at first look, the eighth-grade trends don’t track each other well.
The reported NAEP and KPREP proficiency rates actually moved in opposite directions by a notable amount for Grade 8 reading.
However, all NAEP results carry statistical sampling error, in this case running about plus or minus 3 percentage points if we want to be 95 percent confident that the true scores fall within that range of the published scores. The true 2015 Grade 8 reading proficiency rate could actually be 34, and the true 2017 rate could actually be 36. If so, the difference in the score trend with KPREP would be pretty much eliminated.
Likewise, for Grade 8 NAEP math, if the 2015 rate was actually 27 and the 2017 rate was 31, then the difference in the trends would not be notable, either.
Thanks to the uncertainty introduced by NAEP’s sampling errors, neither of these possibilities for the eighth-grade data can be ruled out.
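The reasoning above can be sketched numerically. This is a minimal illustration, not NAEP’s official significance-testing procedure: it assumes a plus-or-minus 3 percentage-point margin of error on each published rate (as described above) and treats the two years’ sampling errors as independent, so the margin on a two-year change is wider than the margin on either single rate. The 37-to-33 rates in the example are hypothetical, used only to show a reported 4-point change.

```python
# Sketch: bounds on the true 2015-to-2017 change in a proficiency rate,
# given an assumed +/-3-point 95 percent margin of error on each rate.
import math

MARGIN = 3.0  # assumed margin of error on each published rate, in points

def change_interval(rate_2015, rate_2017, margin=MARGIN):
    """Return (low, high) bounds on the true change between two rates.

    Treating the two sampling errors as independent, the margin on the
    difference is sqrt(margin**2 + margin**2), about 4.2 points.
    """
    change = rate_2017 - rate_2015
    diff_margin = math.sqrt(2) * margin
    return (change - diff_margin, change + diff_margin)

# Hypothetical published rates showing a reported 4-point drop.
low, high = change_interval(37, 33)
print(round(low, 1), round(high, 1))  # prints: -8.2 0.2
```

Because the interval spans zero, a reported 4-point decline under these assumptions cannot be distinguished from no change at all, which is exactly why the eighth-grade divergence between NAEP and KPREP cannot be called real or ruled out.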
So, the bottom line here is that I really can’t say if NAEP had problems in 2017 based on Kentucky’s own assessment results. The NAEP’s measurement errors are too large to allow such determinations, but at the same time it is clear that if any real progress was made in Kentucky, it was very small.
After 27 years of KERA and a half decade of Common Core-aligned math and reading assessments, even if NAEP is wrong, Kentucky has made little recent progress in education.
And, the idea that our students’ proficiency rates in 2017 were as low as 29 percent for Grade 8 math and 34 percent for Grade 8 reading is not very appealing, either. With more than a quarter of a century of costly KERA education reform efforts behind us, these current numbers, even if they might be off by as much as 3 points from reality, are neither comforting nor acceptable.