Another report inflates Kentucky’s education progress
A new report from Education Sector Reports, “The New State Achievement Gap: How Federal Waivers Could Make It Worse—Or Better,” makes some pretty glowing comments about Kentucky’s educational progress between 2003 and 2011 on the National Assessment of Educational Progress (NAEP). Naturally, without pausing for any reflection, the Kentucky Department of Education (KDE) jumped all over this with a news release.
The KDE news release says:
“The study compares student gains on the National Assessment of Educational Progress (NAEP) from 2003-11 in grades 4 and 8, reading and math for all students. Kentucky students recorded an overall gain of 28.2 points, which translates into not quite three-fourths of a year of improved achievement per assessment from 2003-11.”
The 28.2-point gain claim got me scratching my head, so I took a deeper look. I knew none of our NAEP score gains in any subject were nearly that good.
I quickly learned that Education Sector’s score reporting is confusing.
Let me explain.
This table shows the changes that occurred in NAEP math and reading scale scores between 2003 and 2011 for Kentucky and the overall average changes for that period for all states plus Washington, DC.
Education Sector generated their 28-point improvement number for Kentucky by adding the individual improvement numbers for each subject together. That is VERY different from what many people think of as a “composite” score. For example, on the ACT college entrance test, the composite score is an average across all the subjects tested. Using the ACT definition, Kentucky’s improvement in NAEP between 2003 and 2011 would only be the average of 12 plus 6 plus 7 plus 3, or just 7 points.
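The difference between the two "composite" definitions is easy to see with a little arithmetic. This sketch uses the four rounded subject-level gains cited above (grade 4 and grade 8, math and reading); the exact figures in Education Sector's report carry decimals, which is why their total is 28.2 rather than 28.

```python
# The four subject-level NAEP scale-score gains for Kentucky, 2003-2011,
# as rounded in the table above (grade 4 math, grade 4 reading,
# grade 8 math, grade 8 reading).
gains = [12, 6, 7, 3]

# Education Sector's approach: simply add the four gains together.
summed_composite = sum(gains)

# ACT-style composite: average the gains across the subjects tested.
averaged_composite = sum(gains) / len(gains)

print(summed_composite)    # 28
print(averaged_composite)  # 7.0
```

Same underlying data, but one definition yields a headline number four times larger than the other.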
But, there are more serious problems here. The Education Sector’s approach hides those problems.
Notice that changes in individual subject scores have not been that large, either in Kentucky or across the states. Furthermore, while Kentucky has made more progress than the other states in its elementary schools, there is no discernible difference in performance changes between Kentucky and the other states at the eighth grade level.
This very different, very important picture gets hidden when you lump everything together simplistically, as Education Sector just did. In fact, even the simplistic review of the overall student scores above shows that whatever gains Kentucky made on NAEP compared to other states at the elementary school level do not persist into the middle school years.
There is a lesson here, by the way, for our new Unbridled Learning state accountability system. Unbridled Learning’s final judgment of each school and school district is likewise based on an overall average of all scores. Just like the eighth grade surprise in the table above – which clearly is very different from the implications of Education Sector’s NAEP analysis – Unbridled Learning’s lump-everything-together approach can hide major problems and performance issues.
For example, minority students can still be left behind while schools generate overall improving scores.
We’ll be keeping an eye out for that as more K-PREP testing data becomes available.
There is more to the NAEP/Education Sector story. The rosy EdSector picture starts to fall apart as soon as you disaggregate the NAEP data by race. You can learn more about that by clicking on “Read More.”
Ed Sector ignores some basic recommendations for doing NAEP state analysis
EdSector misses problems in Kentucky because it largely ignores the policies recommended in recent NAEP report cards for making state-to-state comparisons.
For one thing, Ed Sector’s report totally ignores Kentucky’s nation-leading exclusion rates for learning-disabled students on all NAEP reading tests, which inflate the Bluegrass State’s reading performance compared to other states. There is no consensus on how much inflation this exclusion creates, but there is general agreement that it is an issue and at least should be mentioned.
Education Sector also ignores the huge differences in the racial makeup of public school enrollment from state to state, giving Kentucky another unfair advantage in Ed Sector’s simplistic comparison of overall average scores for all students.
Ed Sector’s only state-to-state analysis besides scores for all students is for students eligible for the federal free and reduced-cost lunch program. That is still a problem: the analysis winds up comparing poor whites in Kentucky to poor minority students elsewhere, which still gives Kentucky an unfair advantage. Also, many poor students elsewhere are still learning English, while Kentucky has one of the lowest percentages of English language learners of any state in the nation, making the EdSector comparison even more of an apples-to-oranges exercise.
Finally, Ed Sector ignores statistical sampling errors in the NAEP scores. Those errors create LOTS of score ties when the raw scale scores seem to indicate clear winners and losers. That is especially true if we try to look at data for student subgroups in the NAEP.
Here’s a bit better analysis
Since the table above shows whatever gains we make in the fourth grade are largely lost by the eighth grade, I’m going to concentrate on the eighth grade here. After all, what comes out the top of the school system is ultimately what most concerns us.
This next table shows how Kentucky’s white students, who comprised 84 percent of public school enrollment here in 2011, compare to their counterparts in other states. I created this table using standard statistical test tools in the NAEP Data Explorer.
Pay particular attention to the column titled “Cross Jurisdiction Significant Difference.” States with green shading in this column outscored Kentucky by a statistically significant amount. States with yellow tied us. States with pink shading in this column scored below Kentucky by a statistically significant amount.
Notice that, once you allow for the statistical sampling errors in NAEP scores (which are measured by the Standard Error of each score, as shown in the “SE” column), the best the NAEP can tell us about how our dominant racial group performed in eighth grade reading is that we were outscored by 9 states (“Higher” column), tied 34, and bested only 7. It’s hard to make a statistically valid case for lots of improvement in Kentucky if that is where we actually stood as of 2011.

By the way, I ran a similar analysis that considered only eighth grade whites who were eligible for free and reduced-cost lunch in 2011 (a poverty measure). Thanks to larger statistical sampling errors, the results are very blurred. Kentucky’s lunch-eligible whites did outscore their counterparts in 7 other states by a statistically significant amount; but, after the increased sampling error in this rather small group of students in the NAEP sample is considered, we tied all the other 42 states (Washington, DC, didn’t get usable scores in 2011). You can’t make much of a case from that.
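To see why sampling error turns apparent score gaps into statistical ties, here is a minimal sketch of the kind of two-sample test the NAEP Data Explorer performs. The scores and standard errors below are hypothetical, chosen only for illustration, and NAEP's actual procedure also applies adjustments for making many comparisons at once, so treat this as a simplified model rather than the official method.

```python
import math

def significantly_different(score_a, se_a, score_b, se_b, z_crit=1.96):
    """Simple two-sample z-test on two scale scores.

    Each NAEP scale score comes with a standard error (SE). The standard
    error of the DIFFERENCE between two independent scores is the square
    root of the sum of the squared SEs. If the gap is less than about
    1.96 of those combined standard errors, the two scores are a
    statistical tie at the usual 95 percent confidence level.
    """
    se_diff = math.sqrt(se_a ** 2 + se_b ** 2)
    z = (score_a - score_b) / se_diff
    return abs(z) > z_crit

# Hypothetical: a 3-point raw gap that is NOT significant once each
# score's sampling error is considered -- a "tie" in the table's terms.
print(significantly_different(265, 1.2, 262, 1.1))  # False

# The same SEs with a 5-point gap IS statistically significant.
print(significantly_different(267, 1.2, 262, 1.1))  # True
```

This is also why subgroup comparisons blur so badly: smaller samples mean larger standard errors, which widen the band of score gaps that count as ties.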
And, I must reiterate, even these tables provide an inflated picture for Kentucky because we excluded far more learning-disabled students in reading than was typical across other states. If there were a way to correct for the exclusion differences, our results would look somewhat worse.
Now, let’s examine NAEP Grade 8 Math for whites.
This next table tells the very sad tale.
How’s that again about all the nation-leading progress? In truth, this table shows that as soon as we make a more apples-to-apples comparison of NAEP data – whites against whites – as of 2011 Kentucky was still very near the bottom of the heap for its dominant racial group. Thirty-nine states and, amazingly, even Washington, DC, outscored us by a statistically significant amount.
By the way, I also ran the Data Explorer’s significance test for only those whites eligible for the lunch program. In NAEP Grade 8 Mathematics, Kentucky’s lunch-eligible whites scored statistically significantly higher than their counterparts in just two other states. We tied 21 states and were outscored by poor whites in 26 states.
So, is Kentucky’s public education system among the nation’s “most rapidly improving”? Well, it doesn’t seem like there has been much improvement in general, and even if the state did eke out a few extra points, it’s not really much to crow about.
It also is unfortunate that even after the development of the great NAEP Data Explorer Tool, we are still being inundated with unsophisticated and potentially very misleading analyses from all sorts of think tanks. So, it looks like I will have a long-term job of providing what the late Paul Harvey used to call “the rest of the story” as such “stuff” continues to come out. But, to be honest, I’d rather have people get the story right the first time, instead.