Claim: Kentucky Students With More Exposure To Common Core Standards Learn Faster – Really?
Do Kentucky’s students with more exposure to Common Core really learn faster?
This question arose thanks to a recent report from the American Institutes for Research (AIR), funded by the Bill & Melinda Gates Foundation. That AIR report cites supposed “faster progress in learning” by students in Kentucky following the state’s changeover to the Common Core State Standards (CCSS), though the report hedges about making any definite claim that this is specifically due to Common Core.
Unfortunately, I can’t really dig into the AIR study. The AIR research team was granted full access to individually identified student test and demographic data covering a large number of Kentucky students. In the interests of student privacy, ordinary citizens are not allowed to see such individually identified data.
As things turned out, lack of access to the full data doesn’t mean I can’t raise some interesting questions about the AIR report.
To start, everyone has online access to overall statewide average ACT scores for Kentucky’s 11th grade students. These overall scores are available for a number of years both leading up to and during the early implementation of Common Core in the state. Some of that data is collected in Table 1.
Table 1
In Table 1 data for years before Common Core had much, if any, real impact at the classroom level in Kentucky are shaded in blue. The years with yellow shading include the three-year period where Common Core based tests have been in use in Kentucky.
Note that during the pre-CCSS years between 2009 and 2011, the average “all student” ACT Composite Score for Kentucky’s 11th grade students increased by 0.6 points on ACT’s 36-point scale. In fact, in both one-year transitions, from 2009 to 2010 and from 2010 to 2011, the Composite Score improved at the same rate: 0.3 points per year.
The rate of progress clearly slowed once Common Core started. During the first three Common Core years, the rate of improvement was only 0.2 points per year; between 2012 and 2014 the ACT Composite Score increased by a total of only 0.4 points.
Clearly, looking at the overall ACT Composite Scores from Kentucky’s 11th grade testing program, the rate of progress slowed down once Common Core came along.
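The per-year rates above follow directly from the score changes cited. A minimal Python sketch, using only the Composite Score deltas given in the text (not the underlying score tables):

```python
# Annual rate of improvement = total score change / number of years.
# The figures are the ACT Composite deltas cited above for Kentucky 11th graders.

def annual_rate(total_change, years):
    """Average per-year change on ACT's 36-point Composite scale."""
    return round(total_change / years, 1)

pre_ccss = annual_rate(0.6, 2)   # 2009 -> 2011, before Common Core testing
ccss_era = annual_rate(0.4, 2)   # 2012 -> 2014, under Common Core testing

print(pre_ccss)  # 0.3 points per year before Common Core
print(ccss_era)  # 0.2 points per year under Common Core
```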
So, Kentuckians now face a puzzle. The data in Table 1 above indicate that the rate of progress in Kentucky education looks quite different from what the AIR study shows. What is really happening? You would need access to the identified student data to begin answering that question.
Regardless, Kentucky’s current ACT performance level certainly isn’t good enough. As I wrote back in December, a recent study from the Kentucky Office of Education Accountability shows only 37 percent of Kentucky’s 2014 high school graduates were actually college ready based on their ACT performance. If our ACT improvement is actually starting to slow after Common Core’s introduction, we need to find out why, because Kentucky’s education system clearly has a very long way to go.
The new report from the American Institutes for Research says it is intended to start a discussion. I think it will succeed in that, though possibly not in ways the report’s authors intended.
For example, AIR says it actually tracked progress for individual Kentucky students between the eighth grade and the 11th grade using two different test products. Eighth grade starting performance was evaluated using Kentucky Core Content Tests (KCCT). The KCCT, which are currently being phased out (and are totally gone for Common Core subjects of reading, math and writing), were created years ago as part of Kentucky’s now completely defunct CATS program.
The AIR study dubiously compares that eighth grade KCCT performance to how students did three years later in Kentucky’s 11th grade testing with the ACT college entrance test. The ACT is a totally different product that shares little, if any, alignment with the KCCT. The KCCT certainly is not aligned with what students need for college and careers (hence the KCCT phase out). AIR’s report does not explain why the research team chose to compare results across two such very different tests.
Also unmentioned by the AIR report, a much better eighth grade test option was readily available in Kentucky. In addition to using both the ACT and the PLAN tests in high school (the AIR report does mention PLAN), Kentucky also used the ACT’s EXPLORE tests in the eighth grade throughout the entire time of interest in the AIR study. EXPLORE is linked to the ACT and covers the same academic subjects. It probably would provide a MUCH more suitable and consistent baseline.
Aside from the problematic cross-test comparisons, other elements of the AIR study stimulate more questions.
Page 15 of the new AIR report discusses supposed eligibility rates for Free and Reduced Price School Lunches (FRPL), by year, for students in the study. FRPL rates are often used as a measure of student poverty. The problem here is that the rates cited by AIR don’t agree with published testing information for Kentucky, as Table 2 shows. Table 2 examines data reported with the 2010-11 testing results from the old KCCT Grade 11 math exam and with the more recent 2011-12 and 2012-13 ACT results for Kentucky (lunch data were not reported to the public in the Kentucky Grade 11 ACT testing program before the 2011-12 school year). The lunch rates I calculated for 11th grade students from actual student counts in Kentucky testing reports notably disagree with those cited by AIR for both the 2011-12 and the 2012-13 test years.
Table 2
In fact, the five-point lunch rate difference for 2011-12 and the six-point difference in the rates for the 2012-13 year are large enough to raise questions either about the data supplied to AIR or possible handling problems with that data once AIR had access.
In any event, the supposed large change in FRPL rates in Kentucky from 48 percent to 56 percent between 2010-11 and 2012-13 cited in the AIR report is not supported by actual student demographic data publicly available online. This impacts some of the discussion in the AIR report about a supposed notable increase in student poverty rates over the study period.
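The lunch-rate check behind Table 2 is simple arithmetic: divide the count of FRPL-eligible test takers by the total number tested. A minimal sketch; the counts below are hypothetical placeholders, not Kentucky’s actual figures, which appear in the published testing reports:

```python
def frpl_rate(eligible_count, total_tested):
    """FRPL eligibility rate as a percentage, rounded to whole points."""
    return round(100 * eligible_count / total_tested)

# Hypothetical counts for illustration only -- substitute the student
# counts published in Kentucky's testing reports to reproduce Table 2.
rate = frpl_rate(eligible_count=21_500, total_tested=43_000)
print(rate)  # 50
```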
There are other technical issues with the AIR comments, as well.
Page 6 of the AIR report claims that the Common Core State Standards were adopted in Kentucky in June of 2010. That is incorrect. In a highly publicized, televised media event on February 10, 2010 – as covered by an official news release from the Kentucky Department of Education – the Kentucky Board of Education committed to the CCSS, sight unseen. The final CCSS were in fact not published until June 2010. Adopting the standards sight unseen has been contentious in Kentucky and in many other states where similarly hasty adoptions occurred, all triggered by a near-panic quest for Race to the Top money.
A strange technical error in the description of the old KCCT tests is found on Page 11 in the AIR study. There it says:
“The KCCT score vector includes student scores in all four tested subject areas: English, mathematics, social studies, and writing scores.”
To put this directly, the KCCT never included something called an “English” test. The KCCT also covered more than four areas: reading, math, science, social studies, and writing, along with additional testing in arts and humanities and in practical living and vocational studies.
The AIR report does not describe the KCCT correctly, raising questions about what data the research team actually examined. Why would the KCCT vector include the non-CCSS subject of social studies but omit the other non-core subject of science?
More concerns about AIR’s understanding of Kentucky's education data arise from comments beginning on Page 12. On this page the AIR study says:
“First, when Kentucky implemented the new state standards, it decided to adopt revised, CCSS-aligned curriculum framework for English and mathematics (“targeted subjects”) but carried over the reading and science (“untargeted subjects”) curricula from the old regime.”
The implication of these comments, and of the analysis that follows in the report, is that the AIR team talked itself into believing that Kentucky’s full adoption of the Common Core State Standards for English Language Arts somehow left the state’s old reading program essentially unaffected. Apparently, the researchers reached this conclusion after learning somewhere that an obsolescent curriculum framework document for reading had not been changed.
That betrays a misunderstanding of the laws in Kentucky regarding curriculum. In fact, frameworks from the Kentucky Department of Education on curriculum are largely unimportant. The decision authority on the curriculum taught in each school is strongly held by each school’s School Based Decision Making Council (SBDM). This has been the law of the state since passage of the Kentucky Education Reform Act of 1990 and the subsequent activation of such SBDMs across the commonwealth.
What really shapes curriculum in Kentucky currently is what the SBDMs see in the Common Core State Standards and – perhaps even more importantly – what the SBDMs believe will appear on state testing as a result of those standards.
Without question, reading assessments in Kentucky changed as a result of Common Core. According to the statewide Kentucky Interim Performance Report for 2011, reading proficiency in Kentucky’s high schools was 65.90 percent based on the old KCCT then still in use. One year later, the Delivery Targets section of the 2011-12 School Report Card for the state shows the high school reading proficiency rate had dropped notably to only 52.2 percent based on new testing products introduced for Common Core.
Overall, it is incredible that the AIR report somehow concluded that reading was not impacted by the implementation of Common Core. The fact is that Kentucky adopted the full Common Core State Standards in English Language Arts and Mathematics verbatim and there certainly have been impacts, not the least of which includes testing changes.
To wrap this up, the AIR report does say it is only a discussion paper. It backs away from making any assertions that Common Core is definitely responsible for the supposed improved student performance. It certainly has started a discussion.
But the discussion shouldn’t have to include the large number of technical issues mentioned above. Certainly, we should not be in the position of having to question the report’s fundamental premise that education is improving more rapidly now that Common Core has come along.
Also unresolved is a potentially much larger issue. Financed by a private philanthropy, not the state of Kentucky, AIR – an independent research organization totally beyond control of anyone in Kentucky – was granted extensive access to individually identifiable student data from the Bluegrass State. What sorts of provisions for data security were made? Is AIR to destroy its copy of the data now that the report is completed? Does AIR have the right to share the data with third parties? If so, how is appropriateness of that access to be assured? Who owns the data in AIR’s possession? The AIR report is completely silent on these major issues, and this could prove to be a far bigger concern than whether or not student performance has really been much impacted by a changeover to Common Core State Standards in Kentucky.
Data Sources:
Table 1 Data
ACT Composite Scores from 2009 through 2011 are from a single Excel file from the Kentucky Department of Education, online here.
ACT scores for later years come from individual year statewide School Report Cards available from this link.
Table 2 Data
Student counts for 2010-11 are from KCCT Grade 11 math data in the Interim Performance Report, State, for 2010-11. ACT counts for 2011-12 and 2012-13 are from the School Report Card, state, for the listed year.