Common Core fans get their criticism, and their graph, wrong

Hot: Update #1

It appears the Collaborative has heard from us. They have already posted a correction for their error about the PARCC test being used in Kentucky. So far, however, they are mum about the other concerns I raise in this blog. This is a very strange situation!

(Begin my original blog)

My recent blog, “Where have all the school tests gone?” apparently hit some nerves at the Collaborative for Student Success, a well-known Common Core State Standards (CCSS) cheerleader. They just had to respond.

But, the Collaborative gets its comments wrong, starting with the article’s graph, which I annotated in the graphic below to make the deficiencies easier to understand.

[Annotated graph from the Collaborative’s “Higher Standards Encourage Greater Accountability” post]

The Collaborative’s graph starts right out with “Novice” performance (the bottom possible score in Kentucky’s assessment programs) and includes neither a title nor a vertical axis label. The year labels for the KPREP tests in the legend are also inconsistent.

After some research that included digging up the real scores, I determined that the graph shows the percentage of Kentucky students who scored Proficient or above on the National Assessment of Educational Progress (NAEP) Grade 4 Reading and Grade 8 Math Assessments in 2015. The graph also includes proficiency rate results (the combined percentage of students scoring Proficient and Distinguished) from Kentucky’s KPREP tests in 2013-14 and 2014-15 for those same grades and subjects.

But, aside from the labeling deficiencies, there is another interesting problem: the KPREP scores are listed backwards. While a casual examination of the graph would lead you to think the scores increased between 2013-14 and 2014-15, in fact the opposite is true. Kentucky’s KPREP scores for both Grade 4 reading and Grade 8 math DECLINED between those years. That isn’t exactly a ringing endorsement of Common Core.

I don’t know and won’t speculate about whether this was a conscious attempt to mislead, but it certainly isn’t good data presentation.

The graph does highlight something else that the Collaborative would probably not want to admit: Kentucky’s KPREP scores do look inflated compared to the NAEP. That doesn’t agree with the Collaborative blog’s closing comment that:

“States like Kentucky are headed in the right direction by setting expectations high and evaluating progress toward those goals.”

Data the Collaborative cares to share shows Kentucky headed in the opposite direction.

The original version of the Collaborative’s blog has other statements deserving of comment. Their blog says:

“Previously, Kentucky replaced its PARCC exam with the K-PREP.”

That is incorrect, as the Collaborative now admits in the update discussed at the top of this blog. Kentucky was part of both the PARCC and Smarter Balanced assessment efforts during the early development phase, but we dropped membership before either test came online. Our own KPREP tests (the Collaborative refers to them as “K-PREP” tests, an obsolete notation) came online years ago, in the 2011-12 school year. We never used PARCC for accountability, so we could never have “replaced” it.

The Collaborative also says:

“Innes’ suggestion that Kentucky’s testing policies are a scheme to inflate or cover up student performance is simply not defensible.”

First of all, I absolutely never said that Kentucky made the decisions that led to the dropping of EXPLORE, PLAN, and COMPASS from our state’s Common Core-era Unbridled Learning accountability system. Those actions were taken by ACT, Inc.

Kentucky’s educators wanted to continue using EXPLORE, PLAN, and the related assessments, but they never got the choice.

Secondly, Kentucky certainly didn’t make the decision to extensively delay the NAEP Long Term Trend assessments, either. That decision came from Washington, DC.

To be honest, the Collaborative’s misstating both the facts and what I said might itself be more in line with something “not defensible.”

Anyway, the Collaborative also says:

“…states that have pursued independent assessments to pacify critics have invited disruptions and costs, and are likely to end up with weaker tests….”

Well, there is plenty of interesting disruption within the PARCC and Smarter Balanced communities, too.

For one thing, many states have abandoned those assessments.

As of July 2015, Education Week reported that only 18 states remained with Smarter Balanced and just 10 stayed the course with PARCC.

In March 2016, Education Week updated its participation counts for PARCC and Smarter Balanced. For the 2016 testing cycle, only six states plus DC were staying with PARCC, and Smarter Balanced participation had dropped to just 14 states. These still unproven assessments are being abandoned, rather quickly, across the nation. The notion that Kentucky would be able to compare its performance to a lot of other states using either test is fading fast, too.

Furthermore, even when states do use PARCC or Smarter Balanced, they still might manipulate scores. Washington state and Ohio apparently did that already.

So much for cross-state comparisons, at least for now.

To close, my bottom line remains unchanged: The Bluegrass State has lost a lot of trend lines from quality tests in its longest-in-the-nation Common Core era. That severely impacts Kentuckians’ real ability to know what is happening. While the demise of these tests is not due to policy decisions in Kentucky – despite comments from the Collaborative – the resulting chaos in evaluating what Common Core is really producing for the Bluegrass State is a serious problem.

Tech Notes:

NAEP scores are from the NAEP Data Explorer.

KPREP scores come from the ASSESSMENT_KPREP_GRADE Excel spreadsheets in the Kentucky School Report Cards “Data Sets” for 2013-14 and 2014-15.