Kentucky state researchers: Don’t use College and Career Readiness Rates as Primary Indicators

Has Common Core implications

Backing up reservations we have been talking about for some time, researchers from the Kentucky Legislative Research Commission’s Office of Education Accountability (OEA) made their opinion of Kentucky’s new College and Career Ready (CCR) statistics very clear in their briefing today. This slide (Figure 1) from their PowerPoint presentation to the Kentucky Legislature’s Education Assessment and Accountability Review Subcommittee spells it out:

Figure 1

Recommendation 2.4 - Don't use CCR for Evaluation of Programs

To reiterate, OEA’s new report, “A Look Inside Kentucky’s College and Career Readiness Data,” says educators should not use the state’s new CCR statistics as the sole or primary indicator when reporting on student outcomes or evaluating programs and policies.

There are a number of reasons for the OEA recommendation, including:

• Inconsistent calculation of the CCR statistic over the years from 2010 to 2014, because new career ready criteria didn’t exist before 2012,

• Test security issues that could impact the CCR results,

• Variable difficulty among the various tests used to show students are ready,

• Possible score inflation on one of the tests used to determine readiness due to the use of special calculators, which are now banned from future testing, and

• The possibility that schools with similar CCR rates may actually have significantly different academic performance.

OEA Recommendation 2.4 has direct implications for claims citing Kentucky’s CCR performance as supposed evidence of the success of the Common Core State Standards in the Bluegrass State. Quite simply, the CCR statistics currently have too many problems to support such claims.

Here is more information on some of the issues OEA raised in their briefing.

• Key problem: The CCR statistic has not been computed consistently over the period from 2010 to 2014. In both 2010 and 2011, success on the ACT college entrance test was the only way to be counted as ready. In 2012, several additional ways for students to qualify as college ready were added, along with a whole new set of criteria for a student to be counted as career ready. Thus, comparing the 2010 and 2011 data to later data simply is not appropriate.

• New ways students can count as college ready involve two college placement tests, the ACT Compass and a Kentucky-developed test called KYOTE. Even if a student fails to score high enough on one or more of the three ACT college entrance test subjects in English, math and reading, the student can make up the deficiency on a subject-by-subject basis with either the Compass or the KYOTE. However, data presented in the OEA’s new report strongly indicate that neither the Compass nor the KYOTE is as demanding as the ACT. Figure 3.G in the OEA report draft approved today plots freshman grade point averages by the type of test used to demonstrate college readiness.

Figure 2 (From Figure 3.G in OEA's Report)

PCT of 2012 High School Grads by College GPA and College Ready Measure

Notice on the left side of the graph that high school graduates in 2012 who initially passed all three ACT subjects with adequate scores in the statewide 11th grade testing did the best in college (the “ACT 11th Grade” segment). A total of 57 percent of those successful on the ACT in the 11th grade had achieved a GPA of 3.0 or more as of 2013, and only 18 percent were in academic trouble with a GPA below 2.0.

In sharp contrast, Kentucky’s high school graduates who qualified as college ready only through the Compass or KYOTE tests (the “Compass/KYOTE Only” segment) performed much more poorly once actually in college. Only 21 percent had a GPA above 3.0, and nearly half (46 percent) were in academic trouble with a GPA below 2.0. So, while a Compass or KYOTE score might allow a student to bypass remedial courses in a Kentucky public college, these tests do not indicate college readiness with the same accuracy as the ACT.
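To put the two segments side by side, here is a minimal back-of-the-envelope sketch in Python using only the percentages quoted above; the ratios are my own illustration, not figures from the OEA report:

```python
# Ratio comparison built from the Figure 2 percentages quoted above.
act_below_2 = 0.18   # ACT 11th-grade qualifiers with a GPA below 2.0
alt_below_2 = 0.46   # Compass/KYOTE-only qualifiers with a GPA below 2.0
act_above_3 = 0.57   # ACT 11th-grade qualifiers with a GPA of 3.0 or more
alt_above_3 = 0.21   # Compass/KYOTE-only qualifiers with a GPA of 3.0 or more

# Compass/KYOTE-only qualifiers were roughly 2.6 times as likely to be in
# academic trouble, and ACT qualifiers roughly 2.7 times as likely to earn
# a 3.0 or better.
print(f"GPA < 2.0 risk, Compass/KYOTE vs. ACT: {alt_below_2 / act_below_2:.1f}x")
print(f"GPA >= 3.0 share, ACT vs. Compass/KYOTE: {act_above_3 / alt_above_3:.1f}x")
```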

• The OEA pointed out that using the Compass and KYOTE tests is not a bad idea in itself. However, counting KYOTE and Compass results as evidence of readiness does make it harder to judge true school-to-school performance: one school might earn high overall CCR marks directly from the ACT, while another could post a similar CCR rating and overall accountability score even though most of its students only passed muster on the less demanding exams.

• The OEA is concerned that security is not as strong with alternative testing (an issue recently highlighted by teacher misconduct on Compass testing at Louisville’s Male High School). OEA placed a specific recommendation in its report (Recommendation 2.2) calling for identification of unusual patterns in CCR data that might indicate inappropriate testing activity in a school; a sketch of what such a screen might look like follows.
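The report does not prescribe a method for Recommendation 2.2, but one simple screen would flag schools whose year-over-year CCR gains sit far outside the statewide norm. Below is a minimal sketch in Python; the school names, numbers, and z-score cutoff are all hypothetical, and flagging identifies candidates for review, not misconduct:

```python
from statistics import mean, stdev

# Hypothetical year-over-year CCR changes (percentage points) by school.
# A real screen would use the state's actual school-level CCR data.
ccr_change = {
    "School A": 3.1,
    "School B": 4.0,
    "School C": 2.5,
    "School D": 18.7,   # an outlier worth a closer look
    "School E": 3.6,
}

changes = list(ccr_change.values())
mu, sigma = mean(changes), stdev(changes)

# Flag schools well above the mean change; the 1.5 cutoff is arbitrary.
flagged = [s for s, d in ccr_change.items() if (d - mu) / sigma > 1.5]
print(flagged)  # ['School D']
```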

• Much of the increase in the CCR rate comes from alternative testing, not the ACT. This next graph (Figure 3) is taken from the OEA’s PowerPoint presentation (some color enhancement added for clarity, and a few missing numbers added from the report draft).

Figure 3

PCT of Grads CCR on ACT Only, On Combination of ACT, Compass or KYOTE, or Career Ready Only, 2010 to 2014

Figure 3 shows that the lion’s share of the increase in the overall CCR rate comes from the alternative KYOTE and Compass testing and the new career ready routes. Between 2010 and 2014, there was only a 7-point improvement in the percentage of high school grads who met the ACT college readiness benchmark scores set by the Kentucky Council on Postsecondary Education in English, math and reading. In sharp contrast, from not counting at all in 2010 and 2011, the KYOTE/Compass and career ready elements contributed a much larger 25-point increase by 2014.
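Assuming the two components are additive, as the stacked chart suggests, a quick arithmetic sketch makes the split explicit (the percentage share is my own calculation, added for illustration):

```python
# Decomposing the 2010-2014 CCR gain, using the point figures cited above.
act_gain = 7    # percentage points from meeting the ACT benchmarks alone
alt_gain = 25   # percentage points from Compass/KYOTE and career ready routes

total_gain = act_gain + alt_gain
print(f"Total CCR gain: {total_gain} points")
print(f"Share from alternative routes: {alt_gain / total_gain:.0%}")  # ~78%
```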

• Figure 3 also highlights another potential problem from the OEA briefing. As we mentioned in earlier blogs, there is concern that the Compass test results in math have been inflated due to the use of special programmable calculators loaded with the “Zoom” application. Research by Professor Steve Newman at Northern Kentucky University established that students who knew very little algebra could get a passing score on the Compass with these calculators. That research was later confirmed by the Kentucky Department of Education after the 2014 testing cycle was completed. Going forward, such calculator use is prohibited by both the Kentucky Department of Education and ACT, Inc.

However, possible inflation in Compass scores through 2014 cannot be dismissed, and apparently it cannot be fully determined, either. The OEA didn’t mention a possible amount of score impact. Still, Table 1 below (taken from Table 2.3 in the approved draft version of the OEA's report) shows that students who met college readiness benchmarks in math and/or reading with a test other than the ACT did so predominantly with the Compass rather than the KYOTE (the KYOTE does not test English readiness).

Table 1

Number of Grads CCR by Test Type

This is relevant to the Zoom Math calculator issue. Table 1 shows a large number of students qualified as college ready in math using the Compass test (9,460 students in 2014). Thus, inflated math results from the Compass could have a notable inflationary impact on the overall CCR statistics through 2014. The true picture of the impact from Zoom Math calculators probably won’t be better understood until the 2015 test results are available.
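The sensitivity is easy to frame, though: if some fraction of those 9,460 Compass math qualifiers would not have passed with a compliant calculator, the overall rate shrinks accordingly. Here is a minimal sketch in Python; the cohort size and inflation shares are hypothetical, and it assumes, as an upper bound, that each affected student would also lose overall CCR status:

```python
compass_math_2014 = 9_460   # Compass math qualifiers in 2014, from Table 1
grads = 43_000              # hypothetical graduating-class size, illustration only

# Hypothetical shares of Compass math qualifiers who would not have passed
# without a Zoom-equipped calculator; the true share is unknown.
for inflated_share in (0.10, 0.25, 0.50):
    points = compass_math_2014 * inflated_share / grads * 100
    print(f"{inflated_share:.0%} inflated -> overall CCR rate "
          f"overstated by about {points:.1f} points")
```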

In closing, I want to make it clear that Kentucky should develop a solid tracking system for college and career readiness using valid and reliable tests and other products. I think we are heading in a good direction, but the current statistic obviously has a lot of growing to do. So, as the OEA pointed out today, it isn’t ready to be used to judge important programs and policies, and that includes forming performance judgments about the Common Core State Standards.

(Note: Minor edits, Table 1 and related discussion, and link to approved draft version of the OEA report added December 20, 2014)