The Bluegrass Institute for Public Policy Solutions


2016 ACT: KY Public and Non-Public Scores

Our Wednesday, August 24, 2016 blogs covered only the overall average scores for all students from public, private and home schools combined. Today, we dig deeper into the data, looking at performance over time separately for Kentucky’s overall student scores, public school only scores, and non-public school only scores (which combine private and home school students).

That last item, the non-public school scores, is something you won’t find elsewhere. We do the algebra for you to compute non-public ACT scores from the known scores for all students (which come directly from the ACT) and the scores for Kentucky’s public school students only (which come from the Kentucky Department of Education). It’s a Bluegrass Institute exclusive, so sit back and check out how Kentucky’s various school populations compare on this important college entrance test.
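For readers who want to check the math, the back-out is a simple weighted-average calculation: subtract the public school point total from the all-student point total, then divide by the number of non-public test takers. The short Python sketch below is a minimal illustration of that approach, assuming the non-public group is simply all tested graduates minus the public school graduates; the counts and averages in the example are made-up placeholders, not the actual 2016 figures.

```python
# Minimal sketch of the weighted-average back-out used to estimate
# non-public (private + home school) ACT averages.
# Assumption: non-public test takers = all test takers minus public school test takers.

def nonpublic_average(all_count, all_avg, public_count, public_avg):
    """Return the implied average ACT score for non-public test takers."""
    nonpublic_count = all_count - public_count
    if nonpublic_count <= 0:
        raise ValueError("public_count must be smaller than all_count")
    total_points = all_count * all_avg          # total score points, all students
    public_points = public_count * public_avg   # total score points, public schools
    return (total_points - public_points) / nonpublic_count

# Placeholder numbers for illustration only (not the actual 2016 figures):
# 50,000 total test takers averaging 20.0, of whom 46,000 public averaging 19.8.
print(round(nonpublic_average(50_000, 20.0, 46_000, 19.8), 1))  # -> 22.3
```

Plugging in the published all-student figures from ACT and the public school only figures from the Kentucky Department of Education should reproduce the non-public estimates shown in Table 4 below.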


Table 1 shows how Kentucky’s 2016 public, private and home school graduates performed overall. Let’s see what this table tells us.

Table 1

Kentucky All Student Scores by Year

To begin, we need to note that there have been two important changes that affect ACT score reports for Kentucky over the years listed in Table 1. Between 1993 and 2008, the ACT was not given to all students in Kentucky; taking the test back then was voluntary. Since students had to pay for the ACT themselves, those who took part generally intended to go on to college.

In the years before 2009, the percentage of Kentucky graduates who took the ACT varied. For example, in 2000, ACT reports show 71 percent of Kentucky’s high school graduates took the assessment. In 2005 that rose to 76 percent. In 2008, the last year before Kentucky instituted 100 percent testing of all public school graduates, the Bluegrass State’s overall ACT participation rate was 72 percent.

Things changed in 2009. The Kentucky graduating class of 2009 was the first in which all public school graduates took the ACT at least once, at state expense, as 11th graders. As you can see, the shift brought a dramatic reduction in scores because many students who probably were not interested in going to college were suddenly added to the testing pool. Kentucky’s ACT Composite Score, for example, dropped by 1.5 points, from 20.9 to 19.4, between 2008 and 2009, a typical drop when other states move to testing all students with the ACT.

Kentucky’s ACT picture changed again in 2013, when ACT, Inc., which administers the assessment, changed the way it reported scores. Prior to 2013, publicly released ACT scores included only students who took the assessment under standard time-limit conditions. Beginning in 2013, ACT’s reports included nearly all students whose scores the ACT was willing to report to colleges, which for the first time took in some students with disabilities who tested under specified extended-time conditions.

This 2013 shift in reporting also impacted scores a bit. Kentucky’s ACT Composite Score dropped by 0.2 points between 2012 and 2013, and, as Table 2 shows, similar drops appeared in other states that tested 100 percent of their graduates in both of those years.

Table 2

Change in ACT Composite for 100 Percent Testing States, 2012 to 2013

So, the message here is that we generally should not compare scores across the change years separated by the white horizontal bars in Table 1. That still leaves us with several years of recent trend data, from 2013 to 2016, that we can examine.

Over this four-year period Kentucky has seen some growth in its scores, but between last year and this year that trend reversed for both math and science, so the ACT Composite Score stayed flat even though English and reading scores continued to climb. That uneven performance across subjects raises concerns, but a one-year change in scores isn’t really a trend. For now, we just need to monitor what happens going forward.

Next, take a look at the public school only performance in Table 3.

Table 3

Kentucky Public School Student Scores by Year

First, notice that the number of graduates increased from 45,162 in 2015 to 46,285 in 2016, a 2.5 percent rise. However, since 100 percent of the graduates were tested in both years, this rise probably has little to do with any score changes (in other words, it should not be adding more weak students to the mix, as happened when Kentucky moved to 100 percent testing in 2009). That is a different situation from one where a state’s participation jumps from, say, the 39 percent Alaska posted in 2015 to the 53 percent it posted in 2016 (see Page 14 here). Alaska’s ACT Composite Score dropped from 21.1 to 20.0 as a result of that large participation rise, which added many students who probably scored lower than the 2015 group.

Next, note that the score patterns from 2013 to 2016 for the all-student case in Table 1 are mirrored in the public school only results. However, the gains in both English and reading are not as strong, while the drops in math and science are just as large. Again, the Kentucky public school only ACT Composite Score for 2016 was flat compared to last year. So, at this point there is cause to pay attention, but not for major concern.

Now, here is the table you won’t find anywhere else – the non-public school student performance in the Bluegrass State.

Table 4

Kentucky Non-Public School Student Scores by Year

Non-public ACT participation numbers also rose in Kentucky between 2015 and 2016. In fact, the 3.4 percent increase was even larger, on a percentage basis, than the rise for the state’s public schools.

Score changes were quite different in some cases. The Kentucky non-public graduates’ ACT English score rose by a full 1.2 points, a rather notable increase, and it now eclipses the public school score by 5.6 points, truly a huge difference on this 36-point assessment.

Math for the non-public graduates, however, fell by 0.3 points, a decline 0.1 points larger than the public school drop. Nevertheless, the non-public math score remains well above the public school score, by 2.2 points, which is a notable difference.

The non-public graduates’ reading score also increased dramatically, up by 1.3 points, and the non-public grads of 2016 outperformed their public school counterparts in reading by a very large 4.5-point margin.

Some might think that the non-public schools, given their sometimes tight finances, would have trouble teaching science well. However, while the Kentucky non-public school science score did slide by 0.2 points between 2015 and 2016, it remains well above the public school score, by 3.4 points on ACT’s 36-point scale.

Finally, the ACT Composite Score for Kentucky’s non-public school graduates of 2016 surpasses the public school graduates’ score by 3.4 points, which is also a notable difference on the ACT scoring scale.

So, it is clear that Kentucky’s limited school choice options, generally available only to wealthier parents, do outperform the public schools. One wonders what good things might happen for our public school kids if we opened up more choices for them, too.

General Tech Notes: ACT reports for 2016 are available here.

ACT reports from earlier years are available here.

The various sources for the data in Tables 1, 3 and 4 are extensive and can be found in our 2016 ACT Summary.

(Blog Title Corrected August 28, 2016 at 3:27 pm)