How do charter schools really perform?
Getting an accurate picture isn’t easy
For a number of years, school choice opponents have expended major effort trying to prove that charter schools, which offer an alternative education option to students regardless of their backgrounds and income levels, actually don’t perform well. These choice opponents point to some reports that generally show charters doing no better than traditional public schools. But are these reports providing an accurate picture?
In fact, a very important research finding, mentioned in numerous reports, helps explain how charters might actually be doing quite well even when inadequately conducted research doesn’t show it. Remarkably, this finding is buried in a report that charter critics love to cite.
Very simply – and unsurprisingly – it takes time for charter schools to turn around students who often enter these schools of choice performing well behind grade level. Students don’t magically blossom the moment they enter a charter school. Give a student two or three years in a charter, however, and report after report shows those more experienced charter kids moving out ahead of their counterparts in traditional public schools (TPS).
One consequence of the need for time is that reports on charter schools that don’t examine impacts over time will inevitably short-change those schools. In addition, studies that fold large numbers of first-year charter students’ results into the overall comparison scores will understate the real capabilities of these schools of choice.
So, beware of studies that don’t examine the impacts of charter schools as students spend more time in them, and understand that any time a charter-to-TPS comparison includes first-year charter students, that study will tend to understate true charter school performance as well.
If you want more details and references for these facts about charter schools, read on.
One of the first reports to address the fact that charter school impacts take time comes from the Stanford University-based Center for Research on Education Outcomes (CREDO). CREDO’s “Multiple Choice: Charter School Performance in 16 States” report was issued in 2009.
To begin, Multiple Choice created quite a stir when it was issued. The report states that only 17 percent of charter schools:
“…provide superior education opportunities for their students. Nearly half of the charter schools nationwide have results that are no different from the local public school options and over a third, 37 percent, deliver learning results that are significantly worse than their student would have realized had they remained in traditional public schools.”
Charter critics still cite that now decade-old claim that just 17 percent of charters outperform, even though the figure is not only out of date but was biased even at its release in 2009.
The curious thing about Multiple Choice is that buried in a section starting on Page 32 of the report is a crucial finding that helps explain why that low, 17-percent figure doesn’t provide a good picture of true charter school performance. This part of the report deals with the impacts of charters on student performance over time.
This effects-over-time section of the CREDO report begins by adding more information about which charter school students were included in the data that led to the 17-percent claim. The report then makes an interesting comment:
“One of the possible explanations for the observed national results may be linked to the rapid growth of the charter school population each year and subsequent new school openings, leading to many of the charter students included in this study having been enrolled in their schools for only a few years. One wonders if students with more years of charter schooling might have different results compared to relative newcomers.”
CREDO then adds more information, saying:
“Because the number of students attending charter schools grows each year, the experience of charter school students reflected in each state’s data is skewed toward first‐year charter students. More than half of the records in this analysis capture the first year of charter school experience.”
In other words, CREDO’s 17-percent finding was based largely on the performance of students who were only in their first year at their charter school. Thus, if there are notable changes in how charter school students compare to their TPS counterparts as they spend more years in charters, the 17-percent finding could actually represent an unfair comparison. So, how do charter students do over time?
CREDO ends the suspense about the impacts over time by revealing that its research did indeed find that, as charter school students spent more time in their school of choice, they moved ahead of their TPS counterparts as a group across all 16 jurisdictions investigated (actually 15 states plus the District of Columbia).
CREDO summarizes this finding as:
“Students do better in charter schools over time. First year charter students on average experience a decline in learning, which may reflect a combination of mobility effects and the experience of a charter school in its early years. Second and third years in charter schools see a significant reversal to positive gains.” (Page 6)
This CREDO finding is incredibly important for several reasons.
For one thing, parents who move their children to a charter school need to understand that it takes some time for a kid to adjust to the new academic environment, and patience is necessary. The real ability of the charter to work with their child might not become evident in year one.
CREDO’s finding also indicates that if researchers really want to know how well charter schools perform, they have to look at student performance over time. If a researcher only does a shallow and simplistic study that averages first-year charter students’ scores with those of students who have spent more time in the charter, they won’t produce an accurate picture. Such simplistic studies are inherently biased against charter schools.
Unfortunately, this is what happens in far too many reports that say charters are not doing well. Those reports never give charters a fair chance.
Since its first report in 2009, CREDO has issued many more charter studies, some again looking at national samples and others focusing on individual states and even some large city systems. The finding that charters look far better when the comparison includes students who have spent enough time in these schools of choice to really benefit shows up in virtually all the follow-on CREDO reports.
One of the most interesting follow-ons from CREDO was a state-level report for Louisiana. It was issued in 2013, eight years after Hurricane Katrina virtually destroyed the school infrastructure in the southern region of the state. In this CREDO report, the findings about charter impacts over time, displayed in Figure 9 (shown below), are truly amazing.
As you can see, by the time students have spent four or five years in a Louisiana charter school, CREDO says they outperform their TPS counterparts by about 180 school days. That is essentially a full extra school year of learning! Oh, boy, could we use that in Louisville – and in a lot of other areas of Kentucky, too!
The most recent report from CREDO, just released in January 2019, examines charter schools and TPS in Indianapolis, Indiana. While the report isn’t as detailed as some earlier CREDO studies, some of the findings regarding charter school students versus TPS students are notable. These include:
- Compared to the state during the period from 2014-15 to 2016-17, Indianapolis charter school students posted similar gains in both reading and math. Students attending Indianapolis TPS continued to exhibit weaker growth in both subjects.
- Furthermore, students in Indianapolis charter schools experienced stronger growth than Indianapolis TPS students in both subjects during the study period.
- Black students enrolled in charter schools post significantly stronger growth than black students in TPS in both subjects. Hispanic students attending Indianapolis charter schools post stronger growth than TPS Hispanic students within the city.
- Within Indianapolis, charter school students living in poverty exhibit stronger growth in both reading and math compared to TPS students in poverty. We also find stronger growth in both subjects for Indianapolis ELL students enrolled in charter schools than that for ELL students in Indianapolis TPS.
CREDO’s new Indianapolis study doesn’t include any information on effects over time, probably because only two years of data were available, so the bullet comments above, glowing as they are, might still understate true charter school performance. Given the strength of those bulleted findings, however, there is even more reason to believe that Indianapolis charters really are outperforming their traditional school counterparts.
By the way, CREDO certainly isn’t the only group that has found charters really outperform once you consider impacts over time. An excellent report on charter schools in Massachusetts, whose design comes close to a true randomized study, was also issued in 2009. That’s when the Boston Foundation partnered with the Massachusetts Department of Elementary and Secondary Education to issue “Informing the Debate: Comparing Boston’s Charter, Pilot and Traditional Schools.” Created by some heavy-duty talent from places like MIT and Harvard University, this report also shows charter advantages grow as students spend more time in the charter.
Using the same conversion between standard-deviation effect sizes and days of learning that CREDO developed, Figure 4 in the study indicates that, between the sixth and eighth grades, Boston charter students gained more than an extra year of growth compared to their TPS counterparts in the city. Again, Kentucky could really use that sort of performance in many areas of the Bluegrass State.
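For readers curious how an effect-size-to-days-of-learning conversion works in practice, here is a minimal sketch. The specific factor used below (0.25 standard deviations of growth treated as roughly 180 days, or one school year, of learning) is an illustrative assumption based on the linear translation CREDO-style reports typically present; consult the reports themselves for the exact published values.

```python
# Minimal sketch of converting a growth difference (in standard deviations)
# into "days of learning." The 0.25 SD == 180 days mapping is an assumption
# for illustration only; CREDO's reports publish the actual translation table.

DAYS_PER_SCHOOL_YEAR = 180      # a standard U.S. school year
SD_PER_SCHOOL_YEAR = 0.25       # assumed: 0.25 SD of growth ~ one school year

def effect_size_to_days(effect_size_sd: float) -> float:
    """Convert a growth difference in standard deviations to days of learning."""
    return effect_size_sd * (DAYS_PER_SCHOOL_YEAR / SD_PER_SCHOOL_YEAR)

# Example: a 0.25 SD charter-vs-TPS growth advantage maps to about 180 days,
# i.e., roughly one full extra school year of learning.
print(effect_size_to_days(0.25))  # 180.0
print(effect_size_to_days(0.10))  # 72.0
```

Under this assumed mapping, a "more than a year of extra growth" result like the one described above corresponds to a growth advantage somewhere beyond a quarter of a standard deviation.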
In closing, when you examine the evidence about charter schools, exercise caution. Reports that don’t look at impacts over time probably have an uncorrected bias against charter schools. Once you recognize this and look at more thoroughly performed studies, the success of charter schools becomes even more apparent. During the rest of School Choice Week, we will examine some specific examples of that evidence.