June 9, 2020 | Report by Marcus A. Winters

Do Charter Schools Harm Traditional Public Schools? Years of Test-Score Data Suggest They Don’t

When the number of charter schools in a given area increases, are students who remain in traditional public schools worse off? This is a claim often made by opponents of school choice; gains made by students in charter schools, they say, come at the expense of students left behind. There is scant evidence to support this view in the existing literature, which suggests that charter schools have either no effect or even a small positive effect on students in traditional public schools. Admittedly, though, much of this research focuses on short-run test-score outcomes and thus might miss any longer-term negative effects. Therefore, in this report, I take a more descriptive approach to the evidence on the relationship between charter schools and declines in public school quality. Using school-level test-score data across the United States made available by the Stanford Education Data Archive (SEDA), I show that there is a very small but positive relationship between the proportion of students within a geographic district who attend a charter school as of 2009 and the test-score growth for students enrolled in the traditional public schools in the same district over the next seven years.

The analysis in this report is intended not to show causality, but rather to show that the general pattern of test-score outcomes over this period is simply not consistent with the claim that charter school exposure for a meaningful period of time produces declines in the performance of traditional public schools.


Introduction

Critics of school choice often argue that charter school growth reduces the quality of education in traditional public schools. Large charter school sectors, they say, rob local public schools of valuable resources and the most promising students. Thus, any gains made by the minority of students in charter schools come at the expense of other students in the same locality, who are left behind in the surrounding traditional public schools. Prior empirical research provides little support for such claims. In fact, this fairly expansive body of research suggests that expansion of charter schools and other forms of school choice has either no effect or a small positive effect on the academic outcomes of students who remain in local traditional public schools.[1] Much of this research, however, focuses on short-run test-score outcomes, which is a possible limitation, given that any negative effects of charter school growth could take a few years to manifest.

In this report, I take a more descriptive approach to the evidence on the relationship between charter schools and declines in public school quality. We are now more than two decades into the era of charter school expansion across the United States. As of 2009, charter schools served at least 10% of students in 91 of the 947 U.S. school districts with at least 10,000 students. If charter school expansion has caused declines in traditional public school outcomes, those declines should be apparent by now.

Using school-level test-score data across the United States made available by the Stanford Education Data Archive (SEDA), I show that there is a very small but positive relationship between the proportion of students within a geographic district who attend a charter school as of 2009 and the test-score growth for students enrolled in the traditional public schools in the same district over the next seven years.

The analysis in this report is not intended to show causality; that is, I do not attempt to compare the outcomes of areas with more charter school exposure with the outcomes that would have occurred had charter schools not been present. The evidence I present is less specific but, arguably, at least as damning for the claim that charter schools harm traditional public schools. In short, I show that the general pattern of test-score outcomes over this period is simply not consistent with the claim that charter school exposure for a meaningful period of time produces declines in the performance of traditional public schools. Despite previous dire predictions, the experience in the U.S. over the last several years suggests that student performance has improved in some school districts and declined in others, regardless of their level of exposure to competition from charter schools.

Data

The primary data source is the Stanford Education Data Archive (SEDA). The SEDA project assembled achievement data for third- to eighth-graders in all public schools across the country for 2009–16. It also draws on the American Community Survey (ACS) and the Civil Rights Data Collection (CRDC) to provide demographic data for each school and school district.

In addition to compiling these data, the SEDA authors make several original contributions to the data set. Most important, they produce uniform measures of average test scores and test-score growth that are comparable across states, even though each state has its own standardized test. By comparing state test results (which are available for all students in the state) with NAEP scores (which are available for only a small subset of the state’s fourth- and eighth-graders), the authors derive standardized, NAEP-referenced scores for each school and school district.[2] I use two measures of student performance for schools and geographic districts that the SEDA authors developed from these data:

  1. Cohort slope. The cohort slope describes the rate at which test scores change across student cohorts, within a grade. For example, a positive cohort slope for the fourth grade would indicate that the school’s fourth-grade performance improved, on average, between 2009 and 2016.

  2. Grade slope. The grade slope describes the rate at which test scores change across grades, within a cohort. For example, a positive grade slope would indicate that between 2009 and 2016, the test-score gain within a given cohort was larger between the sixth and seventh grade than it was between the third and fourth grade. (A brief computational sketch of both measures follows this list.)
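
To make these two measures concrete, the following is a minimal sketch, in Python with hypothetical column names, of how a cohort slope and a grade slope could be computed from a simple school-by-grade-by-year panel of mean scores. SEDA’s published slopes come from a pooled statistical model, so this is only an illustration of what each measure captures, not a reproduction of SEDA’s method.

```python
import numpy as np
import pandas as pd

# Hypothetical panel of school-level mean scores: one row per school, grade, and year.
panel = pd.DataFrame({
    "school_id":  [1, 1, 1, 1, 1, 1],
    "year":       [2009, 2010, 2011, 2009, 2010, 2011],
    "grade":      [4, 4, 4, 5, 5, 5],
    "mean_score": [0.10, 0.14, 0.17, 0.20, 0.26, 0.31],
})

def ols_slope(x, y):
    """Slope from a simple least-squares fit of y on x."""
    return np.polyfit(x, y, 1)[0]

# Cohort slope: within a grade, how mean scores change across successive years
# (i.e., across the successive student cohorts reaching that grade).
cohort_slope = (
    panel.groupby(["school_id", "grade"])
         .apply(lambda g: ols_slope(g["year"], g["mean_score"]))
         .groupby(level="school_id").mean()
)

# Grade slope: within an entering cohort, how mean scores change as that cohort
# moves up through the grades.
panel["cohort"] = panel["year"] - panel["grade"]   # e.g., the class in grade 4 in 2009
grade_slope = (
    panel.groupby(["school_id", "cohort"])
         .filter(lambda g: len(g) >= 2)            # a slope needs at least two grades
         .groupby(["school_id", "cohort"])
         .apply(lambda g: ols_slope(g["grade"], g["mean_score"]))
         .groupby(level="school_id").mean()
)

print(cohort_slope)
print(grade_slope)
```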

The SEDA authors also group schools into “geographic school districts” that include all public schools under the jurisdiction of the school district, as well as any independently operated charter schools within its bounds. I utilize this information in two ways. First, I use the proportion of students within the geographic school district who were attending a charter school in 2009 as the primary independent variable. Second, I analyze outcomes (average score, cohort slope, and grade slope) for all schools (traditional public or charter) within a geographic district, as well as for only the traditional public schools in the area.[3] The former analysis takes into account all students within the community, while the latter focuses specifically on those who remain in traditional public schools.
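
As a rough illustration of these two samples, the sketch below rolls hypothetical school-level records up to the geographic-district level, computing the 2009 charter enrollment share, an all-schools outcome, and a traditional-public-schools-only outcome, and then applies the enrollment cutoff described in the next paragraph. Column names are hypothetical, and SEDA’s own aggregates are constructed differently.

```python
import pandas as pd

# Hypothetical school-level records; this only illustrates the logic of the two samples.
schools = pd.DataFrame({
    "geo_district_id": ["A", "A", "A", "B", "B"],
    "is_charter":      [True, False, False, False, False],
    "enrollment_2009": [4_000, 9_000, 7_000, 12_000, 8_000],
    "cohort_slope":    [0.03, 0.01, -0.01, 0.02, 0.00],
})

def weighted_mean(df, value_col, weight_col):
    """Enrollment-weighted mean of a school-level outcome."""
    return (df[value_col] * df[weight_col]).sum() / df[weight_col].sum()

by_district = schools.groupby("geo_district_id")

district = pd.DataFrame({
    # Primary independent variable: share of 2009 enrollment in charter schools.
    "charter_share_2009": by_district.apply(
        lambda g: g.loc[g["is_charter"], "enrollment_2009"].sum()
                  / g["enrollment_2009"].sum()),
    # Outcome for all public schools (charter and traditional) in the district.
    "cohort_slope_all": by_district.apply(
        lambda g: weighted_mean(g, "cohort_slope", "enrollment_2009")),
    # Outcome for traditional public schools only.
    "cohort_slope_tps": by_district.apply(
        lambda g: weighted_mean(g[~g["is_charter"]], "cohort_slope", "enrollment_2009")),
})

# Keep districts with at least 10,000 students, matching the report's sample restriction.
district = district[by_district["enrollment_2009"].sum() >= 10_000]
print(district)
```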

The analysis includes districts with at least 10,000 students enrolled, which account for about half the nation’s public school students.

Method

I use SEDA data to address a simple descriptive research question: What is the association between the proportion of students enrolled in charter schools within a geographic school district in 2009 and its students’ test-score growth between 2009 and 2016?

To address this question, I run regression models where the dependent variable is a measure of average test-score growth for students within the geographic district between 2009 and 2016 (grade slope or cohort slope), and the independent variable of interest is the proportion of students within the district who were enrolled in a charter school as of 2009. The primary models include controls for urbanicity as well as racial composition and measures of community socioeconomic status as of 2009, though the results are qualitatively similar without such controls. I present the results from unweighted models, though weighting for the number of students in the district has no meaningful effect on the estimates.
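
A hedged sketch of a regression of this kind, written with statsmodels and hypothetical variable names for the controls (the report’s actual specification and control set may differ), might look like the following.

```python
import statsmodels.formula.api as smf

# `district` is assumed to be a district-level DataFrame like the one sketched above,
# augmented with hypothetical 2009 control variables; the report's actual controls
# and coding may differ.
model = smf.ols(
    "cohort_slope_tps ~ charter_share_2009 + C(urbanicity)"
    " + pct_black_2009 + pct_hispanic_2009 + ses_index_2009",
    data=district,
).fit()

# The coefficient on charter_share_2009 is the association of interest.
print(model.params["charter_share_2009"], model.pvalues["charter_share_2009"])

# An enrollment-weighted variant (the report notes that weighting makes little
# difference), assuming a hypothetical total-enrollment column:
weighted = smf.wls(
    "cohort_slope_tps ~ charter_share_2009 + C(urbanicity)"
    " + pct_black_2009 + pct_hispanic_2009 + ses_index_2009",
    data=district,
    weights=district["total_enrollment_2009"],
).fit()
```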

Because my focus is the relationship between charter school exposure at a given point in time and later outcomes, the analysis considers test-score growth through 2016 and charter school exposure as of 2009, not the growth of the charter sector during that period. Though it does not impede interpretation of the main results, it is worth noting that later outcomes could be influenced by continued growth of the charter sector after 2009. There is a statistically significant but mild correlation (r = 0.18) between the proportion of students in charter schools in 2009 and the change in charter school enrollment between 2009 and 2015.
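
That correlation can be checked with a one-line computation; the sketch below assumes a hypothetical district-level column holding the 2009–15 change in charter enrollment share.

```python
from scipy.stats import pearsonr

# Hypothetical columns: 2009 charter share and its change between 2009 and 2015.
r, p_value = pearsonr(district["charter_share_2009"],
                      district["charter_change_2009_2015"])
print(f"r = {r:.2f} (p = {p_value:.3f})")
```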

Results[4]

Figure 1 reports coefficient estimates for the relationship between the proportion of students enrolled in a charter school in 2009 and measures of later test-score growth within a geographic school district, both overall and for traditional public schools only. The results in the first row come from models that include only traditional public schools, which are the primary concern of this analysis. In both math and English Language Arts (ELA), there is a statistically significant but very small positive relationship between 2009 charter school exposure and the rate at which test scores change across student cohorts (cohort slope). There is no statistically significant relationship between 2009 charter school exposure and the rate at which test scores change across grades within a cohort (grade slope).

The second row in the table reports results for all students (in charters and traditional public schools) within the geographic district. These models evaluate the extent to which charter exposure is associated with overall changes in student test scores within the area, regardless of the sector that the students attend. There is a significant but small positive relationship between 2009 charter school exposure and overall test-score growth within an area on both subject tests.

Figure 2 widens the scope of the analysis by illustrating the relationship between 2009 charter school exposure and math growth, according to the cohort-slope measure for traditional public schools only.[5] Analyzing the full sample in this way makes clear just how limited the relationship between charter exposure and traditional public school outcomes is across the U.S. Each dot on the figure is a geographic school district. The horizontal axis is the percentage of students who were enrolled in a charter school as of 2009, and the vertical axis is growth according to the cohort-slope measure between 2009 and 2016, after controlling for other demographic factors.[6] A school district at the zero-point on the vertical axis, then, had no change in its cohort slope compared with districts with the same demographics. The line running through the middle represents the regression’s estimate (reported in Figure 1) for the relationship between 2009 charter school exposure and the cohort slope. Dots that fall below the line are districts that experienced lower growth on the cohort-slope measure than the regression predicts, and dots above the line are districts that gained more than the regression predicts.
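
A figure of this kind could be reproduced in rough form with the sketch below: the cohort-slope outcome is first residualized against the demographic controls, and the residuals are then plotted against the 2009 charter share along with a fitted line. Variable names are hypothetical, and the exact construction of Figure 2 may differ.

```python
import matplotlib.pyplot as plt
import numpy as np
import statsmodels.formula.api as smf

# Residualize the outcome on the demographic controls (hypothetical variable names),
# so the vertical axis shows growth relative to demographically similar districts.
controls = smf.ols(
    "cohort_slope_tps ~ C(urbanicity) + pct_black_2009 + pct_hispanic_2009 + ses_index_2009",
    data=district,
).fit()
district["cohort_slope_resid"] = controls.resid

# Scatter of residualized growth against 2009 charter share, with a fitted line.
x = district["charter_share_2009"]
y = district["cohort_slope_resid"]
slope, intercept = np.polyfit(x, y, 1)

fig, ax = plt.subplots()
ax.scatter(x, y, alpha=0.5)
grid = np.linspace(x.min(), x.max(), 100)
ax.plot(grid, intercept + slope * grid)
ax.axhline(0, linestyle="--", linewidth=0.8)
ax.set_xlabel("Share of district students in charter schools, 2009")
ax.set_ylabel("Cohort slope, traditional public schools (residualized)")
plt.show()
```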

Notice that the regression line is very flat, with only a slight upward trajectory. A completely horizontal line would imply no correlation at all between charter school exposure and the cohort slope. This pattern is consistent with the estimates reported in Figure 1. However, the figure clarifies just how weak the relationship is across geographic districts, regardless of their level of charter school exposure.

The most important insight to gain from Figure 2 is that at just about every point on the horizontal axis, that is, for any amount of charter school exposure, there are almost as many districts below the line as there are above it. Among districts with very high charter exposure, some did make meaningful gains on the cohort slope while others saw declines. Similarly, among districts with little or no charter exposure, similar numbers saw increases or decreases on the cohort-slope measure. While it is easy to find specific examples of areas with high charter exposure and either improving or declining outcomes, no clear relationship emerges when we look across all geographic school districts.

Summary

For many, the idea that competition from charter schools leads to lower student outcomes within traditional public schools is intuitively plausible. But after more than two decades of rapidly expanding charter school sectors, all across the country, there is little evidence to support that idea. Between 2009 and 2016, there was little, if any, relationship between the proportion of charter school students within a geographic district at the beginning of the period and the test-score growth within traditional public schools by the end of the period.

The results from this analysis should not be construed as proof that charter school expansion does not affect traditional public school outcomes. The findings described above are entirely descriptive. The analysis cannot rule out other factors that may have systematically influenced the test-score outcomes of districts in areas with more or less charter school exposure.

However, even if it were true that charter school exposure did hamper public schools, the analysis here suggests that, in practice, public school systems have responded in ways that counterbalance that negative impact. This analysis, combined with recent within-locality studies designed to support causal claims, which have found that charter exposure has either no influence or a positive influence on public school outcomes, is compelling, if not dispositive. The burden of proof remains on those who argue that expansive charter school sectors hurt students in traditional public schools.

Endnotes

See endnotes in PDF
