Below is a critique of a new report on how charter schools affect the finances of public school districts. I am publishing this because the report being reviewed received publicity for its contention that charter schools don’t hurt the finances of public school districts and was sent to policymakers — even though the data on which it is based does not support that conclusion. Some of this is technical, but the issue is an important one.
The report, “Robbers or Victims: Charter Schools and District Finances,” was published by the Thomas B. Fordham Institute, a conservative think tank that supports charter schools and standardized-test-based school reform.
The critique was written by Carol Burris, a former award-winning school principal who is now executive director of the Network for Public Education, a nonprofit advocacy group that opposes the expansion of charter schools and standardized-test-based reform. She has been chronicling school reform on this blog for years.
After the post is a comment from Mark Weber, whose work was the source material for the Fordham report, and a response from Fordham.
By Carol Burris
A recent study published by the Thomas B. Fordham Institute, titled “Robbers or Victims: Charter Schools and District Finances,” was rolled out with fanfare and sent to policymakers across the country. When the Fordham Institute sent out its mass email, its subject line read: “New report finds charter schools pose no fiscal threat to local districts.” That subject line is not supported even by the institute’s own flawed study.
The study was funded by the Walton Foundation and the Fordham Foundation, the Fordham Institute’s related organization. It is worth noting that the Fordham Foundation sponsors 11 charter schools in Ohio, for which it receives administrative fees, and the Walton Foundation is a major funder of charter schools.
The origin of the study, unacknowledged in the report, is author Mark Weber’s 2019 doctoral dissertation. Advocacy organizations are often accused of cherry-picking examples. With “Robbers or Victims,” Fordham cherry-picked a study on which to base its puffery. In a Fordham podcast and on his blog, Weber reports that Fordham approached him to author its report after reading his dissertation, which comprises three papers, the first of which is the basis of the Fordham report. Weber is a public school teacher in New Jersey and an instructor of public school finance at Rutgers University in New Jersey.
In both the dissertation and the report, Weber attempts to show how districts’ revenue and spending change as charter schools expand. He found that in most cases in the states whose data he analyzed, revenue and expenditures either increased or stayed the same as the number of students attending charters located in the district went up. In all cases, the evidence shows only correlation, not causation.
For those not familiar with the distinction, a correlation occurs when two observations follow the same trend line. It does not present evidence that one causes the other. The classic example is the correlation between ice cream sales and murder rates — both are higher during summer months in big cities and then drop as the weather gets cooler. Then, there are hilarious examples of spurious correlations that show the associations between such oddities as the age of Miss America and murders by steam, hot vapors and hot objects.
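To see how easily two unrelated series that drift upward together produce a strong correlation, consider this toy sketch in Python. Every number here is invented purely for illustration; none comes from the report or the dissertation.

```python
# Toy illustration: two series that each rise over time are strongly
# correlated even though neither causes the other.
def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

years = range(2013, 2019)
# Hypothetical figures: per-pupil spending and charter enrollment share
# both drift upward over the same years, for unrelated reasons.
spending = [10_000 + 300 * i for i, _ in enumerate(years)]
charter_share = [0.05 + 0.01 * i for i, _ in enumerate(years)]

print(round(pearson(spending, charter_share), 3))  # → 1.0
```

A perfect correlation, and yet the one series tells us nothing about what caused the other — the same trap as ice cream sales and murder rates.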
Yet, despite knowing better, Fordham latches onto the correlation and concludes that it appears charters do no financial harm to districts.
And in its news brief about the report, the National Alliance of Charter School Authorizers takes the claim one step further, saying: “Their findings show that if anything, increasing charter school enrollment has a positive fiscal impact on local districts.” That is false and misleading. “Impact” means that the study can support a causal inference. It clearly does not. But that is not the end of this study’s problems.
The critical question not posed
There is an obvious question that is neither posed nor answered. How do increases and decreases in district revenue and spending compare to districts without charters? Are the comparative rates higher, lower or the same?
I read the Fordham report and Weber’s dissertation three times in search of that answer or at least a discussion of the limitation. To ensure I was not misinterpreting the analysis, I emailed Bruce Baker of Rutgers University, a professor and national expert on school finance — and Weber’s dissertation adviser.
Baker acknowledged the absence of comparative data and then went one step further (quoted with his permission). He said: “Comparing districts experiencing charter growth with otherwise similar districts (under the same state policy umbrella) not experiencing charter growth is the direction I’ve been trying to push this with a more complicated statistical technique (synthetic control method).
“But even with that, I’m not sure the narrow question applied to the available imprecise data is most important for informing policy,” Baker said. “The point is that the entire endeavor of trying to use these types of data — on these narrowly framed questions — is simply a fraught endeavor and one that added complexity can’t really solve.”
Consider the following oversimplification of the problem. Between 2013 and 2018, national spending on K-12 education increased 17.6 percent as states recovered from the Great Recession. That is the average. Spending changes in the states ranged from a 2 percent decrease in Alaska to a 35.5 percent increase in California. Vermont, with no charter schools at all, had an 18 percent increase in spending. If we look at this from a national perspective, it is a safe guess that revenue followed an upward slope similar to spending. So did the proliferation of charter schools. And so, frankly, did my age.
Bad data and big limitations
The study rests on the measurement of what Weber refers to in his dissertation as “charter penetration,” which attempts to capture the change in the percentage of a district’s students who attend “independent charter schools” physically located within it.
The Fordham study defines “independent charter schools” as “schools that are not part of their host school districts.” The category is a construct Weber created to deal with the federal government’s coding and state funding complexities. About half of the states with charter schools during those years were excluded. Six of the 21 states included had fewer than 3 percent of the state’s students in “independent” charter schools. From this, big generalizations were made.
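To make the measure concrete, here is a rough Python sketch of a penetration rate as the report describes it: every student enrolled in a charter physically located in a district is counted against that district. The function name and figures are mine, invented for illustration; the exact formula is Weber’s.

```python
# Hypothetical sketch (not Weber's actual code): share of all students,
# counting everyone enrolled in charters physically located in the
# district, who attend those charters.
def charter_penetration(district_students, charter_students_in_district):
    total = district_students + charter_students_in_district
    return charter_students_in_district / total

# A Midland Borough-style case from the critique: a 272-student K-8
# district whose boundaries happen to contain charters enrolling
# roughly 10,000 students drawn from across the state.
rate = charter_penetration(272, 10_040)
print(round(rate, 2))  # → 0.97
```

Because every pupil at those charters is attributed to the tiny host district, the measure makes it look saturated with charter students — which is precisely the critique’s point.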
But there are more serious problems. Weber’s computations include the number of all students that attend independent charter schools physically located in the district. However, students from outside the district can attend charter schools, too. Weber acknowledges this problem in his dissertation and on page 21 of the Robbers or Victims report but claims its effects are limited. I believe, however, that it is highly problematic. Here are a few examples.
On pages 55-57 of his dissertation, Weber lists the 100 school districts with the highest charter penetration rates. Some of the California rates are absurd due to that state’s policy of allowing districts to authorize charters in other districts. Let’s set those aside.
Midland Borough School District in Pennsylvania is listed as having a total population of 10,312 students in its public and charter schools. This is surprising because the Midland Borough School District has only 272 students in its K-8 schools. It has no high school. However, within the boundaries of this K-8 district is the physical address of a virtual school, the PA Cyber Charter, which pulls students from all over the state, and a brick-and-mortar specialty charter high school, the Lincoln Park Performing Arts High School.
Wainscott Common School District in New York is a one-room schoolhouse that serves 26 students in grades K-3. According to Weber’s list, in 2015, over 76 percent of the district’s students attended a charter school located in the district. What charter school would that be? For a time, the Child Development Center of the Hamptons Charter School was physically located in the district. It was a K-5 school, primarily for special-needs students (44 percent of its students had disabilities), designed to serve children who lived within a 50-mile radius of the school. Might one or two of this tiny district’s students attend it? Perhaps. But not 76 percent.
The problem of identifying districts as having "high charter penetration" when they do not goes well beyond the issue of including virtual charter schools (which in 2017 enrolled 300,000 students), regional charter schools, and special needs charter schools. Charter schools, especially those located outside of cities, pull from other districts.
Let’s take, for example, Weber’s home state of New Jersey. Weber placed the Frelinghuysen Township School District on his top 100 list. It is a small elementary district that, in the 2017-2018 school year, had only 142 students. Within its boundaries is the Ridge and Valley Charter School. Weber reports that 46 percent of the district’s students attend the charter. That is likely not true.
Based on data obtained from an Open Public Records Request from the New Jersey Department of Education, we know that two years later, only nine district students were attending the charter school. The rest of the school’s students came from numerous surrounding districts, the data showed. That same year, only 12 of the 213 students from the Sparta Technology Charter School came from the large home district of over 3,000 students. And 96 of the 242 students of Unity Charter School came from the Morris School District of over 5,000 students, in which it was located.
The assumption that nearly all students in charter schools come from the district where the school is located is fallacious in states with many small, local districts and porous charter attendance zones. And it is nearly always the case with virtual charter schools. Some districts are included that should not be included. Error is introduced into district calculations, which in turn affects state results.
What about taxpayers? And how should we determine cost?
Weber rightly concludes that some of the observed increases in per-pupil spending are due to the inefficiency of running dual systems, charter and public. When 25 or 30 students leave an elementary school for a charter, you can't get rid of the principal or lower the heat. You may not be able to decrease staff if the students who leave are from various grade levels. These are known as stranded costs.
Weber refers to these as inefficiencies. Superintendent Joe Roy of Bethlehem, Pa., calls them costs that he is forced to pass on to taxpayers.
Back in 2017, I asked Roy how much charters cost his taxpayers. Roy told me that the district budgeted $26 million (about 10 percent of its annual budget) that year to pay for tuition and associated costs to charter schools. According to Roy, “We estimate that if all of the students in charters returned, even with hiring the additional needed staff, we would save $20 million. This is the cost of school choice.”
What Roy and his business staff did — determine how costs would rise if charter students were in the district and then deduct the costs associated with sending them to charters — is the only true way to determine the effects of charter schools on district finances.
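Roy’s back-of-envelope method can be written out explicitly. The $26 million figure comes from the article; the $6 million cost of absorbing returning students is my own inference from his numbers ($26 million paid out minus $20 million saved), used here only to show the arithmetic.

```python
# Bethlehem-style calculation, per Superintendent Roy's description.
charter_payments = 26_000_000  # tuition and associated costs paid to charters (from the article)
absorption_cost = 6_000_000    # inferred cost of added staff if all charter students returned
net_cost_of_choice = charter_payments - absorption_cost
print(net_cost_of_choice)  # → 20000000, Roy's estimated savings if students returned
```

The point of the exercise is the direction of the comparison: estimate what it would cost to educate the students in-district, then subtract that from what is actually sent to charters.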
In 2018, In the Public Interest, a national nonprofit research and policy organization, did precisely that kind of analysis to determine the extra costs of charter schools in three large California school districts. Their analysis found that the studied districts lost tens of millions of dollars each year that would be recouped if students attended public schools instead of charters. You can read that report, written by University of Oregon professor Gordon Lafer, here.
Such reports are tedious, difficult work best done at a small scale. They do not lend themselves to models cranked out of statistical software packages. They require state-by-state sampling, and, depending on how each state finances charter schools, results will vary. However, taxpayers deserve an accurate picture of charter costs, not a guesstimate.
Despite Weber’s sincere efforts, Fordham’s “Robbers or Victims” makes no substantive contribution toward determining those costs, other than demonstrating what a fraught endeavor it is to answer such an important question with big, unexamined datasets and incomplete federal reports. Weber has made it clear what the study says and does not say.
However, Fordham’s distribution of the report to state Boards of Education, commissioners and other policymakers with a misleading headline does a disservice to public schools and to taxpayers alike.
Here’s a comment from Mark Weber:
My thanks to Dr. Burris for her thorough reading of my paper. I’ve addressed many of her critiques on my blog, which readers can find here:
Let me add a few points:
- Work like this, which uses large datasets and econometric methods, is always subject to limitations of both the data and the strategies employed. It is true that my method for determining the level of charter school enrollment share can’t account for enrollments across school district boundaries. However, it is the most reasonable proxy we have using the federal data. As I note, other studies, similar to mine, have used this method and compared it to actual enrollment data from individual states; these studies find similar correlations between charter share and district fiscal measures when using both methods. This includes my own work using New Jersey data:
- While I take Dr. Burris’s point about causation vs. correlation, the finding of a correlation remains, to my view, an important one. Using the standard methods I employ, there does appear to be a link between charter share and “hosting” district spending and revenues, even when accounting for differences between districts and overall trends. This runs counter to the pronouncements of many charter critics and supporters.
- The most reasonable explanation, to my mind, is that as districts lose students to charter schools, they must still cover their fixed costs. In other words: if a district loses 10 percent of its enrollments to charters, it can’t easily cut the costs of, for example, maintaining its school buildings by 10 percent as well. In several states, the evidence suggests that local revenues are the source of this increased per pupil spending. This makes sense as, in many states, charter schools are directly funded by the state and not localities.
- That said, I agree that my report does not provide evidence that charter school growth does not harm school districts’ fiscal health. The fact is that this report can’t answer that question. My hope, however, is that it does provide a framework for having a more informed discussion about the costs of charter schools to the entire K-12 system.
Fordham said in a statement that its report’s findings “are subject to interpretation” but that it stands by its report. It said in part:
“Dr. Weber’s finding that districts’ instructional spending per pupil — which includes things like teacher salaries, benefits, and classroom supplies — didn’t decline significantly in any state suggests that whatever additional ‘costs’ districts incur are covered where it really matters.
“In other words, we stand by our interpretation of the findings, which is that independent charter schools don’t pose a fiscal threat to local school districts that experience sizable increases in total revenues per student.”
It also said that the institute’s position on charter schools “is informed not only by the thirty-plus studies we’ve published on the subject but by our work as a charter school authorizer in Ohio,” and that the fees it receives for overseeing its schools “only cover our authorizing expenses” and that any excess funds are returned to the schools.