Updated: At the end of this post is a comment from John Perry, a reporter and data specialist at The Atlanta Journal-Constitution.
The following post was written by Gary Miron, professor of education at Western Michigan University, who has extensive experience evaluating school reforms and education policies. Over the past two decades he has conducted several studies of school choice programs in Europe and in the United States, including nine state evaluations of charter school reforms. Before coming to Western Michigan University, he worked for 10 years at Stockholm University in Sweden.
By Gary Miron
A year ago, I was one of four academics or test specialists who advised USA Today and its affiliated Gannett newspapers on a multistate analysis of irregularities in assessment data. These journalists worked with an excellent dataset – it had data at the level of individual students, which allowed us to ensure that erratic patterns reflected changes in the scores of individuals, rather than changes in the composition of a school (due to changes in enrollment at a school, for instance). The USA Today dataset also allowed the reporters and analysts to reveal erratic patterns in test scores and erasure rates from year to year.
The resulting USA Today story was therefore able to present a thorough and comprehensive analysis that identified relatively few schools at which there was a very high likelihood that systematic cheating was taking place. The journalists responsible for this story were later recognized with the Philip Meyer Award for investigative journalism. The series of articles resulting from this study prompted a federal Department of Education investigation into the testing practices at District of Columbia schools, and it also resulted in a tightening of security around testing.
Given my past role in reviewing data and methods used for detecting systematic cheating, I was delighted to have the opportunity a week ago to review Ohio assessment data that was being used as part of a national study released today by The Atlanta Journal-Constitution and affiliated Cox newspapers. My review, however, yielded serious concerns about the data used, the methods of analysis employed, and the conclusions drawn. I shared these concerns with journalists at the Dayton Daily News, which is one of the Cox affiliates involved in this story.
To be clear, the Cox analysis may accurately detect large variations in assessment results from year to year. But my own analysis of the data suggests that these irregularities are less likely due to actual cheating than due to mobility in student population (recall the lack of student-level data). Although the Cox news articles on this study offer a disclaimer that their analysis does not actually prove cheating, this disclaimer should be expanded considerably.
In short, here are some of my concerns about the methods:
* As noted, the analysis is based on school-level data and not individual student-level data. Accordingly, it was not possible to ensure that the same students were in the group in both years.
* The analysis of irregular jumps in test scores should have been coupled with irregularities in erasure data where this data was available.
* The analysis by Cox generates predicted values for schools, but these predicted values do not incorporate the demographic characteristics of the student population.
* The limited details available on the study methods made it impossible to replicate and verify what the journalists were doing. Further, the rationale was unclear for some of the steps they took.
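The first concern above can be illustrated with a small simulation. The sketch below is not the Cox methodology (whose details were not available); it simply shows, under hypothetical score distributions invented for illustration, how a school-level average can jump sharply between two years even when no individual student's performance changed — purely because half the tested cohort turned over.

```python
import random
import statistics

random.seed(42)

# Hypothetical ability distributions (all numbers are illustrative assumptions):
# the school's Year 1 cohort averages 700; incoming transfer students happen
# to come from a somewhat higher-scoring population averaging 712.
YEAR1_MEAN, INCOMING_MEAN, SD = 700.0, 712.0, 15.0

# Year 1: a small school with 60 tested students.
year1_scores = [random.gauss(YEAR1_MEAN, SD) for _ in range(60)]
year1 = statistics.mean(year1_scores)

# Year 2, with NO cheating: 30 students stay (scores drawn from the same
# distribution as before), while 30 new students arrive from the
# higher-scoring population.
stayers = [random.gauss(YEAR1_MEAN, SD) for _ in range(30)]
movers_in = [random.gauss(INCOMING_MEAN, SD) for _ in range(30)]
year2 = statistics.mean(stayers + movers_in)

print(f"Year 1 school mean: {year1:.1f}")
print(f"Year 2 school mean: {year2:.1f}")
print(f"Apparent gain:      {year2 - year1:+.1f} points")
```

A school-level analysis sees only the aggregate gain and may flag it as statistically improbable, while student-level data would reveal that the scores of the students present in both years barely moved — the "irregularity" is entirely an artifact of who took the test.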
Yes, there may in fact have been cheating in some of the schools and districts flagged in the report. But it seems likely to me that most of those flagged were not in fact engaged in cheating. If a more thorough and rigorous analysis had been conducted, the number of flagged schools and districts would have declined substantially. With a more focused list of schools and districts, the journalists could have then contacted the schools to see if there were reasonable explanations for irregularities detected.
The resulting news story appears intended to be alarmist, implying that cheating is rampant in our schools. It is fortunate that the journalists in Ohio at least have refrained from reporting the names of the specific schools flagged, since suspicions would have been unfairly cast on hundreds of improperly flagged schools. The irregularities in such schools likely arose simply because there was a large change in the actual students taking the test from year to year.
Given that the methods used were much more likely to identify schools with high mobility, it comes as no surprise that charter schools are highly represented in the flagged schools and that the Houston Public Schools also garnered considerable attention. Recall that Houston was heavily impacted by the influx of students relocating there after Hurricane Katrina in 2005, which is the starting year for the Cox analysis.
We all need to be concerned about cheating and its implications. At the same time, we need to be leery of sensational attempts to secure headlines with weak and incomplete analyses.
The increasing focus and reliance on standardized tests to evaluate schools and teachers is resulting in cheating. That’s probably inevitable. But it’s also probably minimal. The bigger problem is a more serious type of cheating – one that’s perfectly legal and apparently acceptable. Students are being cheated of a broader education that emphasizes a balance of creativity, extracurricular activities, foreign languages, higher math and science skills and other opportunities due to the over-emphasis on testing for basic math and reading. In this sense, a fixation on testing cheats not only our students but also their communities and the future employers who will depend on their creativity and can-do problem-solving. And our democracy is certainly cheated when our youth are unprepared for healthy civic engagement.
Yes, it is important for reporters and others to seriously pursue stories about schools engaged in wrongful practices. The groundwork done by the Cox reporters is part of that (although I wish they had pursued their investigation further and more carefully before publication). But we as a nation are missing the forest for the trees. No cheating on tests is as serious as the cheating done by the tests.
This is a response from John Perry, a reporter and data specialist with the investigative team at The Atlanta Journal-Constitution.
From John Perry:
Dr. Miron reviewed an early version of our results and shared his comments with our sister paper in Dayton. Those comments were conveyed to the statistical consultants at the University of Georgia who were helping us with the research. The Dayton paper later provided Dr. Miron a copy of the revised findings after we made several adjustments based on the analysis by the UGA statisticians and the comments of others with whom we shared the early results. Dr. Miron made no further comments and asked no further questions.
The issue of student mobility that Gary Miron and others have raised was reviewed by our statistical consultants. A high rate of mobility is a characteristic of virtually all inner city high-poverty districts. If it were true that our methodology just flagged mobility instead of potential cheating, then you would expect all urban districts with high mobility to be flagged. This was not the case. In the Ohio data which Dr. Miron reviewed, for example, Cleveland schools, with a better than 30 percent mobility rate, had an average 4 percent of classes flagged by our analysis in 2008-2011. Statewide, about 5 percent of classes were flagged in those years.
Follow The Answer Sheet every day by bookmarking www.washingtonpost.com/blogs/answer-sheet.