Every now and then results from one international assessment or another come out and the United States inevitably winds up somewhere in the middle. There is a great hue and cry about what this means for the future of American democracy (nothing good) and public education is blamed (along with good-for-little teachers). The only solution is to speed up the current smorgasbord of school reforms, the irony being that neither they, nor their predecessor reforms, did anything to improve the U.S. ranking on these international tests. So what are we to make of all of this? James Harvey, executive director of the National Superintendents Roundtable, tells us in this post. Harvey’s Seattle University dissertation was the basis of “School Performance in Context: The Iceberg Effect,” which is available free on the web pages of both the Horace Mann League, an organization that works to strengthen public schools, and the National Superintendents Roundtable.
By James Harvey
International assessments of student performance are hot. Everyone (or nearly everyone) likes them. Winning nations in the league tables gloat. The losers whine. And politicians lust after them like cats craving catnip. But what these assessments leave out is as important as what they include. They are abused, misused, misconstrued and mostly nonsense — but I get ahead of myself.
Here are 10 things you need to know about large-scale international assessments such as the Programme for International Student Assessment (PISA), the Trends in International Mathematics and Science Study (TIMSS), and the Progress in International Reading Literacy Study (PIRLS).
1. These assessments were never intended to line up and rank nations against each other like baseball standings.
That’s right. The statisticians and psychometricians who dreamed up these assessments 50 years ago stated explicitly that the question of whether “the children of country X [are] better educated than those of country Y” was “a false question” due to the innumerable social, cultural, and economic differences among nations. But, hey, that’s just a detail.
2. The “international average” isn’t what you think it is. It’s not a weighted average of all the students in the world, but an average of the national averages.
This means that when calculating the “international average,” the 5,600 students in Liechtenstein, the 700,000 in Ireland, the 860,000 in Finland, the 5 million in Canada, and the 14 million in Japan carry exactly the same weight as the 56 million students in the United States.
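For readers who want to see the distinction concretely, here is a small sketch. The student counts come from the figures above; the scores attached to each country are purely hypothetical and chosen only to show how an unweighted mean of national averages can diverge from a mean that counts every student equally.

```python
# Student counts from the text; the scores are HYPOTHETICAL, for illustration only.
countries = {
    "Liechtenstein": (5_600, 530),
    "Ireland": (700_000, 510),
    "Finland": (860_000, 525),
    "Canada": (5_000_000, 515),
    "Japan": (14_000_000, 535),
    "United States": (56_000_000, 495),
}

# The "international average" as reported: a simple mean of the national averages.
unweighted = sum(score for _, score in countries.values()) / len(countries)

# A population-weighted average, in which every student counts equally.
total_students = sum(n for n, _ in countries.values())
weighted = sum(n * score for n, score in countries.values()) / total_students

print(round(unweighted, 1))  # 518.3 — Liechtenstein's 5,600 students count as much as 56 million
print(round(weighted, 1))    # 504.1 — dominated by the largest systems
```

With these made-up numbers the two figures differ by about 14 points, entirely because the small, high-scoring systems count as full "votes" in the unweighted mean.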
3. These assessments compare apples and oranges.
Do you think there’s anything to be learned from comparing the average performance of 5,600 wealthy white students in Liechtenstein with 56 million diverse students in the United States? Really? How about comparing our students with students in corrupt dictatorships like Kazakhstan, religious monarchies like Qatar, or the wealthiest city in China (Shanghai) after it has driven the children of low-income migrants back to their home provinces? As a report released in January by the Horace Mann League and the National Superintendents Roundtable, “School Performance in Context: The Iceberg Effect,” makes clear, these are just a few of the peculiar comparisons that lie behind these international assessment results.
4. The accuracy of the national samples ranges from questionable to abominable.
The Iceberg Effect also draws attention to the difficulties of matching comparable and accurate samples in the dozens of nations that participate in these studies.
Two analysts, Martin Carnoy of Stanford and Richard Rothstein of the Economic Policy Institute, examined a sample of U.S. students in PISA and found it deficient on several factors, including an over-estimate of the number of low-income urban students and an under-estimate of the number of low-income rural students. That error alone, thought Carnoy and Rothstein, lowered the U.S. average by one to two points. At the other end of the spectrum, we find the No. 1 PISA Shanghai school system, one that Asian scholars describe as “an apartheid system” because it has systematically excluded low-income migrant children. Officials with the Organization for Economic Cooperation and Development, which sponsors PISA, last year acknowledged before an inquiry in the British House of Commons that they had somehow overlooked at least 27 percent of the potential students in Shanghai.
5. The horse-race tables ignore differences in poverty, inequity, and social stress among nations.
Fifty years of research in the United States and abroad documents a powerful correlation between low student achievement and poverty and disadvantage. Yet reports on these international assessments blandly turn a blind eye to the implications of this research. The data are clear: Poverty rates among American students are five times higher than they are in Finland. China aside, we have the highest rates of income inequality in the nine nations examined in The Iceberg Effect. The rate of violent deaths in American communities is eight times the average rate in the other eight nations and 13 times greater than it is in Japan. All of that is ignored in the orgy of publicity organized by the sponsoring agencies of these assessments to highlight their findings.
6. Assessment reports act as though social support for families was consistent and uniform across dozens of nations.
These international assessments also ignore how support for families with children varies among nations—cash support, in-kind contributions like food stamps, tax preferences such as the Earned Income Tax Credit in the United States, access to preschool, and paid home maternity leave. Of the eight nations for which figures are available on this indicator in The Iceberg Effect (China is missing), the United States ranks dead last in expenditures as a percentage of Gross Domestic Product supporting families with young children.
7. Assessment results for elementary school children are consistently ignored or under-reported.
Here’s something to puzzle over. These international assessments consistently report that the achievement of American students in Grade 4 (in reading, mathematics, and science) is quite high. But American results in middle school (seventh and eighth grades for TIMSS; mostly ninth and tenth grades for PISA) are only average. There might be something to learn here. But apart from cheap shots along the lines of “the longer American students remain in school, the worse they do,” nobody has thought to explore the implications of this contrast. Could it be that the disappointing relative performance of older U.S. students is related to their maturation and growing awareness of the inequities and violence around them?
8. Assessment results for 15-year-olds are treated as though age 15 marks the end of the educational line.
The OECD justified basing its PISA assessment on 15-year-olds because, in much of the world, that age marked the end of school for most students. But that’s patently not the case in most advanced economies, including the United States. Most American students are still in high school at the age of 17 or 18, and many then go on to college. Indeed, the United States is the original “second chance” nation. School dropouts in the United States can gain a high school diploma via the GED and even enroll in community college courses without a diploma.
9. There are no assessment results comparing the performance of secondary school graduates across nations.
Although TIMSS mounted an international assessment of twelfth graders (and the equivalent elsewhere) in 1995, seniors have not been assessed since. One consequence, says the report from the Horace Mann League and the National Superintendents Roundtable, is that policymakers and the public are making summative judgments about the relative success of their elementary and secondary school systems on the basis of data that were not designed to support such judgments (namely assessments of 15-year-olds) and cannot possibly support them.
10. The United States has the most highly educated adult population in the world in terms of years of schooling completed and possession of high school diplomas and college degrees.
Among the surprising findings in The Iceberg Effect was this: Despite 30 years of gloom-and-doom headlines about the failures of American schools and the near-panic that other nations were about to eat our lunch, the most highly educated adults in the world (those aged 25 and over) are Americans. Canada and the United Kingdom come close on some measures, but nobody tops us on any of them.
The average adult American has completed about 13.5 years of schooling; the average Chinese adult, about seven. Fully 89 percent of American adults hold a high school diploma, compared to 22 percent of Chinese adults. The difference among four-year degree holders is more stark: 31 percent of adult Americans hold a bachelor’s degree, a figure that in China shrinks to 4 percent. American 15-year-olds, by the way, represent 25 percent of high-achieving science students in the world, with Japan coming up in second place at 13 percent.
The United States still has many strengths. We shouldn’t let the Debbie Downers of the world obscure that central truth.
A New Challenge
None of this is intended to sugarcoat the very real problem the United States faces in its schools. Our schools have done a reasonably good job with the traditional students they were designed to educate. Now they face a new challenge: a population in which the majority of students are, for the first time in our history, both low-income and children of color.
We need to get on with that task. But the challenge is not addressed by hyperventilating about highly questionable international comparisons of student achievement.