I am sick of report cards. I don't mean the ones I used to get in school. Fortunately, I no longer remember those. I mean the report cards I receive from well-meaning research groups showing how well each state is doing in some educational endeavor.
They give each state an A, B, C, D or F, usually on something related to the No Child Left Behind law. Are the tests aligned with the lessons? Are the standards specific enough? Are the teacher qualifications tough enough? Are they spending enough money? Are they progressive enough? Are they traditional enough?
I grant you it is fun to give a whole state a D or an F in something. I can't get back at the college professor who (correctly, I regret to say) nearly flunked me in Chinese. But it would give me a pleasant feeling to tell, oh, let's pick a big state, Michigan, that it is doing D work in improving teacher quality.
Yet looking at all those grades makes my head hurt. Each rating system is different. Most are pretty vague. I am never quite sure what they mean, except in one case I just found that deserves wider attention than it is likely to get.
This unusually clear and compelling report card appears in the summer 2005 issue of Education Next (www.educationnext.org), a well-edited, lively and expensive ($7 an issue) journal of opinion and research on education put out by experts at Harvard and the Hoover Institution at Stanford.
The editors of Education Next lean toward the pro-testing, pro-charter school, pro-voucher side of the national education debate, but they often publish intriguing stories that conflict with what some critics might see as their political slant. Their new report card is refreshingly short -- just two pages for the explanation and the list of states. And it seems to expose shortcomings in both those states that are critical of No Child Left Behind and those that are not.
The new report card also helps explain something that has puzzled me for some time. In my research for a book on the KIPP schools, a group of public charter middle schools found mostly in inner cities, I have noticed that the average reading and math proficiency rates for students entering KIPP schools vary widely depending on which state they are in. The KIPP kids are very much alike in all 15 states and the District that have those schools. About 85 percent of them have family incomes low enough to qualify for federally subsidized lunches. Nearly all are African American or Hispanic. They live mostly in urban or rural neighborhoods where school achievement levels are very low.
And yet students entering the KIPP school in Gaston, N.C., have recently had passing rates on their state tests of 53 percent in reading and 75 percent in math, while the new students at the KIPP school in the south Bronx have had passing rates of 30 percent in reading and 43 percent in math on the New York tests. How can that be? The North Carolina schools have shown some improvement lately, but it is highly unlikely their proficiency rates in reading and math are twice as high as those in New York.
Obviously, some states define the word "proficient" differently than others, as the No Child Left Behind law allows them to do. It is easier to get yourself labeled proficient in North Carolina than it is in New York, and that to me is a problem. How are the citizens of different states to know how well their schools compare to schools in other states if the standards are all over the map? Americans move occasionally, and they should not have to scratch their heads about a new neighborhood school that seems very much like their old one in a different state, but claims a proficiency rate 30 percentage points higher.
There is, thankfully, a national test given to primary and middle school children in all states that can be used as a measuring stick for state standards. It is the National Assessment of Educational Progress (NAEP, pronounced 'nape'). Only sample groups of students take the NAEP test in each state, but it is enough to give us a rough idea of how much the state standards diverge from NAEP. The authors of the Education Next report card, Harvard political scientist Paul E. Peterson and American Enterprise Institute education policy studies director Frederick M. Hess, have used that comparative tool to show which states are pumping up their proficiency rates in what some might consider a deceptive way.
Under state standards in Texas, for instance, 87 percent of fourth graders are said to be proficient in math. Under the NAEP standards, only 33 percent of fourth grade Texans are considered proficient. Peterson and Hess say that gap is so large, as measured by a statistical tool called standard deviations, that it earns the Lone Star State an F on their report card. (And that shows that at least in this case politics is not getting in the way. Education Next's editors tend to support much of President Bush's agenda in education, but here they are trashing the home state of the president and both persons who have served as U.S. education secretary during his administration.)
In South Carolina, on the other hand, 31 percent of fourth graders were deemed proficient on the state's reading test, while 26 percent of fourth graders given the NAEP reading test were said to be proficient. That difference is small enough, Peterson and Hess say, to give South Carolina an A on their report card. Of the 40 states they assess, only 11 get grades of B- or above.
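Peterson and Hess report these gaps in standard deviations, though the column does not spell out their exact procedure. One common way to make such a comparison (a sketch under that assumption, not necessarily their method) is to treat each proficiency rate as a passing cutoff on a normal distribution of student achievement, then measure how far apart the two implied cutoffs sit:

```python
from statistics import NormalDist

def sd_gap(state_pct: float, naep_pct: float) -> float:
    """Distance between the NAEP cutoff and the state cutoff,
    in standard deviations. Each proficiency rate implies a
    cutoff z such that (100 - rate) percent of a standard
    normal distribution falls below it; a higher pass rate
    means a lower (easier) cutoff.

    Illustrative only -- Peterson and Hess's actual
    calculation may differ in its details."""
    nd = NormalDist()  # standard normal: mean 0, sd 1
    z_state = nd.inv_cdf(1 - state_pct / 100)  # state's implied cutoff
    z_naep = nd.inv_cdf(1 - naep_pct / 100)    # NAEP's implied cutoff
    return z_naep - z_state  # positive means the state bar is easier

# Texas fourth-grade math: 87 percent proficient by the state
# test, 33 percent by NAEP -- a gap of roughly 1.6 standard
# deviations.
print(round(sd_gap(87, 33), 2))

# South Carolina fourth-grade reading: 31 percent by the state
# test, 26 percent by NAEP -- a gap of roughly 0.15 standard
# deviations.
print(round(sd_gap(31, 26), 2))
```

On this rough yardstick, the Texas test sets its bar well over a full standard deviation below NAEP's, while South Carolina's sits within a sliver of it, which is consistent with the F and the A the two states received.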
Some experts think the NAEP standards are too tough and label as not proficient many students who can read, write and do arithmetic. I asked testing expert Gerald W. Bracey about this, and he said, "The NAEP achievement levels are wacko."
Peterson and Hess don't address that issue. But at least they give us the same measuring stick for all states. And I suspect that even if NAEP sets the bar too high, it is just as clear that 87 percent of Texas fourth graders have not really mastered their math facts.
One of the more useful provisions of the No Child Left Behind law requires states to give the NAEP test to at least some of their students. In the same issue of Education Next, Harvard economist Caroline Hoxby suggests that NAEP become "a national benchmark to clarify state standards." That would be a blow to federalism, of course, but it might also help more students learn.
Here in the Washington area, where I am paid to keep track of how the local schools are doing, Maryland got a C+ and Virginia got a D+ on the Peterson/Hess report card. Maryland reported a 60 percent proficiency in eighth-grade reading, compared to only 31 percent on the NAEP test. Virginia reported 70 percent eighth-grade reading proficiency, compared to 36 percent on the NAEP.
Virginia spokesman Charles Pyle objected to this analysis in the way I suspect most states showing achievement gains would. "The percentages of Virginia students in grades 4 and 8 who score at or above the proficient level on the NAEP in reading and mathematics have increased dramatically since Virginia began its reform and are comparable or significantly higher than corresponding percentages in the states that received straight A's," he said.
Maryland State Education Department spokesman Ron Peiffer said: "Maryland tests were designed for No Child Left Behind with the intent of setting standards that would be attainable by all students by 2014. They are rigorous but attainable. NAEP standards were not designed for 100 percent proficiency. Hence, I would expect a bit of dissonance between the two."
The District of Columbia, like 10 states, did not have test data for fourth and eighth graders, the grade levels measured by NAEP, so Peterson and Hess, helped by researcher Mark Linnen, could not make a comparison.
Maryland and Virginia have shown significant improvement in their achievement rates, but it would be helpful to know if they are trying to make themselves look even better by dropping a few lead weights on the scale. If this new report card helps keep everybody honest, I am willing to take a look at it, and see what sort of grades are awarded next year.