A Report Card With Rare Meaning
Tuesday, June 7, 2005; 10:36 AM
I am sick of report cards. I don't mean the ones I used to get in school. Fortunately, I no longer remember those. I mean the report cards I receive from well-meaning research groups showing how well each state is doing in some educational endeavor.
They give each state an A, B, C, D or F, usually on something related to the No Child Left Behind law. Are the tests aligned with the lessons? Are the standards specific enough? Are the teacher qualifications tough enough? Are they spending enough money? Are they progressive enough? Are they traditional enough?
I grant you it is fun to give a whole state a D or an F in something. I can't get back at the college professor who (correctly, I regret to say) nearly flunked me in Chinese. But it would give me a pleasant feeling to tell, oh, let's pick a big state, Michigan, that it is doing D work in improving teacher quality.
Yet looking at all those grades makes my head hurt. Each rating system is different. Most are pretty vague. I am never quite sure what they mean, except in one case I just found that deserves wider attention than it is likely to get.
This unusually clear and compelling report card appears in the summer 2005 issue of Education Next, a well-edited, lively and expensive ($7 an issue) journal of opinion and research on education put out by experts at Harvard and the Hoover Institution at Stanford.
The editors of Education Next lean toward the pro-testing, pro-charter school, pro-voucher side of the national education debate, but they often publish intriguing stories that conflict with what some critics might see as their political slant. Their new report card is refreshingly short -- just two pages for the explanation and the list of states. And it seems to expose shortcomings in both those states that are critical of No Child Left Behind and those that are not.
The new report card also helps explain something that has puzzled me for some time. In my research for a book on the KIPP schools, a group of public charter middle schools found mostly in inner cities, I have noticed that the average reading and math proficiency rates for students entering KIPP schools vary widely depending on which state they are in. The KIPP kids are very much alike in all 15 states and the District that have those schools. About 85 percent of them have family incomes low enough to qualify for federally subsidized lunches. Nearly all are African American or Hispanic. They live mostly in urban or rural neighborhoods where school achievement levels are very low.
And yet students entering the KIPP school in Gaston, N.C., have recently had passing rates on their state tests of 53 percent in reading and 75 percent in math, while the new students at the KIPP school in the south Bronx have had passing rates of 30 percent in reading and 43 percent in math on the New York tests. How can that be? The North Carolina schools have shown some improvement lately, but it is highly unlikely their proficiency rates in reading and math are twice as high as those in New York.
Obviously, some states define the word "proficient" differently than others, as the No Child Left Behind law allows them to do. It is easier to get yourself labeled proficient in North Carolina than it is in New York, and that to me is a problem. How are the citizens of different states to know how well their schools compare to schools in other states if the standards are all over the map? Americans move occasionally, and they should not have to scratch their heads about a new neighborhood school that seems very much like their old one in a different state, but claims a proficiency rate 30 percentage points higher.
There is, thankfully, a national test given to primary and middle school children in all states that can be used as a measuring stick for state standards. It is the National Assessment of Educational Progress (NAEP, pronounced 'nape'). Only sample groups of students take the NAEP test in each state, but it is enough to give us a rough idea of how much the state standards diverge from NAEP. The authors of the Education Next report card, Harvard political scientist Paul E. Peterson and American Enterprise Institute education policy studies director Frederick M. Hess, have used that comparative tool to show which states are pumping up their proficiency rates in what some might consider a deceptive way.
Under state standards in Texas, for instance, 87 percent of fourth graders are said to be proficient in math. Under the NAEP standards, only 33 percent of fourth grade Texans are considered proficient. Peterson and Hess say that gap is so large, as measured by a statistical tool called standard deviations, that it earns the Lone Star State an F on their report card. (And that shows that at least in this case politics is not getting in the way. Education Next's editors tend to support much of President Bush's agenda in education, but here they are trashing the home state of the president and both persons who have served as U.S. education secretary during his administration.)
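For readers curious what "measured in standard deviations" can mean here, a short sketch may help. One common way to compare passing rates on two different tests is to convert each percentage into the cutoff score it implies on a standard bell curve, then measure the distance between the cutoffs. This is only an illustration of that general idea, not necessarily the exact calculation Peterson and Hess performed:

```python
# Illustrative sketch: expressing the gap between a state's proficiency
# rate and its NAEP rate in standard deviation units, by mapping each
# percentage to a z-score cutoff via the inverse normal CDF.
# This is a hypothetical reconstruction, not Peterson and Hess's method.
from statistics import NormalDist

def proficiency_gap_in_sds(state_pct: float, naep_pct: float) -> float:
    """Return the distance, in standard deviations, between the cutoff
    implied by the state passing rate and the one implied by NAEP."""
    nd = NormalDist()  # standard normal curve: mean 0, sd 1
    # A passing rate of p percent puts the cutoff at the (100 - p)th percentile.
    state_cutoff = nd.inv_cdf(1 - state_pct / 100)
    naep_cutoff = nd.inv_cdf(1 - naep_pct / 100)
    return naep_cutoff - state_cutoff

# Texas fourth-grade math: 87 percent proficient on the state test,
# 33 percent proficient on NAEP.
print(proficiency_gap_in_sds(87, 33))
```

On those Texas numbers, the two cutoffs land more than one and a half standard deviations apart, which gives a sense of why the authors treated the gap as failing-grade territory.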
In South Carolina, on the other hand, 31 percent of fourth graders were deemed proficient on the state's reading test, while 26 percent of fourth graders given the NAEP reading test were said to be proficient. That difference is small enough, Peterson and Hess say, to give South Carolina an A on their report card. Of the 40 states they assess, only 11 get grades of B- or above.