
No Single Explanation For Md. Test Score Bump

Recent reports from the Thomas B. Fordham Institute and Fuller's group, Policy Analysis for California Education, have concluded that most of the recent gains on state tests are illusory, reflecting better test-taking skills or lower standards rather than increased knowledge. Another study, from the Center on Education Policy, concluded that the gains seemed genuine but did not necessarily reflect greater learning.

The reports compared state results with other tests, such as the National Assessment of Educational Progress and Measures of Academic Progress, which suggest that academic skills are improving at a slower rate, if at all.

"When the arrows don't point in the same direction, you have to at least ask yourself what's going on here," said Chester E. Finn Jr., Fordham Institute president.

Maryland's controversy came as schools reveled in a sense of collective achievement after six years of testing. The overall pass rate on the exams, given in grades 3 through 8, reached 82 percent in reading and 76 percent in math this year, more than 20 percentage points higher than the proficiency rates of 2003.

Until this year, Maryland's tests comprised two distinct sections. One was a series of multiple-choice and written items -- about 35 in reading and twice that many in math -- that measured students against state standards. The other was a series of purely multiple-choice items culled from commercial tests, administered to yield nationally normed percentile scores for Maryland students. The state dropped the second section, judging it irrelevant because the questions were not derived from Maryland's curriculum.

An independent panel of psychometricians validated the revised test, affirming that it was neither easier nor harder than last year's, although the group spent a few minutes in a conference call pondering unusually large gains in certain grades. (The share of fifth-grade students rated "advanced" in reading, the highest of three performance levels, rose an unprecedented 18 percentage points.)

Some panel members say the changes might have contributed to the higher scores. How a student performs on a test item depends partly on what comes before and after, factors that could affect concentration or confidence.

"So we have to ask . . . 'What are the effects of context on student scores?' " William Schafer, a University of Maryland professor emeritus who is on the panel, wrote in an e-mail. "Not much is known in the public literature about that question."

When Grasmick and Peiffer reviewed the scores a week before their release, neither questioned the gains. The results made sense, Peiffer said: They were driven by large increases in historically low-performing Prince George's County and Baltimore, systems with dynamic leaders and well-documented reforms.

"It's not as if these are results we weren't expecting," Peiffer said.

Nonetheless, some in the research community are pushing for an overhaul of the national testing apparatus. A simple fix: Require states to announce any change that might affect scores. A more radical solution: a national test, immune to state manipulation.

"I think most people are trying to do the right thing," said Jack Jennings, president of the Center on Education Policy. "But the pressure to get results is enormous, and some people fail. Some people sin."


