State Gains Not Echoed In Federal Testing

By Daniel de Vise
Washington Post Staff Writer
Monday, October 24, 2005

In Maryland and Virginia public schools, statewide exams are a cause for perpetual celebration. Scores go up almost every year in virtually every grade level and subject tested. On the Maryland School Assessment this year, scores rose in all 24 school systems.

But on another test, the only one given by the federal government to public school students nationwide, scores tell a different story. According to the National Assessment of Educational Progress, Maryland students have improved their proficiency since 2003 in just one area, fourth-grade math. Virginia scores are up, but not by much, and eighth-grade reading performance has stalled.

The anemic results from the nationwide test, released Wednesday, provide a sharp contrast to the dramatic gains reported by Maryland and Virginia on their statewide exams, required under the federal No Child Left Behind law. To critics of those efforts, it's further evidence -- along with comparatively flat SAT scores, graduation rates and other measures -- that public education is not improving in an era of high-stakes testing.

"SAT participation is not up in Virginia. More kids are not going to college in Virginia," said Mickey VanDerwerker, a parent and school board member in Bedford County, Va., who leads a group critical of the state's Standards of Learning exam. "But if you look just at the SOL results, if you make yourself a nice little line graph, it looks like achievement is just going through the roof."

The national test, which dates to 1969, is designed to measure how students are doing over the long term. It shows a pattern of slow, halting progress stretching back to the Nixon administration.

Maryland's statewide test dates only to 2003, Virginia's to 1998. Both have shown stunning gains in student achievement, driven in part by a relentless classroom focus on bringing up scores. The trend holds less true for D.C. schools, which elected to test students on the commercially published SAT-9 exam rather than create one; the District will introduce its own exam this spring.

The mostly flat results of the latest national test barely resemble the findings of most state exams, which, by their design, tend to yield dramatic gains for a few years, level off, then vanish, eclipsed by a newer, better test.

The upward trend in test scores "says that my kids are learning to take a test, and that does not necessarily mean they're getting a better education," said Sue Allison, a Lusby parent who leads a Maryland group opposed to high school exit exams and other aspects of high-stakes testing.

It's not a phenomenon limited to the Washington region. The Thomas B. Fordham Foundation, a champion of high-stakes tests, looked at eighth-grade reading scores on 29 state tests and found that two-thirds -- 19 states, including Virginia and Maryland -- reported gains in the past two years. None of those 19 states showed progress this year in eighth-grade reading proficiency on the national test.

"As we look at those numbers, we wonder whether or not the progress being reported at the state level is for real," said Michael Petrilli, vice president of the Fordham Foundation. "Are states subtly making their tests easier in order to make their scores look better?"

Education officials in Maryland and Virginia said it's natural for scores to rise more quickly on statewide tests than on the national assessment. Material tested in the Maryland School Assessment and the Standards of Learning exams is drilled into students. Preparing students for the state tests is a singular focus of teachers. Under No Child Left Behind, schools reap rewards if they do well on the tests, penalties if they do not.

By contrast, schools and teachers have little motivation to prepare for the national assessment. Results aren't reported for individual students or for most school systems. The national test is somewhat out of sync with local lesson plans. A fourth-grader in Maryland who takes the national test may face a question that has not yet been asked or answered in any Maryland school -- or one that was covered two years earlier.

The national assessment "doesn't test state standards, which is how we're judged on No Child Left Behind, which is how our systems are judged," said Bill Reinhard, spokesman for the Maryland State Department of Education. "That's basically where everything is going. Our teachers are told to put their focus on the Maryland School Assessment."

But opponents -- and even many advocates -- of high-stakes testing say the national assessment should corroborate, not refute, the trends of statewide tests.

"I always viewed the NAEP assessment as a very good second opinion," said Charles E. Smith, executive director of the National Assessment Governing Board, which oversees the test. "If you look over a three- or four-year period, and you see the trend lines going in different directions, that's a bad sign."

The lack of clear progress on the national test isn't all that concerns education leaders. The new scores also paint a far bleaker picture of overall student abilities than most statewide exams, including those in the Washington region.

Roughly one-third of Maryland and Virginia students rated "proficient" on most sections of the national assessment. But in the latest rounds of statewide testing, proficiency rates for most categories of Virginia students fell between 70 and 90 percent, and proficiency rates in Maryland surpassed 50 percent across the board. All of the exams share the three-tiered performance scale of basic, proficient and advanced.

Consensus is building among officials that "proficient" on the state tests more closely resembles "basic" on the national assessment. Virginia's SOL exams "have always been meant to be a floor and not a ceiling," said Charles Pyle, spokesman for the state Education Department.

But such characterizations invite criticism of the state tests: Are they too easy? Is the bar too low? Said Petrilli of Fordham: "What we are seeing looks like a race to the bottom, where states are defining proficiency down."


© 2005 The Washington Post Company
