Answer Sheet
Posted at 03:00 AM ET, 01/09/2012

Putting New York’s testing program on trial

This was written by Fred Smith, a retired New York City Board of Education senior analyst who worked for the city public school system in test research and development. In this post he writes about New York state’s standardized testing program for students. Though his comments are specific to New York, the same types of problems are prevalent in other states as well.

By Fred Smith

Regents Chancellor Merryl Tisch has been promoting New York’s new and (we’ve been assured) improved student standardized testing program, part of her don’t-look-back agenda. What she, the previous test vendor and State Education Department hope to avoid is a thorough investigation of the old unreformed program. We must demand one.

The 2011 National Assessment of Educational Progress results were released late last year. Every two years, NAEP measures reading and math proficiency in grades 4 and 8 and reports on the trends. New York was the only state in the country to lose significant ground in 4th grade math.

NAEP is considered “the gold standard” in testing. Its exams consist of multiple-choice and open-ended items — the bricks each test is made of. Open-ended questions place a greater cognitive demand on students, asking them to interpret reading passages or solve math problems rather than select or guess at the right answer.

In well-constructed tests, students are expected to get a higher percentage of correct answers on the simpler, machine-scored multiple-choice questions. Moreover, performance on the two sets of items should run fairly parallel. Those who do well or poorly on one type of item generally do so on the other.

NAEP exams meet both expectations. Stability and orderliness characterize New York’s NAEP results over time.

The English Language Arts (ELA) and math tests sold to the New York State Education Department by CTB/McGraw-Hill (under a five-year $38 million contract) also consist of multiple-choice and open-ended questions. The only thing reliable about the state’s underlying results, however, has been their unpredictability from one year to the next.

Performance on CTB’s grade 4 multiple-choice math items has diverged from the teacher-scored open-ended items three out of five times since 2006. Inexplicably, there was a crossover in 2008 when the open-ended proved to be easier than the multiple-choice items.

This isn’t the starkest example of inconsistencies in the quirky statewide exams, taken by 1.2 million kids each year. More extreme contrasts appear in other grades, signaling inadequate test development.

On the 2011 ELA, for grades 5, 6 and 7, I found performance on the open-ended items rising as sharply as the multiple-choice results were falling — shooting up 10.7% in grade 5 to exceed the level to which the latter had sunk. And 2011 was supposedly the year of better testing.

The public isn’t told how students function on each set of items. Instead, the state education department announces composite test results — muddled information that conceals messy contradictions. This is willful deception.

Accounting for results on the open-ended questions should have been a given. They purportedly tapped higher-level thinking and writing ability, took more time to administer on separate days and cost millions to score. Why no breakdowns for our money?

Here’s why. For the state education department to have revealed the zigzag, below-the-surface lines would have exposed the glaring weaknesses in New York’s predominant testing program and served as an indictment of it based on:

1) Results so erratic they provide reasonable evidence CTB’s expensive test instruments were defective — deranged clocks with hands that turn in opposite directions;

2) Relatively easy open-ended items (still 26.2% easier than NAEP’s in 2011, the year CTB’s tests got harder), implicating procedures that have given local scorers, whose districts crave high scores, broad benefit-of-the-doubt latitude to judge student responses favorably; and

3) The self-serving way student achievement and growth have been reported. Where were the Regents for the last decade when education commissioners and state education department officials were issuing incomplete, misleading press releases and, along with CTB, covering for each other to defend their flawed exams?

Our children will never get back the formative classroom years stolen from them by a system that continues to value test preparation above learning. We can’t afford to leave the truth behind.
