Why the new SAT scores are meaningless

September 26, 2013

The 2013 SAT scores are out and states around the country are either crowing or crying over the results. They shouldn’t expend the energy.

Virginia, for example, is thrilled that students there got the highest scores ever on the exam, and officials are crediting the improvement to school reform. Maryland is unhappy about seeing a drop in scores for the third straight year, but you won’t hear officials blaming their reform efforts.

The real question isn’t why the scores went up or down, but whether the results tell us anything valuable about a student’s achievement and abilities. They don’t.

Even David Coleman, president of the College Board, the organization that owns the SAT, has for some time now been bashing his own test and promising that it is going to be substantially rewritten.

The vocabulary portion is silly, he says, because the words are too esoteric for everyday use; the essay is problematic because it doesn’t value accuracy; the math section isn’t focused enough on concepts that matter; and in a recent interview with the New York Times, he said that the newly designed exam will be focused on “things that matter more so that the endless hours students put into practicing for the SAT will be work that’s worth doing.”

How else to interpret that than to say that kids are now wasting their time on the SAT? Unfortunately, the nine-page report that the College Board issued to explain the “meaning” of the 2013 test scores doesn’t mention any of this. It says:

The College Board’s 2013 SAT® Report on College & Career Readiness reveals that fewer than half of all SAT takers in the class of 2013 graduated from high school academically prepared for the rigors of college-level course work. This number has remained virtually unchanged during the last five years, underscoring a need to dramatically increase the number of students in K–12 who acquire the skills and knowledge that research demonstrates are critical to college readiness.

If you are someone who believed strongly in the modern school reform movement that places the highest emphasis on standardized test scores as the chief metric of student achievement, you should be concerned that the SAT scores haven’t moved nationally in five years. Neither, for that matter, have the scores on the other college entrance exam, the ACT, which has overtaken the SAT as the most popular admissions exam. The 2013 ACT scores were the lowest they have been in five years.

Bob Schaeffer, public education director of the National Center for Fair & Open Testing (FairTest), a nonprofit dedicated to ending the misuse of standardized test scores, properly noted that proponents of the No Child Left Behind law and the Obama administration’s Race to the Top school reform funding competition said that focusing “accountability measures” on standardized test scores would increase college readiness and narrow gaps in scores between different groups of students. He said:

The data show a total failure according to their own measures. Doubling down on unsuccessful policies with more high-stakes K-12 testing, as Common Core exam proponents propose, is an exercise in futility, not meaningful school improvement.

Those who distrust standardized test scores as a measure of anything important — student promotion from grade to grade, high school graduation, teacher evaluation, etc. — should be no less concerned for the simple reason that colleges and universities continue to make SAT and ACT scores an important factor in admissions. It is true that in the past decade, nearly 100 colleges and universities, including dozens of nationally competitive schools, have stopped insisting on SAT or ACT scores as part of a student’s application and have adopted “test-optional” policies, bringing the number of accredited, bachelor-degree granting institutions that do not require all or many applicants to submit test scores for admissions to more than 800. But most still do.

In a recent piece in U.S. News & World Report, the president of Ithaca College and a former executive director of the GRE testing program explains why he decided to opt out of using admissions test scores in making decisions about which students to admit:

Our first realization was that test scores add relatively little to our ability to predict the success of our students. Studies undertaken by the SAT’s sponsor, the College Board, generally indicate that the SAT adds only modestly to the prediction of student success after high school GPA is taken into account. Our internal study showed similar results, validating that the loss of test score information at the time of admission makes very little difference in our ability to identify how successful applicants will later become as college students.

In addition, we know that some potential students are deterred from applying to colleges that require a test score because they are not comfortable taking standardized tests. In fact, groundbreaking research by psychologist Claude Steele, now dean of the School of Education at Stanford University, has shown that underrepresented groups are more likely than others to be put off by test score requirements.

This takes us back to the real question: Why do so many schools continue to use test scores for high-stakes purposes when the scores have very little, if any, meaning?

Valerie Strauss covers education and runs The Answer Sheet blog.