While the results didn’t budge, there was something different this year in the way the exam results were announced. This time, the scores were “packaged” for public release with the results from other tests owned by the College Board — the Advanced Placement high school program and the Preliminary SAT/National Merit Scholarship Qualifying Test scores. (There wasn’t much to celebrate in those test results, either.)
So, why were the scores released together for the first time? According to this story by my Washington Post colleague Nick Anderson, the College Board said it was an effort to “paint a more complete picture of student progress during high school, showing missed opportunities for just-graduated students and areas where action can be taken to improve student outcomes for those still in high school.”
Makes sense. Makes you wonder why they didn’t do this before.
Anderson suggests a reason for that. The SAT once reigned supreme among college admissions exams, for many years leaving the ACT in the dust in terms of participation. But in 2012, the ACT for the first time was taken by more students than the SAT, and since then the gap has widened in ACT’s favor. There was virtually no difference in the number of students who took the SAT from 2013 to 2014, but the ACT had a nearly 3 percent increase in the number of test takers over the same period. (See chart below.)
If you had to announce something but the news wasn’t good, what would you do? Perhaps try to take the focus off it with other news and try to link the weaker product (the SAT, for example) with stronger ones (the PSAT and AP, for another example)?
But there’s more to wonder about, says Bob Schaeffer, public education director of FairTest, the National Center for Fair & Open Testing, a nonprofit organization dedicated to ending the misuse of standardized tests. Schaeffer suspects “another factor at work in this year’s College Board announcement was an attempt to downplay the continued large gaps in SAT scores by race, gender and income.” He wrote in an e-mail:
For the first time ever, the College Board did not immediately release the College-Bound Seniors Total Group Profile Reports for the national and individual states, which provide a data-rich source for many articles about score trends and distribution patterns. In fact, they did not post the data tables until pressured to do so by a number of journalists, particularly the Post’s Nick Anderson. Of course this detail was not made available until most stories about this year’s score release had been written and filed.
Schaeffer also noted that school reformers who have made high-stakes standardized tests the most important metric in evaluating students, teachers and schools through programs such as No Child Left Behind and Race to the Top have insisted that a focus on testing would narrow the achievement gap and boost the percentage of students ready for college. This approach has failed, he said:
SAT score trends show a total failure, according to their own measures. Scores have declined since 2006 for every group except Asians. Doubling down on unsuccessful policies with more high-stakes K-12 testing, as Common Core exam proponents propose, is an exercise in futility, not meaningful school improvement. Nor will revising the SAT, as currently planned, address the nation’s underlying educational issues.
Besides all of this, there is the issue of just what the SAT and the ACT really tell us about the students who take them. Schaeffer and others say not much, if anything. For one thing, average SAT scores are correlated with family income; scores increase with every $20,000 in additional family income, my colleagues Lyndsey Layton and Emma Brown reported here.
Despite protests by the College Board and ACT, the organization responsible for the ACT exam, both exams are coachable. And it is clear that high school grades are far more predictive of college success than these tests. The 2009 book, “Crossing the Finish Line: Completing College at America’s Public Universities,” co-authored by former Princeton President William Bowen, found that:
* High school grades are a far better incremental predictor of graduation rates than are standard SAT/ACT test scores.
* Overly heavy reliance on SAT/ACT scores in admitting students can have adverse effects on the diversity of the student bodies enrolled by universities.
* The strong predictive power of high school GPA holds even when we know little or nothing about the quality of the high school attended.
Despite this, too many colleges and universities put enormous weight on SAT and ACT scores when making admissions decisions. But even more troubling is what this article in the Wall Street Journal last March reported:
Proving the adage that all of life is like high school, plenty of employers still care about a job candidate’s SAT score. Consulting firms such as Bain & Co. and McKinsey & Co. and banks like Goldman Sachs Group Inc. ask new college recruits for their scores, while other companies request them even for senior sales and management hires, eliciting scores from job candidates in their 40s and 50s. … A low score doesn’t necessarily kill a person’s chances, hiring managers say; instead, they say they believe SATs and other college entrance exams like the ACT help when comparing candidates with differing backgrounds or figuring out whether someone has the raw brainpower required for the job.
Companies that do this must really think that a single score on a test someone took on a single day when they were a teenager (and hopefully when they weren’t sick, anxious or exhausted) can reveal raw brainpower decades later. It doesn’t sound like a lot of brainpower went into coming up with that kind of hiring tool.
So here we are again, with newly released SAT scores that don’t really tell us anything about the students who took them but that will have an important effect on many of those students’ lives anyway.
There’s something very wrong here.
FairTest National Center for Fair & Open Testing
UNIVERSITY ADMISSIONS TEST TAKERS 1986 – 2014
High School    ACT            SAT            ACT Test-takers as
Class of:      Test-takers    Test-takers    % of SAT Test-takers
1986 730,000 1,000,748 72.9%
1987 777,000 1,080,426 71.9%
1988 842,000 1,134,364 74.2%
1989 855,171 1,088,223 78.6%
1990 817,000 1,093,833 74.7%
1991 796,983 1,032,685 77.2%
1992 832,217 1,034,131 80.5%
1993 875,603 1,044,465 83.8%
1994 891,714 1,050,386 84.9%
1995 945,369 1,067,993 88.5%
1996 924,663 1,084,725 85.2%
1997 959,301 1,127,021 85.1%
1998 995,039 1,172,779 84.8%
1999 1,019,053 1,220,130 83.5%
2000 1,065,138 1,260,278 84.5%
2001 1,069,772 1,276,320 83.8%
2002 1,116,082 1,327,831 84.1%
2003 1,175,059 1,406,324 83.6%
2004 1,171,460 1,419,007 82.6%
2005 1,186,251 1,475,263 80.4%
2006 1,206,455 1,465,744 82.3%
2007 1,300,599 1,494,531 87.0%
2008 1,421,941 1,518,859 93.6%
2009 1,480,469 1,530,128 96.8%
2010 1,568,835 1,547,990 / 1,597,329* 101.3% / 98.2%*
2011 1,623,112 1,647,123* 98.5%
2012 1,666,017 1,664,479* 100.1%
2013 1,799,243! 1,660,047* 108.4%
2014 1,845,787! 1,672,395* 110.4%
* Once it saw that the number of ACT-takers was larger, based on a historically consistent measure, the College Board revised the number taking the SAT upward by including more exam administrations.
! Prior to 2013, ACT reports did not include test-takers who had been granted accommodations, such as extra time.
source: ACT and College Board annual “College-Bound Seniors” reports since 1991; U.S. Department of Education, National Center for Education Statistics data for earlier years. Calculations by FairTest.
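The figures cited in the article can be checked directly against the table. As a quick sketch (using only the 2013 and 2014 rows above), the ratio column is simply ACT test-takers divided by SAT test-takers, and the year-over-year changes match the prose: ACT participation grew by roughly 2.6 percent (“nearly 3 percent”) while SAT participation grew by well under 1 percent (“virtually no difference”):

```python
# Recompute the chart's ratio column and the year-over-year changes
# cited in the article, from the class-of-2013 and class-of-2014 rows.
act = {2013: 1_799_243, 2014: 1_845_787}
sat = {2013: 1_660_047, 2014: 1_672_395}

# ACT test-takers as a share of SAT test-takers (the table's last column).
ratio_2014 = act[2014] / sat[2014] * 100

# Growth from the class of 2013 to the class of 2014.
act_growth = (act[2014] - act[2013]) / act[2013] * 100
sat_growth = (sat[2014] - sat[2013]) / sat[2013] * 100

print(f"ACT/SAT ratio, 2014: {ratio_2014:.1f}%")   # 110.4%, as in the table
print(f"ACT growth 2013-2014: {act_growth:.1f}%")  # ~2.6% ("nearly 3 percent")
print(f"SAT growth 2013-2014: {sat_growth:.1f}%")  # ~0.7% ("virtually no difference")
```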