The entire college admissions community took note last month when leaders of Claremont McKenna College, an elite California liberal arts school, acknowledged that its admissions dean had inflated the school’s SAT numbers for years.
Much has been made of how this modest bit of fraud might affect Claremont McKenna’s rankings. One ranker, Kiplinger, went so far as to pull Claremont from its “value” rankings. (That seems ironic: does a slightly lower SAT average make the school a lesser value?)
But dropping the school from a list is about the worst penalty a ranker can inflict on a college. What about Claremont McKenna’s accreditor? What about the Department of Education? Claremont McKenna must have reported inaccurate SAT numbers to them, too. Either of those agencies could conceivably impose real penalties, such as suspension of accreditation or of student aid, on a school that breaks the rules.
The misdeed attributed to now-former admission Dean Richard Vos was to inflate Claremont McKenna’s annual SAT figures by 10 to 20 points per test section. As President Pamela Gann told campus:
“For the fall 2010 class, which is the most recent year that has been reported generally to the public, the [college] reported a combined median of 1,410 when the actual should have been 1,400, and reported a 75th percentile score of 1,510 when the actual should have been 1,480.”
This apparently went on for several years. And that presumably means inflated SAT data were sent to the Education Department, which publishes consumer-oriented college data on its College Navigator site.
The federal government requires “completion of all IPEDS surveys, in a timely and accurate manner,” of all colleges that receive federal aid.
I have asked Education Department officials whether the college might risk any sanction for submitting inaccurate SAT data. I will append their response when I receive it.
Bad data must also have been included in Claremont McKenna’s periodic internal review documents, submitted to the Western Association of Schools and Colleges to support its case for academic accreditation. Accreditation is the regulatory lifeblood of a college.
Claremont McKenna’s most recent internal review states a school-wide SAT score of 1400 in reading and math for the freshman class of 2008. Based on the school’s recent admission, I assume that figure is padded.
The accreditor’s policy manual includes many references to integrity and accuracy, including this one:
“The institution is committed to honest and open communication with the Accrediting Commission, [and] to undertaking the accreditation review process with seriousness and candor. . .”
The notion that a school of Claremont McKenna’s stature might lose its accreditation over an SAT score is probably absurd. Yet, any falsification raises “an integrity issue,” said WASC President Ralph A. Wolff. Schools have lost their accreditation over fabricated data.
“While it was obviously a completely inappropriate action for them to take,” he said, “we’re also interested in their response.”
College leaders promptly admitted the deed and launched an independent investigation.
“If they had denied it, if they had tried to sweep it under the rug, that would have been far more serious,” Wolff said.
Wolff noted that the falsified data don’t mask any fundamental weakness at Claremont McKenna; they merely make good test scores look a bit better. And the incident appears to be “the work of one individual.”
There remains another fascinating question: Why?
Why would an admission dean risk the school’s integrity to gain 10 or 20 points on an SAT average? That’s the equivalent of answering one or two more questions correctly on the test. It’s the difference between, say, the 94th and the 95th percentile.
And, no, this tiny fraud would not suffice to push Claremont McKenna higher in the U.S. News rankings. It is not the reason the school finally cracked the U.S. News Top 10 this year among liberal arts schools.
Why, then? Was the dean chasing some internal goal, some arbitrary target set by an ambitious provost or meddlesome board of trustees? Was he driven by compulsion to show upward movement, however slight, from year to year? Was it unhealthy competition with Pomona College, Claremont McKenna’s rival atop the seven-school Claremont consortium?
I put this question to some fellow admission deans; surely they would have some idea. One dean answered on the record, another on condition of anonymity.
“My guess is there is/was some internal pressure he was facing in reporting ‘progress’ in academic quality to the president or the board,” said the unnamed dean, a veteran of the admissions industry.
The dean noted that his Claremont McKenna counterpart apparently reported test scores himself, rather than rely on a separate office of institutional research. Thus, falsifying scores would be comparatively easy: “he could just do a memo with scores sufficiently high to show this progress”.
Rankings were likely “not directly the issue,” he said. If manipulating rankings were the goal, an easier way would be to exclude certain students from the average, such as athletes and ‘legacies’, who typically get in with lower scores. Or go SAT-optional: students who aren’t required to submit SAT scores tend to do so only when the scores are high.
Plenty of schools already use those tactics, the dean said. “So, the competition among schools is uneven,” leading to fierce competition — and “to sad stories like Claremont McKenna.”
My second reply came from Henry Broaddus, dean of admission at the College of William and Mary. He spoke on the record.
“This is an especially unfortunate incident, because it arouses even more suspicion and cynicism on the part of the public,” he said. “Admissions of wrongdoing should not be the kind of admissions referred to in our job titles.”
Broaddus portrays the Claremont McKenna episode as “an unfortunate illustration of what can happen when the pressure to deliver results according to a limited set of variables overwhelms one’s commitment to the integrity of the process.
“Although the changes that were made may appear slight, bear in mind that success or failure in this line of work sometimes gets reduced to an up or down arrow. Nuances about degree get lost easily, and if you think I’m overstating that tendency, try to explain why Harvard’s pool being down a mere 2 percent this year commands headlines.”