These and similar revelations in the past year have come from Claremont McKenna College in California, Emory University in Atlanta and George Washington University in the District. In each case, the highly regarded schools acknowledged that they had submitted incorrect test scores or overstated the high school rankings of their incoming freshmen.
At a time of intense competition for high-achieving students, the episodes have renewed debate about the validity of the U.S. News rankings, which for three decades have served as a kind of bible for parents and students shopping for colleges.
Much of the information colleges present about themselves to U.S. News, other analysts and the federal government is not independently verified. That makes it impossible to know how many might have misreported data over the years as they angle for prestige to stand out in a crowded market.
“Rankings have become omnipresent in higher education, and they have enhanced the competition among institutions,” said Terry W. Hartle, senior vice president at the American Council on Education, which represents university presidents. “And in any highly competitive environment, there is always a temptation to cut corners.”
In some of the recent cases, college officials said an employee intentionally submitted inaccurate data. In others, it was unclear whether the mistake was intentional. GWU attributed its errors to a flaw in data-reporting systems that dated back a decade.
A survey of 576 college admissions officers conducted by Gallup in the summer for the online news outlet Inside Higher Ed found that 91 percent believe other colleges have falsely reported standardized test scores and other admissions data. A few said their own college had done so.
“There’s definitely a widespread feeling that this goes well beyond those that have been caught or come forward,” said Scott Jaschik, Inside Higher Ed’s editor.
U.S. News Editor Brian Kelly said the number of schools that have corrected their record is “a pretty small universe,” which he considers a sign that reporting problems are not pervasive. He said he would not be surprised if a few more cases emerged.
“If it was a stampede, I would be surprised,” Kelly said, “and that might cause us to rethink some things.”
Kelly acknowledged that a string of revelations from five prominent colleges is unusual. But he said the disclosures should strengthen consumer confidence in U.S. News rankings because they show that schools take the data seriously.
“These are institutions that teach ethics,” Kelly said. “If they can’t keep their own house in order, they’ve got a problem. It’s their problem, not my problem.”
The U.S. News rankings, a major force in higher education since the 1980s, sort colleges and universities into various lists, such as best in the nation, best in a region and best value.
The rankings are based on complex formulas that U.S. News invented and that it tweaks from time to time. Inputs include surveys of college leaders and college counselors, as well as statistics on graduation rates, class size, faculty salaries, alumni giving and admissions test scores. U.S. News says these formulas help consumers get information they need and want before they choose a school.
Critics contend the rankings are highly subjective and give students a misleading sense that the college experience can be boiled down to numbers. Some colleges refuse to participate in U.S. News surveys — and receive rankings anyway.
U.S. News has said that 92 percent of 1,391 ranked colleges and universities returned its surveys last year. Some colleges have declined to participate because they say the rankings are counterproductive.
“We just don’t want to play their game and fill out their forms,” said Christopher Nelson, president of St. John’s College in Annapolis. He said he couldn’t care less that his school is No. 133 on a U.S. News national liberal arts list. “I’d rather be in a place that’s unranked.”
Claremont McKenna and Emory, both ranked highly on U.S. News lists, revealed last year that they overstated admission test scores and other data related to incoming students that made them appear more selective. GWU said it had overstated the high school class rank of its students, leading U.S. News in November to strip the D.C. school of its ranking, which had been 51st among national universities.
Tulane’s discovery of missteps came in December. The dean of the university’s Freeman School of Business, relatively new to his position, alerted top university officials about possible misreporting of data. They hired the law firm Jones Day to investigate.
That review found that the statistical profile of full-time students in the master of business administration program had been wrongly reported from 2007 to 2011. Average Graduate Management Admission Test scores had been “falsely increased” by an average of 35 points on an exam that has a maximum score of 800, the review concluded, and the number of completed applications had been exaggerated to make the school look more selective than it was.
Tulane said the evidence implicated a former business school employee whom it would not identify. “This was not inadvertent,” Tulane Provost Michael A. Bernstein said. “It was a goal-oriented manipulation.”
The business school, which U.S. News had ranked 43rd in the nation for full-time MBA programs, is now unranked. Bernstein said such incidents are especially painful for all of the university officials, students and faculty who are committed to academic honesty.
“When you discover an error, if you discover a lack of integrity, you’ve got to put a bright light on it and clean it up,” Bernstein said.
At Bucknell, a new vice president for enrollment management sounded an alarm recently about admissions statistics. That led to an internal review and public disclosure Jan. 25 that SAT scores had been overstated by an average of 16 points — on a 1600-point scale for math and critical reading — from 2006 through 2012.
When he learned of the problem, Bucknell President John C. Bravman said, “it was like getting emotionally punched in the gut.”
Like Tulane, Bucknell said the episode traced to the actions of a former employee. Bravman said he spoke by telephone with that person recently but ended the conversation in frustration because he didn’t get to the bottom of the problem.
“I’m not satisfied that I know what happened, and I’m not going to make something up for you,” Bravman said in an interview, adding: “I’m an engineer. To me, accuracy and precision are both important. On that quantitative measure, this failed.”
U.S. News higher education analyst Robert Morse, who oversees the college rankings, wrote on a blog Monday that Bucknell’s misstep was not significant enough to affect its position, 32nd among national liberal arts colleges.
Bravman, like his peers at the other four schools, said new controls will be instituted to ensure the problem does not recur.
All five reported cases in the past year occurred at private institutions. William E. Kirwan, chancellor of the University System of Maryland, said that does not mean public universities are immune from data pitfalls. But Kirwan said state oversight and public records laws provide “some safeguard” for the reliability of the statistics. Public universities, he said, “are used to a level of scrutiny and accountability.”
Raymond A. Brown, dean of admission at the private Texas Christian University in Fort Worth, said there is a simple way for schools to ensure data accuracy: Hire an auditor. Brown said he has done that with TCU admission statistics for the past decade, a practice that appears to be rare among colleges.
“It makes me able to sleep a little easier at night,” Brown said. “Kids and families need to be able to rely on the information we’re putting out. It is truly that simple. If [the information] we’re producing is garbage, that’s not helping anybody.”