
Answer Sheet
Posted at 06:20 PM ET, 09/13/2011

The problem with the U.S. News college rankings

Here’s the problem with the annual U.S. News & World Report college rankings: the rankings.

The magazine just published its 2012 lists, as it does every year. And every year officials at schools bend over backward to find ways to look better in the eyes of the rankers while millions of high school students scour the lists and take them at face value.

Somehow, many believe that by some twist of the data, Harvard and Princeton universities, which share the No. 1 slot among National Universities for 2012, are just slightly better than No. 3 Yale University, and that Northwestern University, ranked 12th, is really slightly better than No. 13 Johns Hopkins University.

It’s nonsense.

My colleague Dan DeVise wrote today on his Post blog, College Inc., about the newly published 2012 rankings. After the top three schools mentioned above, the National University rankings continue with No. 4 Columbia University and, tied for No. 5, the California Institute of Technology, the Massachusetts Institute of Technology, Stanford University, the University of Chicago and the University of Pennsylvania. That five-way tie leaves Duke University at No. 10.

None of this would be worth writing about if people didn’t take the rankings so seriously, but they do, so it’s a good idea to remind readers of this:

The largest factor in the rankings — worth 22.5 percent for National Universities — is the combined assessment of a school’s reputation by academics from rival institutions and by high school college admissions counselors. Peer assessment is worth 15 percent (it used to be 25 percent), plus 7.5 percent for the high school counselors.

The magazine Web site explains it this way:

The U.S. News ranking formula gives significant weight to the opinions of those in a position to judge a school's undergraduate academic excellence. The academic peer assessment survey allows top academics — presidents, provosts, and deans of admissions — to account for intangibles at peer institutions such as faculty dedication to teaching.

So presidents and provosts, who have plenty to do each day on their own campuses, are supposed to know enough to rate intangibles at peer institutions, such as faculty dedication to teaching? The leaders I have talked to over the years tell me it is hard enough to know what is going on at their own institutions, much less at their rivals’, and certainly not across the board or well enough to give a fair rating.

Bob Morse, the man who is in charge of assembling the rankings at U.S. News, said in an e-mail that the high school counselor reputation variable was added last year to give high school counselors a voice in the rankings. Morse didn’t say so, but it could be an acknowledgment that college presidents and other leaders aren’t enough to assess quality.

There are many people who say that the data used in the national university rankings are useful and do help prospective students assess the quality of a school.

The data, according to the methodology statement on the magazine’s Web site, include: retention of freshmen and of students overall, 20 percent; faculty resources, 20 percent; student selectivity, 15 percent; financial resources, 10 percent; graduation rate performance (for national universities and national liberal arts colleges only), 7.5 percent; and alumni giving rate, 5 percent.
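Taken together with the 22.5 percent reputation component, those weights sum to 100 percent, and a school’s overall score is just a weighted average of its subscores. A minimal sketch of that arithmetic, using the weights described above (the subscores for the example school are invented placeholders, not real data for any institution):

```python
# Weights from the article's description of the U.S. News methodology
# for National Universities. They sum to 100.
WEIGHTS = {
    "peer_assessment": 15.0,        # academics at rival institutions
    "counselor_assessment": 7.5,    # high school counselors
    "retention": 20.0,
    "faculty_resources": 20.0,
    "student_selectivity": 15.0,
    "financial_resources": 10.0,
    "graduation_rate_performance": 7.5,
    "alumni_giving": 5.0,
}

def composite_score(subscores):
    """Weighted average of 0-100 subscores under the weights above."""
    total_weight = sum(WEIGHTS.values())  # 100.0
    return sum(WEIGHTS[k] * subscores[k] for k in WEIGHTS) / total_weight

# Hypothetical school: strong reputation, middling resources.
example = {
    "peer_assessment": 90,
    "counselor_assessment": 85,
    "retention": 95,
    "faculty_resources": 70,
    "student_selectivity": 88,
    "financial_resources": 60,
    "graduation_rate_performance": 75,
    "alumni_giving": 40,
}
print(round(composite_score(example), 1))  # → 79.7
```

The sketch makes the column’s point concrete: a few points of difference in one heavily weighted subscore, such as peer assessment, can swing the composite enough to reorder schools whose overall quality is effectively indistinguishable.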

A survey released earlier this year revealed just how wary many college admissions counselors are of the rankings.

The survey, conducted by the National Association for College Admission Counseling, showed that most counselors surveyed don’t think the rankings accurately represent information about the schools.

And they take issue with the title of the rankings, “America’s Best Colleges,” saying that it raises the question of “best for whom,” when the truth is that different colleges are best for different students.

The survey also found that many counselors were concerned about some of the core measures of quality used by the magazine, including the peer assessment of academic quality and student selectivity.

Measures such as faculty resources and financial resources don’t rate especially high with counselors, either, as measures of quality. Faculty salaries, for example, account for 35 percent of the faculty resources measure. Though a higher salary does show that a school has more money to pay its faculty, it is no indication of the quality of a college professor.

There is, of course, a lot about a college experience that cannot be reduced to numbers, and that, too, is missing from the rankings.

Parents and students should think carefully about using this as a guide to college quality.

Follow The Answer Sheet every day by bookmarking http://www.washingtonpost.com/blogs/answer-sheet. And for admissions advice, college news and links to campus papers, please check out our Higher Education page. Bookmark it!


    © 2011 The Washington Post Company