Christopher L. Eisgruber is president of Princeton University.

My university has now topped the U.S. News & World Report rankings for 11 years running. Given Princeton’s success, you might think I would be a fan of the list.

Not so. I am convinced that the rankings game is a bit of mishegoss — a slightly daft obsession that does harm when colleges, parents, or students take it too seriously.

Don’t get me wrong. I am proud of Princeton’s teaching, research and commitment to service. I like seeing our quality recognized.

Rankings, however, are a misleading way to assess colleges and universities. There are lots of great places to get an education. America’s colleges and universities work collaboratively to educate the wide variety of people seeking degrees. Different schools may suit different students.

For example, Princeton, Columbia, MIT and the University of California are spectacularly good universities, but they have distinct strengths, structures and missions. The idea of picking one as “best,” as though educational programs competed like athletic teams, is bizarre.

Yet if ranking colleges is a dubious enterprise, it is also a wildly successful one. The U.S. News rankings attract tremendous attention and a huge customer base. Their popularity has inspired many imitators.

None of that would matter if rankings counted only for alumni bragging rights. Applicants and their families, however, rely on the rankings and feel pressure to get into highly regarded institutions. As a result, many schools make intense efforts to move up in the rankings.

This competition produces damaging incentives. For example, some colleges avoid doing difficult but valuable things — such as admitting talented lower-income students who can thrive at college if given appropriate support — in favor of easier strategies more likely to add points in the U.S. News formula.

Still, students and families need comparative information to choose colleges. If rankings mislead, what is the alternative?

For generations, buyers have turned to Consumer Reports for advice about almost everything except college education. When Consumer Reports evaluates a product, it assesses multiple factors so that prospective buyers can make their own choice wisely.

Savvy college applicants likewise need information about some basic variables. Graduation rates are crucial. A college that does not graduate its students is like a car with a bad maintenance record. It costs money without getting you anywhere.

What applicants need is not the average graduation rate, but the rate for students with backgrounds like their own: for example, some places successfully graduate their wealthy students but do less well for lower-income students.

Applicants should also want to see some measure of post-graduation outcomes. The most frequently used yardstick is average salary soon after graduation, which has some value but obvious flaws — students may choose a first job for the fulfillment or the training it provides, rather than to maximize salary. I prefer alumni satisfaction 10 years post-graduation, though that information is harder to gather.

Here is a partial list of other factors that matter: net cost (that is, cost of tuition and fees minus financial aid — again, for students like the applicant); a high-quality faculty actively engaged in undergraduate instruction, including through the individualized supervision of independent work; and a learning culture composed of diverse students who study hard and educate one another.

Judged by these criteria, many schools — public and private, large and small — could be “Consumer Reports Best Buys.” Applicants should be thrilled to get into any of them; they should pick the one they find most appealing; and they should not waste time worrying about which is “the best.”

It would be great to have a Consumer Reports for colleges. Something like it already exists, thanks to the Education Department under the Obama administration. The department’s “College Scorecard” allows anyone to compare colleges on several dimensions, without the distraction of rankings.

Despite its many virtues, the College Scorecard has limitations. Its data-centric interface can make it more attractive to policy wonks than to students. Some of the categories are incomplete or misleading: The earnings data, for example, are drawn from a narrow subset of students and do not accurately reflect the long-term salaries for many fields of study.

James Kvaal, the newly confirmed undersecretary of education, was a leading architect of the original scorecard. I hope that he and the department will upgrade the project and heighten its visibility to students, families and school counselors.

I also hope that some national publication will have the courage to produce an annual, user-friendly Consumer Reports-style analysis of higher education institutions, even if it is not as beguiling as a football-style set of rankings.

In the meantime, those of us who understand the flaws in the rankings must call them out — even when, indeed especially when, we finish at the top.