The Washington Post

Opinion

Med school leaders: Why we’re not participating in the U.S. News rankings anymore

Two medical school leaders had been frustrated for years with U.S. News & World Report’s annual ranking of graduate schools.

This year, they decided to act on it.


Art Kellermann is dean of the Hébert School of Medicine at the Uniformed Services University of the Health Sciences in Bethesda, Md., the leadership academy for America’s military health system and the U.S. Public Health Service. Before joining USU in 2013, he was at Rand Corp. and Emory University, where he was founding chair of the Department of Emergency Medicine and established the Emory Center for Injury Control. He is a member of the National Academy of Medicine.


Charles Rice is president of the Uniformed Services University of the Health Sciences, and advises the assistant secretary of defense for health affairs and the four surgeons general on issues related to graduate health-professions education and medical research. A trauma surgeon and former captain in the Navy Medical Corps, he chaired the board of the Accreditation Council on Graduate Medical Education from 2002 to 2004.

Their essay, which first ran in Health Affairs, explains why they took this step. — Susan Svrluga

By Arthur Kellermann and Charles Rice

Every spring, the snows recede, birds migrate north, and U.S. News & World Report releases its annual “Best Graduate Schools” rankings. The issue is a predictable hit with prospective graduate students and anxious parents who want to make sure their child gets into the “right” school.
Universities that do well amplify the buzz by boasting of their ranking in ads, articles, and campus banners. The hoopla ensures that the issue is an annual moneymaker for the magazine.
Much of the data U.S. News uses to generate its rankings is provided by the schools themselves. A few months ago, when we received the magazine’s annual request for information, we decided that the Hébert School of Medicine at the Uniformed Services University (USU) would no longer play along.
Given the powerful hold U.S. News’ rankings have on public opinion and the views of applicants to colleges and graduate schools, some might question why we took this step. Our answer is simple. After scrutinizing what is known about the process, we concluded that continued participation is a disservice to medical school applicants.
The flaws in U.S. News’ approach are well known in academic medicine.
Fifteen years ago, two academic researchers conducted a comprehensive analysis of U.S. News’ approach. They concluded that “the annual U.S. News and World Report rankings of American medical schools are ill-conceived; are unscientific; are conducted poorly; ignore medical school accreditation; judge medical school quality from a narrow, elitist perspective; and do not consider social and professional outcomes in program quality calculations. The medical school rankings have no practical value and fail to meet standards of journalistic ethics.”
Rather than address the criticism by improving its methodology, U.S. News ignored the study.
The public ignored it, too.
Five years ago, a panel of medical school deans echoed these concerns at a conference U.S. News held to recognize the 20th anniversary of its medical school rankings. Robert Alpern, dean of the Yale University School of Medicine, remarked:
“I think what’s frustrating everybody … is that there’s nothing really in [U.S. News’] formula that is really evaluating the quality of medical education. That would be so much more useful to the applicants, to the students. And it would incentivize us to do a better job in education.”
Many other medical school deans have told us that even though they share Alpern’s concerns, they feel compelled to participate.
Schools have a perverse incentive to boost their rank at the expense of applicants and the public.
Based on the methodology used by U.S. News, a medical school that wants to boost its rank should heavily favor applicants with super-high MCAT scores and grade-point averages and ignore important attributes such as character, grit, and life experiences that predict that a student will become a wonderful doctor.
A school might also encourage applications from large numbers of people with little or no chance of acceptance simply to boost its “selectivity” score.
A school’s rank is also heavily influenced by a vague sense of “reputation.” Forty percent of each school’s score is derived from two annual surveys: one sent to medical school deans and top administrators, the other to residency program directors. Both groups are asked to rate the quality of every medical school from one (marginal) to five (outstanding).
Because none of us can fairly score all of our peer institutions, more than two-thirds of medical school deans and an even higher percentage of residency program directors toss the survey in the trash.
The body that accredits U.S. medical schools, the Liaison Committee on Medical Education, or LCME, has stringent standards. Our school was recently reaccredited, a process that required submitting more than 270 pages of detailed data about our program, a comprehensive self-study, an independent, student-led analysis of our students’ medical school experience, and a three-day site visit to our campus by a team of professional educators.
If a medical school is LCME-accredited, it’s more than good; it’s excellent. Yet the U.S. News rankings ignore this accreditation.
They also do not consider important qualities that distinguish medical schools from each other. All U.S. medical schools are required to meet the same accreditation standards, but we differ in terms of educational philosophy and emphasis.
Some schools aspire to produce biomedical researchers, while others turn out practitioners. Some graduate disproportionate numbers of specialists and sub-specialists, while others focus on producing primary care physicians.
USU illustrates the limitations of U.S. News’ “one-size-fits-all” approach. Because we serve as the leadership academy for military health in the United States, our curriculum is unique. Not only do we provide the same high-quality education that civilian medical schools offer; our students also receive 700-plus hours of additional instruction in military-relevant topics such as combat casualty care, tropical medicine, global health, ethics and officership.
Medical schools also vary with respect to tuition, fees and the availability of financial aid. Cost matters.
In 2014, the median educational debt of an American medical school graduate exceeded $180,000.
Remarkably, although the Uniformed Services University is the only medical school in America that charges no tuition or fees, we didn’t appear in U.S. News’ 2014 list of the 10 least expensive public medical schools, or the magazine’s latest ranking of public medical schools that offer the most financial aid.
There’s a better way to compare medical schools. Prospective applicants should consult the Medical School Admission Requirements® (MSAR®) database, a service supported by the nonprofit Association of American Medical Colleges (AAMC).
MSAR’s online tool not only provides detailed information about every accredited U.S. and Canadian medical school and BS/MD program; it lets users search, sort, and compare schools based on their personal priorities. Full access to the database requires a modest $25 fee, but it’s worth the expense.
Despite these shortcomings, U.S. News is unapologetic. At its 2011 conference held to commemorate the 20th anniversary of the medical school rankings, U.S. News’ editor declared, “The fact of the rankings is … they are here to stay. This is a consumer-driven need.”
We agree that consumers need information, but the information should be meaningful.
A blog post recently published on the magazine’s website noted that “Medical School rankings are one indicator of an institution’s perceived quality.”
The key word in this sentence is “perceived.”
In our view, a decision as important as choosing where to apply to medical school should be based on more than perceptions — it should be based on reality.
Moreover, the information each applicant receives should be objective, reliable, and relevant to his or her personal needs and aspirations.
That’s why the Hébert School of Medicine at the Uniformed Services University will no longer participate in U.S. News’ annual survey.
We hope that more medical schools will follow.
Authors’ Note
Our views are our own and do not necessarily reflect those of the University, the U.S. Department of Defense, or the U.S. Government.


Here is a response from Robert Morse, chief data strategist for U.S. News:

We wish that Dr. Kellermann and Dr. Rice had reached out to us directly about their concerns. It is our policy to meet with medical school deans and faculty who have suggestions about improving the Best Medical Schools rankings for prospective students. We have conducted many such meetings. We value their input and take their feedback seriously, but it is important that we correct two facts. First, per the recommendation of a group of medical school deans, we changed the way we account for undergraduate GPAs and MCAT scores so that medical schools don’t have any incentive to favor applicants with super-high MCAT scores and GPAs. This change has been in place for the past three editions of the Best Graduate Schools rankings. Second, as noted in our methodology, accreditation is a key component of our rankings. All of the 140 medical schools in the Best Medical Schools rankings are fully accredited in 2015 by the Liaison Committee on Medical Education.