When the nation’s main hospital accreditation group released a list of top-performing hospitals last year, Johns Hopkins got a nasty surprise: It wasn’t included.
Instead, Holy Cross Hospital in Silver Spring — a much smaller institution that lacks the global reputation of the Baltimore-based behemoth — was the only area hospital to make the cut.
Shocked and embarrassed, Hopkins executives and trustees vowed to go all-out to make the list this year. The chief executive presented progress reports at each board meeting. Staff members consulted with Holy Cross employees to get tips to improve Hopkins’s performance.
But on Wednesday, when the Joint Commission released this year’s report, Hopkins again failed to earn a spot. Holy Cross did, and two small local facilities were named for the first time: Reston Hospital Center and Civista Medical Center in La Plata.
“Of course, we’d like to perform well on these measures,” said Hopkins’s Peter Pronovost, senior vice president of quality and patient safety. But, he said, other factors also are important for assessing the overall quality of care, such as outcomes and patient satisfaction.
Hopkins’s frustration underscores the sometimes perplexing nature of hospital performance-rating systems. Such grades are proliferating, but they stress different data, making it hard for consumers to get an accurate view of a hospital’s overall performance.
For example, the commission’s report, which named 620 hospitals across the country as “top performers,” focused on key quality measures, including whether hospitals gave aspirin to heart attack patients or provided anti-stroke medications to stroke patients at discharge. As such, the report looked at processes used, rather than outcomes such as patient death or readmission rates.
The commission didn’t look at Hopkins’s claims to fame — cutting-edge research and the use of the latest medical technologies and treatments. Other well-known research institutions such as Massachusetts General Hospital and the Cleveland Clinic also failed to make the list, even though they, like Hopkins, regularly show up on other top-hospital lists, such as the one by U.S. News & World Report.
Mark R. Chassin, president of the commission, strongly defended the quality measures it used, saying evidence shows they improve outcomes for patients. Many of the gauges are used by the federal and state governments, as well as private insurers, in pay-for-performance programs.
“The report is not a ranking of hospitals; it is not based on unscientific data such as reputation,” Chassin said. But the specific actions that it measures add up to millions of opportunities “to provide the right care to patients at American hospitals,” he said.
The commission, which is based outside Chicago, issued the list for the first time last year. It uses a complex methodology to calculate the top performers. Generally, hospitals had to score at least 95 percent on each of a host of quality measures to earn a top-performer rating. The data were based on hospital records for 2011.
Pronovost said some Hopkins doctors take issue with the commission’s recommended protocol for giving pneumonia vaccine, a measure for which Hopkins failed to get a high enough score.
“Some of them don’t believe it’s good medicine to give the pneumonia vaccine if you have cancer,” he said.
The 620 top performers, which are not ranked in any particular order, represent about 18 percent of the country’s more than 3,300 accredited hospitals that report the relevant data. Many of the top performers were small community hospitals rather than big academic medical centers.
One reason, Chassin said, is that large hospitals tend to have more patients with more complicated conditions. For those facilities, he said, “It is more difficult to achieve this kind of consistent excellence.” Still, he said, those institutions also have more resources to devote to improving their performances.
Other Washington area hospitals that failed to make the top performers list either year included MedStar Georgetown University Medical Center, MedStar Washington Hospital Center, George Washington University Hospital, Suburban Hospital and Inova Fairfax Hospital.
Executives at those facilities said they had improved in many areas and would work harder to do even better. Some of the hospitals fell short on a single measure by less than one percentage point; others scored well below 90 percent on an important gauge.
Many local hospitals scored lowest on consistently providing appropriate pneumonia and flu vaccines.
Smaller hospitals can have their staff review patient charts in real time to make sure everything gets done, Pronovost said. “If I have 10 beds, it’s not that much work,” he said. But for a large hospital such as Hopkins, administrators have to decide whether it makes sense to have “six nurses do that or are there better ways to spend the time of six nurses.”
Still, he said, Hopkins would adopt some real-time review of charts for measures on which the hospital didn’t perform well.
Holy Cross’s position on the list has boosted the profile of the 448-bed facility, which is part of Trinity Health, a Catholic hospital system based in Michigan. Other hospitals, locally and elsewhere, have come looking for “our secret sauce,” said Yancy Phillips, the hospital’s vice president of quality and care management. Inova Fairfax, for example, is modeling some of its practices after Holy Cross, executives said.
Phillips credits three factors for the hospital’s success: intensive review of patients’ charts, an electronic records system and a leadership focus on quality. The hospital has four staff members whose main job is to scour charts, often virtually through the hospital’s electronic medical records system, to make sure all necessary steps have been taken.
The hospital also put in place a daily noon teleconference during which doctors, nurses and nurse managers go over the cases of patients whose treatments are being measured.