Hospitals Check Their Charts

By Steven Pearlstein
Friday, April 20, 2007

To find the best hospital in the Washington region, you may have to drive a bit.

You could go north -- not to the famed Johns Hopkins Hospital in Baltimore but to the Baltimore Washington Medical Center in Glen Burnie, Md.

Or you could drive west, over the Blue Ridge to the Shenandoah Valley, where Winchester Medical Center is located.

Those are the only two hospitals in the region that made it onto the list of top 100 hospitals compiled by Solucient, the health information subsidiary of Thomson.

On the other hand, if you consult the Web site of a company called HealthGrades, you might head for two of Inova's hospitals, Alexandria and Fairfax, which boast five-star ratings in three clinical areas.

If you prefer to rely solely on government data, you might go for Shady Grove Adventist Hospital in Rockville or Winchester Medical, or two of the District's university teaching hospitals, Howard and George Washington.

Welcome to the baffling new world of hospital scorecards.

The good news is that the number of organizations trying to evaluate hospitals is growing fast, along with the breadth and sophistication of their analysis. It's all part of the movement toward consumer-driven health care, and much of the information is available on the Web, where you can sort it by region, type of hospital or the particular ailment or procedure you're interested in.

The problem, however, is that hospitals that look great according to one data set often come up short on other scorecards. Over time, the wide variations should decline as the rating industry develops better analytical tools. But for the moment, it can all be bewildering.

The charts accompanying this column give just a quick sense of the kind of data you'll find. They are not, in any sense, a definitive survey. I simply picked a dozen hospitals that I hoped would be representative.

The first chart calculates the average score for each hospital on 19 of the quality measures the Department of Health and Human Services has developed so far. These range from the percentage of heart attack patients who receive an important type of drug on arrival to the percentage of surgery patients who receive preventive antibiotics one hour before incision. The logic is that these procedures are known to correlate with positive medical outcomes and should be part of competent treatment.
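To make the arithmetic concrete, here is a minimal sketch in Python of that kind of composite score, assuming a hypothetical handful of compliance rates; the measure names and percentages are illustrative, not figures from the HHS data.

# Illustrative only: the measures and percentages below are hypothetical,
# not actual figures from hospitalcompare.hhs.gov.
measures = {
    "heart_attack_drug_on_arrival": 0.94,        # share of eligible patients treated
    "antibiotic_before_surgical_incision": 0.88,
    "pneumonia_antibiotic_timing": 0.81,
}

# A simple unweighted mean, as described above: every measure counts equally,
# whether it tracks a life-saving drug or discharge counseling.
average_score = sum(measures.values()) / len(measures)
print(f"Composite quality score: {average_score:.1%}")   # prints 87.7%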

Taking averages like this is, at best, a rough measure of hospital quality. It focuses too heavily on just a few ailments for which there are data and ignores entire departments and specialties. Moreover, this approach gives equal weight to whether a hospital gives a timely dose of antibiotics to a pneumonia patient and whether it gives heart attack patients anti-smoking counseling upon discharge. There is suspicion that some hospitals are altering their medical records to raise scores and make treatment look better than it is.

The government's data set on every hospital is free and easy to use. It's at http://www.hospitalcompare.hhs.gov.

Another viewpoint comes from HealthGrades, a private research firm, which like the government starts with the coded patient records reported to public and private insurers. But unlike the government data, HealthGrades focuses on medical outcomes -- how things turned out for the patients -- rather than what was done for them. Then it adjusts these results for the severity of the disease at the time of admission and the age and known health status of the patient. Hospitals earn a five-star rating from HealthGrades if the risk-adjusted outcomes in any area turn out better than what would have been expected, based on national norms. The accompanying chart lists which departments at the selected hospitals outperform the national averages.
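Roughly speaking, that comparison works like an observed-versus-expected calculation. Here is a minimal sketch under that assumption; the patient risk figures are hypothetical, and HealthGrades' actual risk models are far more elaborate.

# Illustrative only: hypothetical predicted complication risks for five patients,
# based on factors such as age, severity at admission and known health status.
expected_risk = [0.10, 0.30, 0.25, 0.40, 0.45]

observed_complications = 1                    # what actually happened to these patients
expected_complications = sum(expected_risk)   # the risk model predicts 1.5 complications

# A ratio below 1.0 means outcomes were better than national norms would predict --
# the general idea behind awarding a top rating in that clinical area.
observed_to_expected = observed_complications / expected_complications
print(f"Observed-to-expected ratio: {observed_to_expected:.2f}")   # prints 0.67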

HealthGrades also computes a "patient safety" rating to reflect how well each hospital does in preventing avoidable problems following surgeries and other procedures. There are 13 safety indicators -- from bedsores to picking up an infection. These data, too, are adjusted for the mix of patients and severity of cases, and compared to national norms.

One conclusion you could draw from this chart is that if your problem is a relatively common one that doesn't require the sophisticated specialists of a major teaching hospital, you might be better off at a community hospital, where safety records tend to be better. Academic hospitals, after all, have a complex mission that requires them to make tradeoffs among patient care, medical research and training. So it's probably not surprising that patient care may not be as good as it is at a hospital where that is the only priority.

As you might expect, teaching hospitals complain that HealthGrades' "risk adjustment" process doesn't fully account for the difficulty of their cases. And others criticize HealthGrades for charging hefty fees for full reports on each hospital, and charging hospitals for the right to publicize their "five star" designation. But for consumers looking for valuable rankings, there's still plenty of free information at http://www.healthgrades.com.

Another company, Solucient, compiles detailed reports meant primarily for hospital administrators and board members, not consumers. Its rankings are based not only on risk-adjusted medical outcomes, but also on financial performance measures such as cost, profitability, cash-to-debt ratio and growth in patient volume -- which turn out to correlate surprisingly well with medical performance. The detailed reports are sold only to the hospitals. The only information available to the public at Solucient's Web site ( http://www.100tophospitals.com) is the listing of the year's top 100 hospitals -- 20 in each of five categories.

Several years ago, after receiving its somewhat disappointing Solucient report, Winchester Medical Center set a goal of making the Top 100 list by 2008. It hired a consultant, altered its executive compensation to put a bigger emphasis on quality and organized teams in every department to implement small changes in procedures that translate into big improvements in its quality score.

When, for example, nurses took extra time to gather fuller medical histories, body hair was clipped rather than shaved before surgery and the timing of antibiotic doses was changed, surgical infection rates fell by 75 percent.

And by creating special emergency teams to focus on heart attack patients as soon as they were picked up by an ambulance, Winchester reduced the time it took to get a patient to a catheterization lab to 40 minutes from an average of 120.

The only way American health care is going to get better and more affordable is with this kind of single-minded focus on protocols known to work. However imperfect they may be, hospital scorecards are crucial to making that happen, creating the competitive "race to the top" that health reformers have dreamed of for decades.

Steven Pearlstein can be reached at pearlsteins@washpost.com.


© 2007 The Washington Post Company
