Which Docs Measure Up?

By Harlan M. Krumholz
Sunday, May 31, 2009

A patient turned to me for advice recently after being diagnosed with early-stage prostate cancer. A hard-charging Vietnam veteran who exercised every day, he had prided himself on his excellent health, and the news scared him. He had undergone several rounds of biopsies in response to an abnormal blood test. Finally, he got the same unhappy diagnosis that about 200,000 men get each year.

After considering treatment options, he chose to have surgery to remove his prostate. I'm a cardiologist, not a urologist, but we happened to have met at the gym, so he asked me where he should have the surgery performed and whom I would pick to do it. I wasn't sure. It's a tricky surgery with many possible complications. The prostate is the size of a walnut and nestled in a difficult location at the base of the pelvis near some important nerves. Almost everyone survives the surgery, but afterward, patients may be incontinent or impotent, problems that get most men's attention.

This patient had nowhere to turn to figure out which doctors and hospitals had the best results and the lowest risk of these complications. His dilemma is the same one that virtually every patient -- and the entire health-care system -- is facing: How can you measure quality in an area in which your life may be at stake?

Need a new hip? A hernia repair? You're out of luck if you want to look at a doctor's track record or an institution's success rates. Results vary by surgeon and by hospital; you just have no way of knowing which one is best. And often, neither do they.

For most patients, the decision of where to seek care comes down to a recommendation based on hearsay. A good reputation plays a role, but unfortunately, studies show that a famous name is no guarantee of good results.

Even doctors don't know what to do. I broke my collarbone in a bicycle accident a few years ago and had no good way of selecting a surgeon. I picked someone based on advice from colleagues, but neither they nor I had any way of knowing what his past results for this operation -- or any operation -- had been.

Our health-care system has a grasp of the astronomical amounts spent on care, but we have little information about the overall results that we achieve. In most areas of medicine, we lack a trustworthy source of information to guide this most critical choice. We don't have a Consumer Reports for doctors and hospitals -- at least not yet. For-profit ranking systems, such as the 100-best-doctors-in-your-area features found in glossy magazines or online, often do not fully reveal their methods or submit their measures to independent peer review. Patients almost always have to make blind choices about where to receive elective care.

The paucity of information about medical performance not only makes it hard for patients to choose care; it also impairs our ability to improve it. If we in the medical profession could measure results, we could weed out bad practices and nurture the good ones -- and save more money and lives than we could with virtually any breakthrough procedure or medication we are likely to see in the near future.

A decade ago, in my field of cardiology, Medicare decided to measure how people who had heart attacks were treated. The effort revealed embarrassing gaps in treatment nationwide. Some solutions were as simple as making sure that patients took aspirin, which reduces the risk of dying within a month by about 20 percent. Everyone agreed about the value of the treatment, but we were neglecting to provide it -- and some places were doing a lot worse than others. The Medicare study led to the development of systems to ensure that the right patient gets the right medication at the right time.

More recently, another campaign, focusing on the timeliness of heart attack care, led to further improvements. Patients who need emergency angioplasty are being treated more quickly than ever, in some cases so quickly that the heart attack is interrupted before severe damage can be done.

Why don't we have more measurement, with results available for everyone to use? One reason is that we all -- patients and physicians alike -- have lacked the will. Patients tend to defer to doctors and are reluctant, when facing a medical crisis, to push for information about performance. Doctors are perfectly happy with this lack of scrutiny and prefer not to have their authority questioned. Systematic and continuous measurement of performance is not part of the culture of medicine.

Incentives are lacking, too. The few professional groups that have tried to measure performance, such as the American College of Cardiology and the Society of Thoracic Surgeons, have done so at great expense and peril. Hospitals have helped cover the steep costs of collecting data, but the question of how to use the data is controversial. The doctors' groups feel that the data should be used to improve quality and not to help patients find a better doctor. But they are being pressured to make their results public. The net result for now is that the data are shielded from public view. We need to reward physician groups that embrace both performance measurement and public accountability.

Good measurement is not easy, and bad measurement can lead to practices that hurt patients. A decade ago, hospitals began to implement programs to reduce the time patients spent there. The shorter stays helped the hospitals financially -- and the focus of these programs was on the bottom line. We are only now beginning to realize that those policies increased the risk that a patient would need to be rehospitalized within a few weeks. All we were measuring was the time in the hospital, not what happened to the patients afterward.

Some critics seize on such examples as a reason not to assess performance at all. These individuals seem trapped in the past, when we asked the public to assume that every doctor and every hospital provided outstanding care. We know far too much now about variation in performance to return to that flawed assumption. We still aspire to have everyone achieve a high level of performance, but we need measures to guide us.

In an era of highly sophisticated metrics in sports, medicine remains in the Dark Ages by comparison. Baseball, for example, has spawned an entire field of statistical analysis known as sabermetrics. If Billy Beane, the data-driven general manager of the Oakland A's, can use performance ratings to maximize his lineup's production, then surely a prostate-cancer patient should be able to use some stats to find the best surgeon he can.

Some members of Congress and the Obama administration want government to make better measurement and more transparent information a key part of health-care reform. This summer, Medicare will step up to the plate and report every hospital's 30-day death and readmission rates for patients admitted with heart attacks, heart failure and pneumonia. The point of making such information public is not to encourage patients to shop for a hospital during an emergency, but rather to stimulate a discussion about performance and provide an incentive for improved care.

So what happened to the patient with prostate cancer? He went to a doctor his internist recommended. He heard that the doctor used a fancy new robotic surgery device and assumed that this meant he was good. Six months later, he occasionally loses control of his bladder, and his sexual function is not what it was. He is left wondering whether he made the right choice. Meanwhile, his experience is not being tracked to help the next patient choose or to help the surgeon and the hospital improve.

harlan.krumholz@yale.edu

Harlan M. Krumholz, a cardiologist, is a professor of medicine at Yale University.

