Medical school deans were recently asked by a physicians' magazine, "Private Practice," to pick the 10 best and the 10 worst of the nation's 126 medical schools. Whereupon, with the swiftness of a healthy knee jerk, the head of the trade association of all these institutions advised against answering the inquiry.

"I hope the deans would not respond to this survey," wrote John A. D. Cooper, president of the Association of American Medical Colleges. Why not? The reason, he explained in his weekly report to members, is that "there is no way, because of the diversity of our medical schools, with each making its own contribution in educating physicians, conducting biomedical research and improving health care, to intelligently determine which is the best and which is the worst in the country."

Okay. Let's acknowledge the difficulties of applying Guide Michelin principles to medical education. But it should also be recognized that the recommended rejection of the inquiry is part of organized medicine's united front against any kind of public rating system. What the medical guild regularly advises the public is that the profession's own quality-control symbols -- diplomas, specialty certification and hospital privileges -- tell you all you need to know, that those framed documents in the doctor's office or in the hospital lobby are evidence of rigorous standards properly met.

The trouble with that rosy view of medical quality is that it addresses only one part of the system, because, while it serves to illuminate the peaks of medicine, it tells nothing about the pits. Is the public to believe that all board-certified surgeons perform with equal skill? That there aren't widely differing recovery rates -- including the telltale matter of post-operative infection -- among hospitals in a given community? Or that doctors aren't as vulnerable to alcoholism, drug abuse and other wear and tear as the rest of us?

On this point, survey after survey finds a relatively high incidence of emotional difficulties among physicians; yet it is extremely rare for this supposedly self-policing profession to crack down on one of its own -- before he or she does serious damage to a trusting public. The common explanation that you hear among doctors is that they're reluctant to deprive a colleague of a livelihood. And there's always the fear of a legal counterattack when one professional challenges the performance of another.

The net effect of this medical solidarity is that, for public consumption, all physicians and medical facilities are almost invariably depicted not only as equal, but also as excellent. Now and then, of course, a hospital gets so raunchy that not even the brotherhood of medicine can tolerate it, and formal accreditation is lifted. In most cases, though, that doesn't stop the hospital from staying in business.

In opposition to leveling with the public, doctors argue that it takes a doctor to judge a doctor; that, for example, the best surgeons often get the sickest patients and that public ratings of medical outcomes in such circumstances would be misleading. Apart from the radical fringe of medicine, the biggest concession to public information is on behalf of detailed community directories that list objective items such as hours, fees, training, hospital affiliations, etc.

The shortcoming of that approach is that it doesn't reveal whether a doctor has performed two or 200 appendectomies, whether he's ever been reprimanded by his local medical society or whether he's ever flunked a postgraduate training course.

That's the sort of stuff that people need to know and that organized medicine refuses to reveal -- at least on a publicly mandated basis. The federal government's little-publicized but potentially powerful network of Professional Standards Review Organizations -- run by doctors to oversee federally reimbursed doctors -- has collected voluminous information about the performance of individual physicians. So far, however, it's all been tightly held, which is what most doctors want.

Which brings us back to identifying the 10 best and 10 worst medical schools. It's doubtful that the nation's medical deans would differ markedly on listing the top and bottom -- any more than you'd find widely differing quality listings among professionals concerned with cuisine, baseball, ballet or car repair. And it would probably do the bottom of the heap -- and its future patients -- a lot of good to get out from behind medicine's protective cloak.

I don't doubt that doctor knows best. What's lacking is his willingness to tell the rest of us.