Health care, including mental health care, is in serious trouble in this country, and the trouble is not simply the incursions of managed care. In fact, managed care, and behind it cost containment, is the symptom of a deeper problem - medical technology has outpaced our capacity to apply it safely and effectively to the relief of human suffering. We are now in the position of being able to do more things for sick people than society can afford. That stark reality often gets lost in the anguish over utilization review, loss of physician autonomy, and threats to patient privacy and confidentiality that are the stuff of managed care.
One proposal to bring costs in line is to ration care. Some feel that managed care does that in effect. But another possibility is that technologic innovation has outrun innovation in the way in which we deliver care, and that it is time to change the way we practice. A sailing captain from the 1830s placed on the bridge of a modern ocean liner would be completely helpless. The ship would be vastly larger and faster. There would be no sails, the crew would be smaller, and the bridge would be filled with computer terminals and switches; there would be no steering wheel and no binnacle with a compass. He would not understand that the ship was guided by radar, the global positioning system, and the autopilot on a course that had been programmed into the computer months before.
A physician from the 1830s placed with a patient in a modern physician's office would feel right at home, at least for a while. He would sit at his desk, take a chief complaint and a history, perform a physical examination, and try out a series of diagnoses that would likely be in the correct organ system. He would begin to feel lost only when it came to diagnostic tests, or the finer points of diagnosis and treatment, at which time scientific medicine would kick in and the resurrected physician would be as much at sea as his seagoing contemporary.
The evidence that clinical practice has not kept up with technologic innovations comes from three areas - practice variation, patient safety, and quality of care.
Dr. John E. Wennberg of Dartmouth Medical School, for example, has demonstrated wide geographic variation in the care of women with breast cancer. The decision to use lumpectomy versus mastectomy seems to be based more on regional patterns of practice than on either best practices grounded in research or the personal preference of patients for one or the other form of treatment.1
Another line of research - studies of patient safety and medical error - raises somewhat different concerns about traditional modes of practice. Lucian Leape, extrapolating from reviews of hospital charts, estimates that 180,000 patients die each year partly as a result of medical errors.2 That is the equivalent of three jumbo jet crashes every 2 days! Many physicians are skeptical that things are that bad. But even if Leape is exaggerating threefold, that is still the equivalent of one jumbo jet crash every 2 days.
The medical errors Leape discusses include missed diagnoses, failure to treat promptly, drug overdoses, and giving the wrong drug. Although errors occur in as few as 1% of the many events that happen to a patient in the course of an illness, this is a much higher error rate than is tolerated in many other industries that apply technology to human safety and welfare. The problem is twofold: first, our certainty that information from other industries does not apply to us (a response typical of threatened industries); and, second, our reliance on a system of care that assumes that physicians can practice safely and at the highest standard if only they are sufficiently intelligent, well trained, and conscientious. As a result, we have not made use of a considerable body of findings from human factors research that demonstrate that human error is inevitable, but can be minimized or prevented by creating systems around individuals that make it much more difficult or even impossible for them to make mistakes.
Finally, there is deep concern in some quarters about the quality of care that is actually being delivered to patients. A recent consensus report of the Institute of Medicine of the National Academy of Sciences titled "The Urgent Need to Improve Health Care Quality" reviewed many of the studies of quality of care.3 The report identified three categories of quality problems. The first was underuse: failure to immunize 100% of children, or prenatal care begun too late to prevent the complications of pregnancy, would be examples. (Contrary to expectations, a number of the studies cited by the report found that underuse was more common in fee-for-service than in managed care plans.) Misuse, the second category, involves injury from the preventable complications of treatment and is related to medical error. The third category, overuse, was also found to be common.
The report comes down hard on the inadequacies of the system of care: "The burden of harm conveyed by the collective impact of all of our health care quality problems is staggering." It goes on to say:
Meeting this challenge demands a readiness to think in radically new ways about how to deliver health care services and how to assess and improve their quality. Our present efforts resemble a team of engineers trying to break the sound barrier with a Model T Ford. We need a new vehicle or, perhaps, many new vehicles. The only unacceptable alternative is not to change.
The articles that make up this issue of Psychiatric Annals are a first step in addressing these problems in the practice of psychiatry. They describe computerized algorithms that represent best practices in the care of certain patients with complex clinical conditions. Their availability in psychiatrists' offices should reduce practice variation, make it possible for busy clinicians to keep up with developments in the field, and reduce errors based on lack of knowledge of the right thing to do. As such, they are a forecast of the future of clinical practice - higher quality, more up-to-date, and safer.
But note the care with which the authors couch the use of these algorithms. Deferring as always to the tenacity with which clinicians hold to professional autonomy, they believe that practitioners must use their own clinical judgment in treating and addressing the needs of each individual patient, taking into account that patient's unique clinical situation. Would they recommend that the same latitude be given to airline pilots to accept or ignore the rules of flight set by the Federal Aviation Administration and enforced by the commercial airlines? After all, each day is different, the weather is different, and each aircraft is different. What a paradox that we insist that commercial aviators fly by the checklist, whereas we allow professionals in an equally hazardous industry - health care - to be so distantly accountable for the quality and safety of their activities. That commercial aviation is vastly safer than modern medical and psychiatric care is clearly no accident - in both senses of the term!
1. Wennberg JE. The Dartmouth Atlas of Health Care in the United States. Chicago: American Hospital Association; 1996:128-129.
2. Leape LL. Error in medicine. JAMA. 1994;272:1851-1857.
3. Chassin MR, Galvin RW. The urgent need to improve health care quality. JAMA. 1998;280:1000-1005.