Frightening headlines about the high incidence of harmful medical errors in hospitals have screamed from the front pages of America’s newspapers for twenty-five years, claiming that an estimated 100,000 patients die annually. In response, the medical profession has relied on the systems approach, which emphasizes repairing flawed delivery systems rather than punishing defective caregivers. In my five years as a hospital chief of staff, I learned that the blame-and-punish method is much harder to carry out than the systems method.
The systems approach is derived from industrial psychology and has found a home in industry, especially in airline safety and automobile manufacturing. If cars are coming off the assembly line with flaws, examining the manufacturing process is the most helpful way to eliminate them. Certain problems, such as reducing hospital-caused infections, are amenable to systems solutions, as we shall see below; others are not.
The profession is currently reeling from the results of two studies released in November 2010, one by Medicare and the other by the Harvard School of Public Health, which show that over the last decade the systems method has not reduced the rate of caregiver-caused injuries. Some attribute this to a lack of provider compliance with systems techniques. Compliance is difficult, time-consuming, and costly. It is also devilishly difficult to enforce and is therefore far from universal. The fact remains, however, that the data collection and analysis techniques used in these studies, and in all the antecedent studies of hospital adverse events, are blind by design to the errors of individual providers. In spite of this, a 2008 joint statement of the combined licensing boards of Canada and the U.S. called for greater leniency toward doctors reported to the boards, since many of them were victims of flawed systems.
One of the main avatars of the systems approach published a book in 2010. Dr. Peter Pronovost, an anesthesiologist and intensivist at Johns Hopkins University School of Medicine, has written Safe Patients, Smart Hospitals about his career-long quest for medical safety. It is a well-written and important contribution. Dull, routine, common bedside invasive procedures can be deadly if not done to perfection. In his most important contribution, Pronovost persuaded many ICU doctors to use maximal sterile precautions when inserting central intravenous lines, which otherwise become infected 4% of the time and kill 1% of patients. In Michigan, the site of his greatest triumph, Dr. Pronovost’s disciples pushed the infection rate down to 0% and have kept it there, saving thousands of lives.
Pronovost’s way of doing things replaces “See one, do one, teach one,” the method of passing the tribal knowledge of bedside procedures from one resident to another, with an essential triad: a checklist of steps and equipment, a change in the culture of the hospital, and the meticulous recording of outcomes.
Despite his successes, Dr. Pronovost is not a systems purist. He is wise and experienced enough to know that there is another side to medical errors, one that has not been studied systematically. These situations are resistant to the systems method because they involve stubborn flaws in individual caregivers. He describes two egotistical, uncooperative physicians who endangered patients’ lives. In the first episode, Pronovost pleaded with a surgeon over the phone to come in and see a dying patient, operated on the previous day, who now had a high fever and was in shock. The surgeon refused to come in. A second surgeon took the case, operated, and discovered holes in the pancreas and small intestine caused by the original surgery. In the second episode, Pronovost was giving anesthesia to a patient whose blood pressure dropped abruptly and responded only transiently to epinephrine. He diagnosed an allergy to the latex in the surgeon’s gloves. The surgeon changed gloves only after Pronovost threatened to call the medical school dean and the chief administrator.
Dr. Pronovost searched fruitlessly for data on the persistence of errors in diagnosis and treatment, a concern of his ever since his father succumbed to blood cancer. When Peter Pronovost was a college student, the doctor who initially managed his father’s leukemia did not suggest a bone marrow transplant. A credible second opinion came too late; a transplant might have saved his life.
There are other types of errors that are more reasonably attributed to flawed individuals than to flawed systems. I have culled good examples from newspaper headlines: 600 unnecessary cardiac surgeries in Redding, California; 100 poorly done radioactive implants for prostate cancer in Philadelphia; and 550 inappropriate coronary artery stents in Baltimore. These occurrences raise the questions: What is the best way to search for the sources of medical errors? Is it more appropriate to look for bad systems, bad doctors, or both?
Avoiding physician errors of denial, availability, overtreatment, and poor diagnosis and treatment will require a greater data-gathering effort and a greater change in hospital culture than what Dr. Pronovost has achieved in preventing hospital-caused infections. There are great ethical and political difficulties in gathering the data and enormous cultural barriers to change within the profession. I witnessed these daily when I was a medical staff leader. Is there a Peter Pronovost to address these concerns?