AAOS Bulletin - June, 2006

Human factors in medical errors

Lessons from other fields applied to health care offer potential solutions

By John H. Harp, MD

April 26, 2006, marked the 20th anniversary of the Chernobyl nuclear accident. This unprecedented disaster released 100 times more radiation than the detonations at Hiroshima and Nagasaki. The subsequent investigation revealed that a flawed reactor design, organizational factors and human error led to this landmark event.

Technologies that are inherently hazardous—such as space travel, nuclear power generation and aircraft carrier operations—are a part of everyday life. Although terrible events occasionally occur, they are rarely repeated. Improved facility and equipment design, increased awareness of organizational failure modes and the development of operation and control systems that accommodate human factors are the reasons these potentially high-risk industries continue to perform safely and reliably.

Another potentially hazardous industry is health care. The ancient dictum, “primum non nocere,” recognizes that any medical intervention could lead to patient injury. The majority of these interventions achieve the desired goal of improving a patient’s health. When things go wrong, however, the same three components are present: the applied technology, the organization providing the technology, and the element of human error.1 In addition, there is the patient.

Carrying over a framework

The human interface with complex, high-risk technologies has been examined in the literature. But little research addresses the human component in adverse events in health care. It is worthwhile to see whether the literature addressing the traditional technologies of space travel, nuclear power generation and transportation can be used to develop a framework for examining the role of human error when adverse events occur in the practice of orthopaedic surgery.

First, a model must be employed to describe adverse events in health care, specifically in orthopaedic surgery. One useful model is based on work by W. van Vuuren2 and is shown in Figure 1. As defined by T.A. Brennan, an adverse event is “…an injury resulting from medical treatment, as opposed to the underlying disease process, that prolonged a patient’s hospitalization, caused disability at the time of discharge, or both.”3

Dangerous situations occur as a result of failures in one or more of the interacting components, such as the surgeon, organizational factors or technical factors. In this figure, surgeon refers to the human involved in the process, not necessarily the surgeon of record. Organizational factors include the systems outside the surgeon’s direct control—such as the hospital, ambulatory surgical center, clinic, emergency department or operating room where a dangerous event occurs.

Technical factors are anything other than the human or organization involved in the care of the patient. Technical factors in the operating room range from the instruments and implants used to the supplies, drapes and environment itself. In the clinic, technical factors would include communications systems as well as equipment, instruments and supplies. The patient factors component could encompass problems caused by the patient’s disease process or modifiable factors the patient has not addressed, such as failure to comply with instructions.

Once an event starts, there is usually a return to safe conditions because of the inherent defenses in the overall system. When these defenses are inadequate, an adverse event occurs.
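One way to picture the flow of the model is as a rough sketch in code. The component names below follow the figure as described here; everything else is my own illustrative assumption, not van Vuuren’s formulation.

from enum import Enum, auto

class Factor(Enum):
    """Interacting components described in Figure 1."""
    SURGEON = auto()         # the human involved in the process
    ORGANIZATIONAL = auto()  # hospital, surgical center, clinic, ED or OR systems
    TECHNICAL = auto()       # instruments, implants, supplies, environment
    PATIENT = auto()         # disease process, non-compliance

def event_outcome(failed: set, defenses_hold: bool) -> str:
    """A failure in any component creates a dangerous situation; intact
    defenses return the system to safe conditions, otherwise an adverse
    event occurs."""
    if not failed:
        return "safe conditions"
    return "return to safe conditions" if defenses_hold else "adverse event"

# Example: a technical failure that the system's defenses catch
print(event_outcome({Factor.TECHNICAL}, defenses_hold=True))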

The human component

Within this model, we can focus on the human component. The human factor may or may not be present in a particular adverse event. When the human factor is present, an error has occurred. A framework devised by Rasmussen is helpful in describing error modes at this level. This framework classifies human performance at three levels: skill, rules and knowledge.4 In problem solving, a person may use one level at a time, different levels simultaneously or switch between levels.

Skill-based activities involve the execution of stored, preprogrammed directives. Examples of orthopaedic skill-based activities include suturing, making a cast or splint, or interpreting X-rays. Failure modes at this level can take numerous forms, but are usually slips and lapses caused by inattention or overattention at critical steps in the action sequence. Interruptions and distractions can also disrupt the performance of these automatic routines, such as “forgetting” to sign a progress note after receiving a phone call at the nurse’s station.

Rule-based activities attack familiar problems with stored instructions such as “if the forearm fracture angulation is less than 10 degrees and the patient is less than 10 years old, then the reduction is acceptable.” Errors at this level are usually the result of misapplication of a good rule, or application of a bad rule. Examples specific to orthopaedic surgery include the failure to implement a procedure to prevent wrong-site surgery or the administration of a prophylactic antibiotic to a patient who is allergic to that drug.
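Purely for illustration, the quoted rule can be written as an explicit conditional; the function name and thresholds below simply restate the example above and are not clinical guidance.

def reduction_acceptable(angulation_degrees: float, age_years: float) -> bool:
    """Rule-based decision: a stored if-then rule applied to a familiar problem.
    Thresholds restate the example rule quoted above; illustrative only."""
    return angulation_degrees < 10 and age_years < 10

print(reduction_acceptable(angulation_degrees=8, age_years=7))    # True
print(reduction_acceptable(angulation_degrees=15, age_years=12))  # False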

People with high levels of expertise in an area (such as experienced orthopaedic surgeons) tend to perform primarily in the rule- or skill-based domains. Experts have many more rules—usually abstract in nature—available to solve problems. They are also more likely to make an error when applying a learned rule in a “strong-but-wrong” form.

Knowledge-based activities occur in novel situations when solutions to problems are being formed in real time, using analysis and stored knowledge. Orthopaedic surgeons commonly use this performance domain when making clinical decisions in the emergency department or examination room. Essentially, the surgeon uses stored facts to build a custom solution to a patient’s particular dilemma.

Mental shortcuts

The human mind, however, is less than perfect when solving open-ended problems with many variables. The tendency is to create a solution in the most efficient manner by “shortcutting” the process and using well-described biases or “cognitive dispositions to respond.”5 These errors include, but are not limited to, jumping to conclusions, seeing what you expect to find, availability/non-availability heuristics, bias toward action rather than inaction, confirmation bias, diagnosis momentum, fundamental attribution error, gambler’s fallacy, omission bias, order effects, overconfidence bias, representativeness restraint, Ockham’s razor error, triage cueing and countertransference. As you can see, a wide variety of cognitive distortions can influence knowledge-based problem solving.

Reason4 believes that two modes of knowledge-based problem solving are inherent to human thinking: similarity matching and frequency gambling. In similarity matching, people solve problems by matching a new situation to previously solved problems. If there is no match, people gamble that the answer most frequently used in similar past situations will be effective.
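A minimal sketch of these two modes, assuming past experience is stored as (situation, solution) pairs and that a caller supplies a similarity test (both assumptions are mine, not Reason’s), might look like this:

from collections import Counter

def knowledge_based_answer(new_case, past_cases, is_similar):
    """Similarity matching first: reuse the solution of the first past case
    that matches the new situation. If nothing matches, fall back to
    frequency gambling: bet on the solution used most often in the past."""
    for old_case, solution in past_cases:
        if is_similar(new_case, old_case):
            return solution
    if not past_cases:
        return None  # no experience to gamble on
    counts = Counter(solution for _, solution in past_cases)
    return counts.most_common(1)[0][0]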

The three levels of activity form a hierarchy: skill-based activities are automatic and the most reliable, while knowledge-based activities are the most complex, the most subject to human biases or “cognitive dispositions to respond,” and the most prone to error. This is true across all highly technical, potentially hazardous industries.

Safety strategies

Industries outside health care have used these concepts to improve safety and reliability in many ways. Several strategies become apparent when the human component in adverse events is examined.

One strategy to reduce errors in the less reliable domain of knowledge-based performance is to move problem solving to the rule- and skill-based domains. This means simplifying tasks. For example, clinical algorithms can be used to determine next steps. This, however, is highly controversial and subject to emotional debate about restricting physicians’ freedom to practice medicine as an art rather than out of a textbook. But the evidence from other technical industries indicates that such a move will lower error rates.

Another strategy to reduce errors would be to limit distractions when problem solving. Setting aside time free from external interruptions is important when reviewing laboratory data or other reports. Controlling or eliminating distractions while operating, or while making decisions in the emergency room, may also lead to better decisions.

Likewise, using checklists or preprinted orders will help prevent errors that result from working from memory. Slips, lapses and other errors can occur when using a memorized order list or preoperative working sequence. Checklists and preprinted orders can be effective countermeasures.
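For illustration only, a written checklist can be represented as a short list that is checked off rather than recalled; the items below are hypothetical examples, not a recommended protocol.

PREOP_CHECKLIST = [
    "surgical site marked and verified",
    "imaging available and reviewed",
    "prophylactic antibiotic ordered and allergies checked",
    "implants and instruments confirmed",
]

def items_still_open(completed: set) -> list:
    """Return every checklist item not yet confirmed, so nothing is left
    to memory."""
    return [item for item in PREOP_CHECKLIST if item not in completed]

print(items_still_open({"surgical site marked and verified"}))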

Emergency rooms are a reliable source of higher-risk patients requiring higher levels of problem solving. Because the human mind itself is fallible, surgeons need to make sure all diagnoses are considered and not eliminated prematurely. Furthermore, surgeons should be aware of, and have a basic understanding of, the biases that affect decision making.

Simulator training is a useful method for gathering data about human error modes. Although its use is just beginning in the health-care industry, simulator training is gaining popularity as an alternative to patient-based teaching methods.

The study of human error and its impact on patient care is just beginning, but it should improve the overall safety of our health-care system.

John H. Harp, MD, is a member of the AAOS Medical Liability Committee. He can be reached at harpjh@tetonortho.com.

References

1. Reason J. Managing the Risks of Organizational Accidents. Ashgate Publishing, 1997.

2. Van Vuuren W. Organisational failure: An exploratory study in the steel industry and the medical domain. Eindhoven University of Technology, 1998.

3. Brennan TA, Sox CM, Burstin HR. Relation Between Negligent Adverse Events and the Outcomes of Medical Malpractice Litigation. N Engl J Med 1996; 335(26): 1963-67.

4. Reason J. Human Error. Cambridge University Press, 1990.

5. Croskerry P. Achieving Quality in Clinical Decision Making: Cognitive Strategies and Detection of Bias. Acad Emerg Med 2002; 9: 1184-1204.

