
Normal Accident Theory

NAT focuses primarily on two features of a system: (1) the complexity of interactions among the system's elements and (2) the presence of tight coupling among those elements. A system is tightly coupled when a change in one part rapidly alters other parts. For example, some physiologic systems are buffered from changes in others, whereas certain core components, such as oxygen delivery and blood flow, are tightly coupled and interact strongly. The patient's physiology may also become tightly coupled to external systems such as ventilators and infusions of hemodynamically active drugs. When complexity and tight coupling coexist, abnormal sequences of events can be hidden and can have complex or unpredictable consequences. Typically, active errors in the system do not result in an accident because they are trapped at some point by the system's multiple layers of checks and defenses (see Fig. 83-13). When interactions are complex and coupling is tight, however, even a minor perturbation can push normal system behavior out of control. Perrow called this a "normal accident" because the perturbations are common and arise out of otherwise normal system operations. He suggested that attention should be directed at strengthening the recovery pathways by which small events can be handled properly before they evolve into a serious accident.



TABLE 83-11 -- Complementary views of high reliability organization theory (HROT) versus normal accidents theory (NAT)

HROT: Accidents can be prevented through good organizational design and management.
NAT: Accidents are inevitable in complex and tightly coupled systems.

HROT: Safety is the priority organizational objective.
NAT: Safety is one of a number of competing objectives.

HROT: Redundancy enhances safety; duplication and overlap can make a reliable system out of unreliable parts.
NAT: Redundancy often causes accidents: it increases interactive complexity and opaqueness and encourages risk-taking by social shirking and cue-taking behavior.

HROT: Decentralized decision making is needed to permit prompt and flexible field-level responses to surprises.
NAT: Organizational contradiction: decentralization is needed to handle distributed complexity, but centralization is needed to manage tightly coupled systems.

HROT: A "culture of reliability" will enhance safety by encouraging uniform and appropriate responses by field-level operators.
NAT: A military model of intense discipline, socialization, and isolation is incompatible with democratic values.

HROT: Continuous operations, training, and simulations can create and maintain high-reliability operations.
NAT: Organizations cannot train for unimagined, highly dangerous, or politically unpalatable operations.

HROT: Trial-and-error learning from accidents (through effective incident reporting) can be effective and can be supplemented by anticipation and simulations.
NAT: Denial of responsibility, faulty reporting, and biased reconstruction of history frequently cripple learning efforts.

Modified from Sagan SD: The Limits of Safety. Princeton, Princeton University Press, 1993.

According to the NAT view, we delude ourselves into believing that we can control our hazardous technologies and forestall disaster; in reality, many of our efforts at management and design only increase the opacity and complexity of the system (creating more holes in the barriers), thereby increasing the likelihood of accidents. The combination of these factors provides fertile ground for accidents; indeed, according to NAT, it is inevitable that some of the "normal" everyday faults, slips, and incidents will evolve into tragic accidents. NAT is therefore often considered a "pessimistic" view of the ability of organizations to conduct high-hazard operations without harmful errors.

The very concept of risk is constructed and negotiated. This was perhaps expressed most clearly by Diane Vaughan[306] in her powerful analysis of the explosion of the Challenger space shuttle:

Risk is not a fixed attribute of some object, but constructed by individuals from past experience and present circumstance and conferred upon the object or situation. Individuals assess risk as they assess everything else—through the filtering lens of individual worldview.

Vaughan's main thesis is that the Challenger exploded not because "risk-taking" managers "broke the rules," but because the system evolved to make excessive risk a part of "following the rules." This happened because the system had a "culture of production" with ingrained production pressure, because aberrant findings were routinely explained away ("normalization of deviance"), and because "structural secrecy" was maintained between departments, between manufacturers and NASA, and between engineers and managers. Sadly, many of these same phenomena occurred yet again in the events leading to the Columbia accident.[284] Unfortunately, each of these system characteristics is also operative in perioperative settings.
