Complexity and Tight Coupling: Latent Errors in Anesthesia
Clearly, the anesthesia domain involves complex interactions with
tight coupling.[64]
The complexity stems to some degree from the variety of devices in use and their
interconnections, but these are, in truth, vastly simpler than those found in an oil
refinery, a 747 aircraft, or a space shuttle. A more important source of complexity
is the "uncertainty complexity" of the patient.[64]
The human body is an incredibly complex system containing numerous components, the
interactions of which are only vaguely understood. Because many body systems affect
each other, the patient is a major site of tight couplings. Furthermore, the anesthetic
state tends to ablate the buffers among some of these interconnected systems, thus
strengthening the coupling among them and between the patient and external mechanical
supports. Galletly and Mushet[201]
studied anesthesia
"system errors" and observed tight coupling associated with "the use of neuromuscular
blocking drugs, the presence of cardiorespiratory disease, certain types of surgical
procedures, and from the effect of the general anesthetic agents. Looser coupling
was observed with the use of high concentrations of oxygen and air mixtures, preoxygenation,
and spontaneous breathing techniques."
*See references [30], [81], [101], [103], [109], [110], [111], [112], [135], [206], [207], [220], [263], [264], [269], [270], [278], [279], [280], [307].
Figure 83-12 University of Texas Threat and Error Model. This model, which illustrates the evolution of errors, is used in the analysis of incidents and accidents (a detailed example is available at http://bmj.bmjjournals.com/misc/bmj.320.7237.781/sld001.htm).
A variety of latent failures can exist in the anesthesia environment.
They may include such issues as how surgical cases are booked, how cases are assigned
to specific anesthetists, what provisions are made for preoperative evaluation of
outpatients, and what relative priority is given to rapid turnover between cases
or to avoiding case cancellations, as opposed to the avoidance of risk. Latent
errors can also result from the design of anesthesia equipment and its user interfaces,
which in some cases lead clinicians to err or are unforgiving of errors. Manufacturing
defects and failures of routine maintenance are also sources of latent failures.
Eagle and coworkers[288]
described
a number of active failures and latent failures in a case report concerning a severely
ill patient who died 6 days after suffering aspiration of gastric contents during
general anesthesia for cystoscopy. The initial urologist in this case made an active
error by booking the case inappropriately to be performed under local anesthesia.
This active error interacted with other latent features of the system. For example,
the OR scheduling system improperly allowed the urologist to be assigned to two different
cases simultaneously, which led to the surgical procedure's being transferred to
another urologist who was unfamiliar with the patient. The second urologist requested
a general anesthetic, at which point an anesthetist, who was equally unfamiliar with
the patient, was brought into the situation. Through this combination of events,
the seriously ill patient did not receive a thorough evaluation in advance of his
surgery. Specifically, the anesthetist was not aware that the patient had suffered
an episode of projectile vomiting at 4 AM on the
day of surgery. This information was available in the hospital's computerized record-keeping
system, but there was no computer terminal in the OR. The information was not contained
on the patient's chart. The nursing notes in the chart indicated that the patient
had been fasting for 24 hours. The second urologist and the anesthetist believed
that the case was an urgent addition to the OR list. They decided to go ahead with
the case despite their cursory evaluation of the patient.
The analysis of Eagle and coworkers reinforced the concept that
investigation of untoward events must address both latent and active failures and
both the organizational and managerial environment and the operational domain. One
risk of focusing solely on active failures is that the operational personnel believe
themselves to be victims of the system, making them defensive and uncooperative.
Rasmussen,[96] as well as Cook, Woods, and associates,[63][88][122][283] pointed out that if one looks at the chain
of events in an accident sequence, one can always find a failure on the part of an
operator. If the analysis stops at this point, the operator (i.e., the anesthetist)
may be wrongly blamed for a failure, the real roots of which go back to latent failures
in the organization. If the underlying latent errors are never identified or never
fixed, they remain in the system and will likely induce another accident chain in
the future. This is visualized in Reason's accident trajectory, shown in Figure 83-13.
Figure 83-13
James Reason's model of accident causation. Latent failures
at the managerial level may combine with psychological precursors and event triggers
at the operational level to initiate an accident sequence. Most accident sequences
are trapped at one or more layers of the system's defenses. The unforeseen combination
of organizational or performance failures with latent errors and triggers may lead
to a breach of the system's defenses, allowing the accident to happen. The diagram
should be envisioned as being three-dimensional and dynamic, with "shields" moving
around and holes in the defenses opening and closing.[320]
(Redrawn from Reason JT: Human Error. Cambridge, Cambridge University
Press, 1990.)
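The layered, dynamic character of Reason's model lends itself to a simple numerical illustration. The short Python sketch below is not from the chapter; the layer names and gap probabilities are invented assumptions chosen only for illustration. It simulates triggered events passing through several independent defenses whose "holes" open, close, and shift from event to event, and shows why most accident sequences are trapped while only a rare, unforeseen alignment of gaps produces the full trajectory depicted in Figure 83-13.

import random

# Illustrative sketch only: a toy Monte Carlo version of Reason's layered-defense
# ("Swiss cheese") idea. Each defensive layer has a small, time-varying probability
# of a "hole" being open; a triggered event becomes an accident only if it finds
# every layer breached at the same moment. Layer names and probabilities are
# hypothetical values chosen for illustration, not data from the chapter.

LAYERS = {
    "preoperative evaluation": 0.10,
    "equipment checkout": 0.05,
    "intraoperative monitoring": 0.08,
    "clinician vigilance": 0.10,
}

def event_penetrates(rng: random.Random) -> bool:
    """Return True if one triggering event passes through every defensive layer."""
    for base_p in LAYERS.values():
        # Holes "move around": jitter each layer's gap probability per event.
        p_open = base_p * rng.uniform(0.5, 1.5)
        if rng.random() >= p_open:
            return False  # this layer traps the accident sequence
    return True  # every layer happened to be breached simultaneously

def simulate(n_events: int = 1_000_000, seed: int = 1) -> None:
    rng = random.Random(seed)
    accidents = sum(event_penetrates(rng) for _ in range(n_events))
    print(f"{accidents} accidents in {n_events:,} triggered events "
          f"({accidents / n_events:.1e} per event)")

if __name__ == "__main__":
    simulate()

With these arbitrary figures, the expected breach rate is roughly the product of the layer probabilities (about 4 x 10^-5, or on the order of 40 per million triggered events), which is why nearly all sequences are trapped at some layer; tightening any single defense reduces the rate multiplicatively, whereas an unrecognized latent failure that widens one layer's holes raises it just as sharply.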