
SAFETY CULTURE OF THE OPERATING ROOM ENVIRONMENT

Analyzing the Anesthetist's Domain: The Operational World Versus the Organizational World

Psychologists and cognitive engineers who study real work environments describe each field as a domain. Each domain has specific characteristics that set it apart from others, including the nature of the tasks to be performed, the relationships among the tasks, the time scale over which they must be executed, and the criteria for their successful performance. This chapter deals primarily with the operational domains in which anesthesia care is delivered, chiefly the OR, the PACU, and the ICU. However, as pointed out by Reason,[82] [127] [200] [281] [282] as well as by Woods, Cook, and colleagues,[62] [63] [122] [283] what goes on in the operational domain is extensively shaped by the organizational and managerial environment in which it is located, even to the point that operational personnel believe themselves to be the "victims" of problematic decisions further back in the system.

In everyday practice, these distinctions are hidden or blurred. The positive and negative contributions of the organizational and management elements are often so embedded in the normal routine that they are difficult to isolate. Interesting information about the system often comes from considering abnormal situations, accidents, or near misses rather than normal events. For example, the investigation of the loss of the space shuttle Columbia revealed a variety of latent errors within the procedures and safety culture at NASA: "The accident was probably not an anomalous, random event, but rather likely rooted in NASA's history and the space flight program's culture."[284] The report placed as much weight on these factors as on the direct technical causes of the accident. Traditionally, one speaks of errors arising in decisions and actions that led to a mishap. But the term "error" is increasingly considered an inappropriate way to categorize behaviors (being a judgment of attribution and blame) and should be thought of merely as a way to identify behaviors at the locus of a critical situation. In this context it must be understood that "errors are not the cause of an accident." Rather, errors are usually the consequences of the combination of several underlying factors. Beyond that, errors must combine with other circumstances to result in an accident or adverse outcome.[200] [285] This is illustrated in Figure 83-11.[286]


Figure 83-11 Model showing that errors are not the cause of accidents[200]: The "naïve view" that errors (E) are the direct cause of an accident is incorrect. As shown in the "modern view," a combination of underlying conditions (C1, C2, C3) are the real causes of an error. The error itself often requires further combination with contributing factors (CF4, CF5) to result in an accident. (Adapted from Rall M, Manser T, Guggenberger H, Gaba DM, Unertl K: [Patient safety and errors in medicine: Development, prevention and analyses of incidents.] Anästhesiologie, Intensivmedizin, Notfallmedizin, Schmerztherapie 36:321–330, 2001, with permission.)

Some errors are actively produced in the operational domain, whereas others are introduced by the organizational environment. James Reason, a psychologist at the University of Manchester, England, described the latter using the concept of "latent errors"[82] [281] :

... errors whose adverse consequences may lie dormant within the system for a long time, only becoming evident when they combine with other factors to breach the system's defenses. [They are] most likely to be spawned by those whose activities are removed in both time and space from the direct control interface: designers, high-level decision makers, construction workers, managers and maintenance personnel.

Latent errors probably exist in all complex systems, and Reason adopted a medical metaphor to describe them as "resident pathogens." Like microorganisms in the body, the resident pathogens remain under control until sets of local circumstances "combine with these resident pathogens in subtle and often unlikely ways to thwart the system's defenses and bring about its catastrophic breakdown"[82] (see Fig. 83-13). This threat and error model was also well demonstrated by Helmreich at the University of Texas ( Fig. 83-12 ).[287]
