
Core Process

An overview of the core process and its elements is given in Table 83-2. The elements are explained in detail in the following sections.

Observation

Management of rapidly changing situations requires the anesthetist to assess a wide variety of information sources. These include visual observation of the patient and the surgical field, visual inspection of a multitude of displays from electronic monitors, visual observation of the activities of nurses and the contents of suction canisters and sponges, listening for normal and abnormal sounds from the patient and equipment, interpreting radiographs, and reading the reports of laboratory test results. Because the human mind can attend closely to only one or two items at a time, the anesthetist's supervisory control level must decide what information to attend to and how frequently to observe it.

TABLE 83-3 -- Methods for verification of critical observations
1. Repeating: The observation or measurement is repeated to rule out a temporary erroneous value (e.g., motion artifacts during noninvasive blood pressure measurement).
2. Checking trend information: The short-term trend is observed to judge the plausibility of the current value. Trends of physiologic parameters almost always follow curves, not steps.
3. Observing a redundant channel: An existing redundant channel is checked (e.g., invasive arterial pressure and cuff pressure are redundant, as are heart rate from the electrocardiogram (ECG) and from the pulse oximeter).
4. Correlating: Multiple related (but not redundant) variables are correlated to determine the plausibility of the parameter in question (e.g., the ECG monitor shows a flat line and "asystole," but the invasive blood pressure trace still shows pulsatile waves).
5. Activating a new monitoring device: A new monitoring modality is installed (e.g., placing a pulmonary artery catheter). This also adds another parameter for the method of "correlating."
6. Recalibrating an instrument or testing its function: The quality and reliability of a measurement are checked and the instrument's function is tested (e.g., if the CO2 detector shows no values, the anesthetist can exhale through it to confirm that the device works). Observation of redundant channels can also help to verify a value (see above).
7. Replacing an instrument: If there is doubt about the function of a device, an entirely new instrument or an alternative back-up device may be installed.
8. Asking for help: If the interpretation of the values remains unclear, help should be sought early to obtain a second opinion from other trained personnel.

The multitasking involved in observing multiple data streams was probed by the experiments measuring secondary task performance and vigilance. The realistic simulation studies demonstrated the large number of information sources actually used during the response to a critical event. Routine parts of the core process operate primarily at the sensory, motor, and procedural levels and are executed repetitively throughout the course of an anesthetic. The results from the University of California San Diego (UCSD) group and the Department of Veterans Affairs/Stanford University (VA-Stanford) group comparing the vigilance of experienced anesthetists with that of novices suggested that the novices had not fully developed their core process as a highly automated function, requiring them to devote more mental resources to routine activities.

Verification

In the OR environment, the available information is not always reliable. Most monitoring is noninvasive and indirect and is therefore susceptible to artifacts (false data). Even direct clinical observations such as vision or auscultation can be ambiguous. Brief transients (true data of short duration) can also occur that quickly correct themselves. To prevent artifacts and transients from triggering precipitous actions that may have significant side effects, critical observations must be verified before the clinician acts on them. Verification uses a variety of methods, shown in Table 83-3.

It should be noted that, if in doubt, it should always be assumed that the patient is at risk and that the parameter in question is real ("rule out the worst case"). The assumption of a technical artifact should be the last step.
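Read as a procedure, the methods of Table 83-3 form an ordered escalation: the cheap, fast checks come first and asking for help comes last, with the worst case assumed until a check positively explains the value away. The Python sketch below illustrates that ordering under invented assumptions; the parameter names, example values, and thresholds are hypothetical and are not drawn from any monitoring standard.

```python
# Minimal sketch of the verification cascade in Table 83-3 (illustration only).
# The parameter names, example values, and thresholds are hypothetical.

def verify_observation(value, repeated_value, trend, redundant_value):
    """Walk the cheap checks of Table 83-3 in order and report the outcome.

    Following "rule out the worst case," the value is treated as real
    unless a check positively explains it as artifact.
    """
    # 1. Repeating: a transient artifact usually does not reproduce.
    if repeated_value is not None and abs(repeated_value - value) > 0.2 * max(abs(value), 1.0):
        return "repeat disagrees: suspect transient artifact, repeat again"

    # 2. Checking trend information: physiology follows curves, not steps.
    if trend and abs(value - trend[-1]) > 3.0 * _mean_step(trend):
        return "step change inconsistent with trend: verify on another channel"

    # 3. Observing a redundant channel (e.g., ECG vs. pulse oximeter heart rate).
    if redundant_value is not None and abs(redundant_value - value) <= 0.2 * max(abs(value), 1.0):
        return "confirmed by redundant channel: act on the value"

    # 4.-8. Correlating, adding monitors, recalibrating, replacing the device,
    # and asking for help all cost time and attention; until they are done,
    # the value is assumed to be real.
    return "unconfirmed: assume the patient is at risk and escalate verification"

def _mean_step(trend):
    steps = [abs(b - a) for a, b in zip(trend, trend[1:])]
    return max(sum(steps) / len(steps), 1e-6) if steps else 1e-6

# Example: heart rate of 30 from the ECG, 72 on repeat, with a stable trend.
print(verify_observation(30, 72, [70, 71, 72, 71], redundant_value=None))
```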

Knowing when and how to verify data is a good illustration of strategic knowledge (metacognition). For example, the anesthetist must decide under what conditions it is useful to invest time, attention, and energy in establishing a new information source (e.g., a pulmonary artery catheter) in the middle of a case, rather than relying on more indirect information sources that are already in place.

Problem Recognition

Anesthetists are taught the importance of scanning their instruments and environment, but they must also use these observations to decide whether the patient's course is "on track" or whether a problem is occurring. If a problem is found, a decision must be made as to its identity and its importance. This process of problem recognition (also known as situation assessment) is a central feature of several theories of cognition in complex, dynamic worlds.[71] [95] [116] [117] [118] [119]

Problem recognition involves matching sets of environmental cues to patterns that are known to represent specific types of problems. Given the high uncertainty seen in anesthesia, the available information sources cannot always disclose the existence of a problem, and even if they do, they may not specify its identity or origin. Westenskow's[120] experiment with the intelligent alarm system mainly probed these parts of problem recognition. In this experiment, subjects were already alerted to the existence of a problem by an alarm, so they could immediately focus their attention specifically on information sources concerning ventilation. In 11 cases, the fault could not be identified, but there was successful compensation for the fault, as described in the next paragraphs.

The supervisory control level mediates the decision when a clear-cut match or "diagnosis" cannot be made. Anesthetists and other dynamic decision-makers use approximation strategies to handle these ambiguous situations; psychologists term such strategies heuristics.[121] One heuristic is to categorize what is happening as one of several "generic" problems, each of which encompasses many different underlying conditions. Another is to gamble on a single diagnosis (frequency gambling[98] ), initially choosing the single most frequent candidate event. During preoperative planning, the anesthetist may adjust a mental "index of suspicion" for recognizing certain specific problems anticipated for that particular patient or surgical procedure. The anesthetist must also decide whether a single underlying diagnosis explains all the data, or whether these data could be due to multiple causes.[63] [88] [122] This decision is important because excessive attempts to refine the diagnosis can be very costly in terms of allocation of attention. In contrast, a premature diagnosis can lead to inadequate or erroneous treatment.[123]
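The interplay of frequency gambling and the preoperative "index of suspicion" can be pictured as a simple ranking of candidate diagnoses by baseline frequency, scaled up for problems anticipated in the particular patient. The sketch below is only an illustration; the candidate events, base rates, and weights are invented for the example and are not clinical data.

```python
# Illustration of frequency gambling with a preoperative "index of suspicion."
# The candidate events, base rates, and weights are invented for this example.

BASE_RATE = {            # rough, invented prior frequency of causes of hypotension
    "hypovolemia": 0.40,
    "deep anesthesia": 0.30,
    "anaphylaxis": 0.05,
    "tension pneumothorax": 0.02,
}

def rank_candidates(base_rate, suspicion):
    """Return candidates ordered by base rate scaled by case-specific suspicion."""
    scored = {dx: p * suspicion.get(dx, 1.0) for dx, p in base_rate.items()}
    return sorted(scored, key=scored.get, reverse=True)

# Preoperative planning: a patient history that raises the index of suspicion
# for anaphylaxis changes which candidate the gamble starts from.
suspicion = {"anaphylaxis": 10.0}
print(rank_candidates(BASE_RATE, {}))         # frequency gamble alone
print(rank_candidates(BASE_RATE, suspicion))  # gamble adjusted by suspicion
```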

The use of heuristics is typical of expert anesthetists and often results in considerable time savings in dealing with problems. However, it is a two-edged sword. Both frequency gambling and the allocation of attention solely to expected problems can seriously derail problem-solving when these gambles do not pay off, especially if the reevaluation component of the core process does not correct the situation.

Prediction of Future States

Problems must be assessed in terms of their significance for the future states of the patient.[71] [116] Those problems that are already critical or that can be predicted to evolve into critical incidents receive the highest priority. Prediction of future states also influences action planning by defining the time frame available for required actions. Cook and Woods[63] described "going sour" incidents in which the future state of the patient was not adequately taken into account when early manifestations of problems were apparent. It is also known from research in psychology that the human mind is not well suited to predicting future states, especially when things are changing in a nonlinear fashion.[124] [125] [126] Under such circumstances, which are not uncommon for natural systems like the human body, the rate of change is almost invariably underestimated, and people are surprised at the outcome.[127] A slow but steady and sustained blood loss in a child during surgery may produce few or only subtle changes in hemodynamics for some time until a rapid decompensation occurs. If the weak signs of the developing problem are not detected, the ensuing catastrophe may seem to have occurred "suddenly."
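A toy calculation makes the nonlinearity concrete: as long as compensatory mechanisms keep pace with the loss, the measured pressure changes little, so the rate of deterioration is easy to underestimate until the reserve is exhausted. The blood volume, loss rate, and compensation model below are invented solely to illustrate the shape of the curve and do not describe pediatric physiology.

```python
# Toy illustration of nonlinear decompensation during slow, steady blood loss.
# The blood volume, loss rate, and compensation model are invented numbers.

blood_volume_ml = 1200.0       # hypothetical small child
loss_ml_per_min = 8.0          # slow but sustained surgical bleeding
baseline_map = 65.0            # mean arterial pressure, mm Hg

for minute in range(0, 91, 15):
    lost_fraction = minute * loss_ml_per_min / blood_volume_ml
    if lost_fraction < 0.30:
        # Compensation (vasoconstriction, tachycardia) hides most of the loss.
        map_mmhg = baseline_map * (1.0 - 0.2 * lost_fraction)
    else:
        # Beyond the compensatory reserve, pressure falls steeply.
        map_mmhg = baseline_map * (0.94 - 2.5 * (lost_fraction - 0.30))
    print(f"t={minute:3d} min  loss={lost_fraction:4.0%}  MAP~{map_mmhg:5.1f} mm Hg")
```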

Precompiled Responses and Abstract Reasoning

Having recognized a problem, how does the expert anesthetist respond? The classic paradigm of decision-making posits a careful comparison of the evidence with various causal hypotheses that could explain it.[61] This is then followed by a careful analysis of all possible actions and solutions to the problem. This approach, although powerful, is relatively slow and does not work well with ambiguous or scanty evidence. In complex, dynamic domains such as anesthesia, many problems require "decisions under uncertainty"[78] [128] with quick action to prevent a rapid cascade to a catastrophic adverse outcome. For these problems, deriving a solution through formal deductive reasoning from "first principles" is too slow.

In complex, dynamic domains, the initial responses of experts to the majority of events stem from precompiled rules or response plans for dealing with a recognized event.[82] [96] [98] This method is referred to as recognition-primed decision-making,[86] [129] because once the event is identified, the response is well known. In the anesthesia domain, these responses usually are acquired through personal experience alone, although there is a growing realization that critical response protocols need to be codified explicitly and taught systematically.[67] Experienced anesthetists have been observed to rearrange, recompile, and rehearse these responses mentally based on the patient's condition, the surgical procedure, and the problems to be expected.[63] [69] Ideally, precompiled responses to common problems are retrieved appropriately and are executed rapidly. When the exact nature of the problem is not apparent, a set of generic responses appropriate for the overall situation may be invoked. For example, if a problem with ventilation is detected, the anesthetist may switch to manual ventilation using a higher FIO2 while considering further diagnostic actions.
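Recognition-primed decision-making and the fallback to generic responses can be sketched as a lookup: a recognized event retrieves its precompiled plan, whereas a problem recognized only at the category level retrieves a generic plan while diagnosis continues. The events and response steps below are placeholders for illustration, not a clinical protocol.

```python
# Sketch of recognition-primed retrieval of precompiled responses.
# The events and response steps are placeholders, not a clinical protocol.

PRECOMPILED = {
    "esophageal intubation": ["remove tube", "mask ventilate", "reintubate"],
    "breathing-circuit leak": ["switch to manual ventilation", "find and seal leak"],
}

GENERIC = {
    # Fallback when only the problem category is recognized.
    "ventilation problem": ["ventilate by hand with a higher FIO2",
                            "check circuit and tube while diagnosing"],
}

def choose_response(recognized_event, category):
    """Retrieve a precompiled plan if the event is recognized, else a generic one."""
    if recognized_event in PRECOMPILED:
        return PRECOMPILED[recognized_event]
    return GENERIC.get(category, ["call for help", "reason from first principles"])

print(choose_response("breathing-circuit leak", "ventilation problem"))
print(choose_response("unknown fault", "ventilation problem"))
```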

However, experiments involving screen-only[130] and realistic[131] [132] [133] [134] [135] simulators have demonstrated that even experienced anesthetists show great variability in their use of response procedures in critical situations. This finding led these investigators to target simulator-based training at the systematic teaching of responses to critical events.[65] [132] [136] [137] [138]

Even the ideal use of precompiled responses is destined to fail when the problem does not have the suspected cause or when it does not respond to the usual actions. Anesthesia cannot be administered purely by precompiled "cookbook" procedures. Abstract reasoning about the problem utilizing fundamental medical knowledge still takes place in parallel with precompiled responses even when quick action must be taken. This seems to involve a search for high-level analogies[84] [102] or true deductive reasoning using deep medical and technical knowledge and a thorough analysis of all possible solutions. Anesthetists managing simulated crises have linked their precompiled actions to abstract medical concepts.[130] It is unclear whether this represented merely self-justification, as opposed to true abstract reasoning, in part because the particular simulated crises they faced did not require novel abstract solutions. At this time, the degree to which abstract reasoning is necessary for optimal intraoperative crisis management is unknown.[37] [65] [66] [85] [128] [139] [147]

Taking Action

A hallmark of anesthesia practice is that anesthetists do not just write orders in a patient's chart; they are directly involved in implementing the desired actions.

Although such direct involvement has many benefits in timeliness and flexibility of action, it also poses risks. Action implementation can usurp a large amount of the anesthetist's attention and can be distracting. This is particularly an issue when other tasks have been interrupted or temporarily suspended. The "prospective memory" to complete those tasks can be erased. (For a more detailed explanation of prospective memory, see below.) In addition, anesthetists engaged in a manual procedure are strongly constrained from performing other manual tasks, as demonstrated in several of the mental workload and vigilance studies described earlier.

Errors in executing a task are termed slips, as distinguished from errors in deciding what to do, which are termed mistakes.[90] [140] Slips are actions that do not occur as planned, such as turning the wrong switch or making a syringe swap. Thus, when critical incident[141] [142] and quality assurance studies described "technical errors" in using equipment, they were referring to slips, whereas "judgment errors" referred to mistakes. One particular type of execution error, termed a mode error,[93] is becoming more frequent in all domains with the increased use of microprocessor-based instrumentation and devices.[68] [152] [153] [154] In a mode error, actions appropriate for one mode of a device's operation are incorrect for another mode. An example in anesthesiology is the "bag/ventilator" selector valve in the anesthesia breathing circuit, which selects between two modes of ventilation. Failing to activate the ventilator when the selector is in the "ventilator" position can be catastrophic. Mode errors can also occur in monitoring or drug delivery devices that assign different functions to the same displays or switches depending on the mode of operation selected.
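A mode error can be modeled as a controller whose effect depends on a mode the operator has lost track of. The toy state machine below shows how the intention to ventilate delivers no breaths when the selected mode and the action taken do not match; the class and method names are invented and model no particular anesthesia machine.

```python
# Toy model of a mode error at a bag/ventilator selector valve.
# Class and method names are invented; this models no particular machine.

class BreathingCircuit:
    def __init__(self):
        self.selector = "bag"          # "bag" or "ventilator"
        self.ventilator_on = False

    def set_selector(self, mode):
        self.selector = mode

    def switch_on_ventilator(self):
        self.ventilator_on = True

    def patient_is_ventilated(self, squeezing_bag=False):
        # Breaths are delivered only if the action matches the current mode.
        if self.selector == "bag":
            return squeezing_bag
        return self.ventilator_on

circuit = BreathingCircuit()
circuit.set_selector("ventilator")
# Mode error: the clinician believes breaths are being delivered, but the
# ventilator was never activated for this mode.
print(circuit.patient_is_ventilated())     # False -> no ventilation
circuit.switch_on_ventilator()
print(circuit.patient_is_ventilated())     # True
```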

TABLE 83-4 -- Reevaluation questions—maintaining situation awareness
1. Was the initial situation assessment or diagnosis correct?
2. Did the actions have any effect (e.g., did the drug reach the patient)?
3. Is the problem getting better, or is it getting worse?
4. Are there any side effects resulting from previous actions?
5. Are there any new problems or other problems that were missed before?
6. What further developments can be expected in the (near) future?

Particularly dangerous slips of execution can be addressed through the use of engineered safety devices[146] that physically prevent incorrect actions. For example, newer anesthesia machines have interlocks that physically prevent the simultaneous administration of more than one volatile anesthetic drug. Other interlocks physically prevent the selection of a gas mixture containing less than 21% O2.
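Such an interlock can be thought of as a constraint check that runs before a setting is allowed to take effect. The sketch below enforces a minimum of 21% oxygen in the fresh gas mixture; the function and its interface are invented for illustration and do not reflect any real machine's software.

```python
# Sketch of a hypoxic-guard interlock: reject any setting below 21% oxygen.
# The function and its interface are invented for illustration only.

MIN_O2_FRACTION = 0.21

def set_gas_mixture(o2_lpm, n2o_lpm):
    """Return the accepted flows, raising O2 if the requested mix is hypoxic."""
    total = o2_lpm + n2o_lpm
    if total <= 0:
        raise ValueError("total fresh gas flow must be positive")
    if o2_lpm / total < MIN_O2_FRACTION:
        # Interlock: increase O2 so the mixture never falls below 21% O2.
        o2_lpm = MIN_O2_FRACTION * n2o_lpm / (1.0 - MIN_O2_FRACTION)
    return o2_lpm, n2o_lpm

print(set_gas_mixture(o2_lpm=0.2, n2o_lpm=4.0))   # O2 flow raised to keep >= 21%
```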

Certain very complex issues concerning human-machine interactions and the ways in which technology affects behavior in complex patient care environments are beyond the scope of this chapter. Other publications address these issues. *

Reevaluation

To cope with the rapid changes and the profound diagnostic and therapeutic uncertainties seen during anesthesia, the core process must include repetitive reevaluation of the situation. The reevaluation step thus returns the anesthetist to the observation phase of the core process, but with specific assessments in mind, as shown in Table 83-4.
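The core process can thus be pictured as a loop in which reevaluation feeds the next round of observation. The bare-bones sketch below shows only that loop structure; the callables passed in and the list of questions drawn from Table 83-4 stand in for the clinician's actual activities.

```python
# Bare-bones sketch of the core process as a loop with periodic reevaluation.
# The callables passed in are placeholders; only the loop structure is the point.

REEVALUATION_QUESTIONS = (
    "Was the initial assessment or diagnosis correct?",
    "Did the actions have any effect?",
    "Is the problem getting better or worse?",
    "Are there side effects from previous actions?",
    "Are there new or previously missed problems?",
    "What further developments can be expected?",
)

def core_process(observe, verify, assess, act, case_finished, max_cycles=10000):
    pending_questions = ()
    for _ in range(max_cycles):
        if case_finished():
            return
        observation = observe(pending_questions)   # observation guided by questions
        if not verify(observation):
            pending_questions = ()                  # suspected artifact: re-observe
            continue
        problem = assess(observation)
        if problem is not None:
            act(problem)
        # Reevaluation returns the clinician to observation, but with the
        # specific assessments of Table 83-4 in mind.
        pending_questions = REEVALUATION_QUESTIONS
```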

The process of continually updating the situation assessment and of monitoring the efficacy of chosen actions is termed situation awareness.[95] [116] [117] [118] [119] [149] [150] Situation awareness is an important concept in analyzing performance and the reasons for errors.[71] [151] [152] [153] Situation awareness issues in anesthesiology have been reviewed elsewhere.[71]

Prospective Memory

Prospective memory describes the ability to remember to perform an intended action at some point in the future. It is particularly prone to disruption by concurrent tasks or interruptions. Disruption of intentions or of ongoing tasks is common in everyday life and has also been described in pilots and air traffic controllers.[154] [155] [156] [157] [158] In anesthesia, for example, if the anesthetist suspends ventilation temporarily (say, to allow an x-ray to be taken), the intention to restart the ventilator depends on prospective memory and can easily be forgotten. Chisholm and associates studied "interruptions" and "breaks-in-task" in an emergency department and found that, during a 3-hour period, the emergency physician was faced with more than 30 interruptions and more than 20 breaks-in-task. It is likely that similar results would be found in the ICU.[159]


*See references [39] [41] [42] [44] [45] [46] [47] [48] [50] [51] [54] [68] [144] [147] [148] .


A variety of methods can help guard the prospective memory of intentions. Visual or auditory reminders can be used (alarms of physiologic monitors often serve this purpose, whether intended or not), although the effectiveness of such methods seems to be less than one would expect. Special actions—such as leaving one's finger on the ventilator switch—can also be used to indicate that an important intention is pending.
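Externalizing an intention as a reminder, rather than holding it in prospective memory, amounts to scheduling an alarm at the moment the intention is formed. The ventilator example below is a sketch; the timer interval and messages are invented.

```python
# Sketch of externalizing a prospective-memory intention as a timed reminder.
# The interval and messages are invented for the example.

import threading

def remind(after_seconds, message):
    """Schedule a stand-in for a visual/auditory reminder: print after a delay."""
    timer = threading.Timer(after_seconds, lambda: print(message))
    timer.start()
    return timer

# Usage sketch: form the intention and externalize it at the same moment.
# timer = remind(30.0, "REMINDER: restart the ventilator!")
# ...x-ray taken, ventilation restarted...
# timer.cancel()   # cancel once the intention has actually been completed
```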
