
SIMULATION FOR EDUCATION AND TRAINING OF MEDICAL AND OTHER HEALTH PROFESSION STUDENTS

The various available patient simulators offer an unparalleled opportunity for students to experience physiologic and pathophysiologic reactions in a realistic manner. Different objectives are possible for teaching sessions. A study by Morgan and Cleave-Hogg concluded that "the simulator environment is somehow unique and allows different behaviors to be assessed."[77] The objectives of "education" are to provide conceptual understanding and an introduction to skills, whereas the objective of "training" is to instill specific skills and behaviors that are directly applicable to real-world settings. Consistency in objectives is important for evaluating education and training; for example, if we are teaching conceptual understanding but assessing specific skills, we may obtain misleading results.[109]

A recent study by Coates and colleagues demonstrated that simulator education for medical students on patient management skills related to dyspnea was superior to problem-based learning.[110] A variety of curricula exist,[42] [109] [111] [112] [113] [114] [115] [116] [117] [118] [119] [120] including

  1. Educational sessions and demonstrations concerning basic cardiopulmonary physiology or pharmacology; in some institutions, these sessions replace animal laboratory exercises.[121]
  2. Introduction to integrated management (i.e., interweaving of diagnosis and treatment) of critically ill patients; this training is offered to medical students in their final preclinical period as part of their preparation for clinical medicine courses.
  3. Anesthesia practicum for clinical students during anesthesia or intensive care clerkships.[122]
  4. Problem-solving skills training. Simulation has been tested as a means of educating and training students in the appropriate response to specific critical events. Morgan and coworkers tested simulation versus instructor-facilitated viewing of a videotape for this purpose[123] and found no difference between the two. However, the generalizability of this finding has been questioned.[109]
  5. Adaptation of curricula originally aimed at experienced trainees or practicing clinicians to students. There is increasing interest in exposing medical students, and possibly also students of nursing and allied health professions, to interdisciplinary team aspects of medical practice early in their education. The aim is to inculcate concepts of teamwork and communication at the earliest stages of education and training as a vehicle for long-term culture change in the health care system.

Simulation for Training of Anesthesia Residents

The use of patient simulation for the education and training of anesthesia residents has exploded over the past 10 years. Even without definitive proof of effectiveness, many residency programs voted with their feet and established simulator training programs.[123A] [123B] The Netherlands and Belgium, as well as Sweden, are about to make ACRM courses compulsory, as they now are in Denmark. The German Anesthesia Society recently decided to supply all university hospitals in Germany with patient simulators for the establishment of widespread training of medical students and anesthesia residents. Their guidelines have been published (in German, but an English translation of the standards is available at www.medizin.uni-tuebingen.de/psz/english/).

Published Experience with Simulation for Residents

The screen-based Anesthesia Simulator (Anesoft, WA) with built-in debriefing capabilities, formerly known as the Anesthesia Simulator Recorder or Consultant, was highly rated by residents and faculty and is now commercially available as a resource for many anesthesia training institutions. When asked, the vast majority of residents at the University of Washington acknowledged their own need for additional training in handling critical incidents and believed that the Anesthesia Simulator Consultant exercises would help them. Nonetheless, only 20% of residents completed the prescribed set of 12 simulations on a voluntary basis. The University of Washington program then made these exercises mandatory and required faculty to review the printed logs of each simulated case and provide the residents with appropriate critique (H. Schwid, personal communication). In a study performed by Schwid and colleagues (Schwid is the owner of Anesoft Corporation), use of the Anesthesia Simulator 3.0 with written feedback on the printed case log of the sessions was compared with written instructions alone about the same clinical scenarios in first-year anesthesia residents.[124] After completion of the relevant educational interventions, both groups were evaluated by means of a mainly technical score in scenarios conducted with a mannequin-based simulator. Out of 95 possible points, the group that used the PC simulator scored 52.6 (±9.9), whereas the group given written information scored only 43.4 (±5.9), a statistically significant difference. Following previous work by Cohen and Dacanay,[125] the "effect size" was calculated and showed a large value of 1.54. The results suggest that exercises with PC simulators accompanied by written feedback and faculty review may well be a useful element in a resident training program. Such exercises can perhaps best be used as preparation for or a supplement to training involving full-scale scenarios with a mannequin-based simulator. The cost of PC-based training is low, and the exercises can be performed by residents almost anytime and anywhere.
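The effect size referred to here is a standardized mean difference. The exact variant used by Cohen and Dacanay is not specified in this text; as an illustrative sketch only, one common form divides the difference in group means by a standard deviation, taken either as a pooled value or from the comparison group:

\[
% standardized mean difference; s may be a pooled standard deviation
% or the standard deviation of the comparison group
\mathrm{ES} \;=\; \frac{\bar{x}_{\mathrm{simulator}} - \bar{x}_{\mathrm{written}}}{s}
\]

With the reported scores, the mean difference is 52.6 - 43.4 = 9.2 points. Dividing by the written-information group's standard deviation of 5.9 gives approximately 1.56, consistent with the reported 1.54 (rounding of the published means presumably accounts for the small discrepancy), whereas dividing by a simple pooled standard deviation of about 8.1 would give roughly 1.1.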

Good and coauthors[126] reported on a randomized study of the effect of simulator-based training on novice resident performance. Twenty-six beginning residents matched for previous training and gender were randomized to receive either daily simulator training sessions or daily lectures on 10 predefined learning objectives during weeks 2 and 3 of their residency. Resident performance was assessed both by a written test and by evaluations of clinical ability by supervisors at weeks 1, 3, 8, and 13 of training. No difference was found in performance on the written test. Although a trend toward improvement in the overall raw clinical ability scores was noted for the simulator group at weeks 3 and 8, it was not statistically significant. However, the change in clinical score was significantly greater for the simulator group than for the lecture group at weeks 3 and 8, but not at week 13, by which time the change in clinical score was identical for the two groups; analyzing within-subject change presumably reached significance where the raw scores did not because it removes baseline variability among residents. This study suggests that simulator-based training may lead to somewhat faster improvement in clinical ability, but that by 3 months of anesthesia experience, residents in both groups had improved to the same degree. It was noted that additional work is needed to refine and coordinate the clinical and simulator curricula and also that continuing simulator exercises may have benefits beyond a single "bolus" of training over a period of 2 weeks.

The Leiden group[67] reported that simulator-based training and practice in the management of malignant hyperthermia led to significant improvement in the handling of this condition when participants were tested in the simulator 4 months later. All participants had been given simulator training, but only half received training on malignant hyperthermia. This design partially controlled for effects resulting solely from familiarity with the simulator environment. Gonzalez and Schaefer[127] [128] reported training residents (and others) in the American Society of Anesthesiologists (ASA) difficult airway management algorithm with an Eagle Patient Simulator. O'Brien and coauthors reported an interview study of interns about their experience with real-world cardiac arrest situations after simulation training on cardiac arrest. The results of this qualitative study were very positive with respect to technical skills, nontechnical skills, and self-confidence.[42]

Forrest and colleagues performed a study to test the development of technical performance in novice residents.[69] The study, which is described in more detail earlier (see the section "Evaluation of Performance during Simulation Scenarios"), showed a significant increase in technical score between weeks 1 and 12, but no significant change among weeks 2, 4, and 8. This result could mean that the scoring system is not fine-grained enough to detect subtle changes in performance. In addition, the results showed that residents at week 12 still differed from experienced practitioners.

Schwid and associates and the Anesthesia Research Consortium performed an unparalleled multicenter study to validate mannequin-based simulation for the evaluation of anesthesia residents. Ninety-nine anesthesia residents with different levels of experience from 10 institutions in the United States were videotaped during the management of four scenarios that had previously been used in screen-based simulations.[124] These tapes were evaluated for technical and medical management (but not nontechnical or behavioral skills) by two local evaluators and one evaluator from another institution using both a long and a short evaluation form. Another evaluator rated all tapes for case management errors.

The study demonstrated construct validity, as shown by (1) correlation of performance scores with experience of the residents (from clinical base year through the end of residency), (2) a realistic rating for the simulator scenarios by the participants (3.47 out of 4), (3) moderate criterion-related validity as demonstrated by moderate correlation (0.37 to 0.49) of simulator scores with other evaluations (departmental faculty evaluation, American Board of Anesthesiology written in-training scores, mock oral board scores), and (4) good internal consistency of assessments (Cronbach alpha values of 0.71 to 0.76) and excellent inter-rater reliability (values of 0.94 to 0.96). The authors state that these values are considered excellent and sufficient for high-stakes evaluations.
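For readers unfamiliar with the statistics cited, Cronbach's alpha is the conventional index of internal consistency. The standard formula (a general definition, not a detail taken from the study itself) for a score built from k items is

\[
% k = number of items; \sigma_i^2 = variance of item i;
% \sigma_T^2 = variance of the total score
\alpha \;=\; \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^{2}}{\sigma_T^{2}}\right)
\]

By common rules of thumb, alpha values above 0.7 indicate acceptable internal consistency, and inter-rater reliabilities above 0.9 are regarded as excellent.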

The results were independent of the type of simulator used (METI HPS versus Medsim), although some shortcomings of both simulators were noted that limited the fidelity of certain important cues (breath sounds, electrocardiographic waveforms, and capnograms), a limitation that may affect experienced clinicians more strongly.

Numerous management errors were identified in residents from all institutions and at all levels of experience, similar to what has been found in other simulation studies of personnel.[68] [69]

This study suggests that the use of simulators for evaluation of anesthesia residents may be feasible, although some improvements are still possible. However, it remains unclear whether evaluating only technical performance will be sufficient to fully characterize professional competency because nontechnical skills may be equally important.
