Medical simulation

An NSHQ instructor shows a SOF medic the proper procedure for controlling a mannequin.

Medical simulation, or more broadly healthcare simulation, is a branch of simulation concerned with education and training in medical fields across various industries. Simulations can be held in the classroom, in situational environments, or in spaces built specifically for simulation practice.[1] It can involve simulated human patients (artificial, human, or a combination of the two), educational documents with detailed simulated animations, casualty assessment in homeland security and military situations, emergency response, and support for virtual health functions with holographic simulation. In the past, its main purpose was to train medical professionals to reduce errors during surgery, prescription, crisis interventions, and general practice. Combined with methods in debriefing, it is now also used to train students in anatomy, physiology, and communication during their schooling.

History

Link Trainer.

Modern-day simulation for training was first utilized by anesthesia physicians to reduce accidents.[2] When simulation skyrocketed in popularity during the 1930s due to the invention of the Link Trainer for flight and military applications, many field experts attempted to adapt simulation to their own needs. Medical simulation was not immediately accepted as a useful training technique, both because of technological limitations and because of the limited availability of medical expertise at the time.[2] However, extensive military use demonstrated that medical simulation could be cost-effective. Additionally, valuable simulation hardware and software were developed,[which?] and medical standards were established. Gradually, medical simulation became affordable, although it remained unstandardized.[2][better source needed]

By the 1980s, software simulations became available. With the help of a UCSD School of Medicine student, Computer Gaming World reported that Surgeon (1986) for the Apple Macintosh very accurately simulated operating on an aortic aneurysm.[3] Others followed, such as Life & Death (1988).

In 2004, the Society for Simulation in Healthcare (SSH) was formed to assist in collaboration between associations interested in medical simulation in healthcare.[4]

The need for a "uniform mechanism to educate, evaluate, and certify simulation instructors for the health care profession" was recognized by McGaghie et al. in their critical review of simulation-based medical education research.[5] In 2012 the SSH piloted two new certifications to provide recognition to educators to meet this need.[6]

Modern medical simulation

The American Board of Emergency Medicine uses medical simulation technology to accurately assess students through "patient scenarios" during oral board examinations.[2] However, these forms of simulation are a far cry from the high-fidelity models that have surfaced since the 1990s.[7]

Because computer simulation technology is still relatively new compared with flight and military simulators, much research remains to be done on the best way to approach medical training through simulation, which remains unstandardized despite general acceptance by the medical community. That said, successful strides are being made in medical education and training: a number of studies have shown that students engaged in medical simulation training achieve higher overall scores and retention rates than those trained through traditional means.[2]

The Council of Residency Directors (CORD) has established the following recommendations for simulation:[2]

  1. Simulation is a useful tool for training residents and ascertaining competency. The core competencies most conducive to simulation-based training are patient care, interpersonal skills, and systems-based practice.
  2. It is appropriate for performance assessment, but there is a scarcity of evidence supporting the validity of simulation for use in promotion or certification.
  3. There is a need for standardization and definition in using simulation to evaluate performance.
  4. Scenarios and tools should also be formatted and standardized such that EM educators can use the data and count on it for reproducibility, reliability, and validity.

The Association of Surgeons in Training has produced recommendations for the introduction, availability, and role of simulation in surgical training.[8]

Clinical Skills and Simulations Centers (CSSC) for medical simulation

The two main types of medical institutions that train people through medical simulations are medical schools and teaching hospitals. According to survey results from the Association of American Medical Colleges (AAMC), simulation content taught at American medical schools spans all four years of study, while hospitals utilize simulations during the residency and subspecialty period. Internal medicine, emergency medicine, obstetrics/gynecology, pediatrics, surgery, and anesthesiology are the most common areas taught in medical schools and hospitals.[9]

The AAMC reported that the majority of medical schools and teaching hospitals centralize their simulation activities at a single physical location, while some use decentralized facilities or mobile simulation resources. Most of the medical training institutions own their own facilities.[9] Often, medical school CSSC locations include rooms for debriefs, training exercises, standardized exam and patient rooms, procedure rooms, offices, observation area, control rooms, classrooms, and storage rooms. On average, a medical school dedicates 27 rooms of its CSSC to training with simulations.[9]

Medical simulation centre design and operations

A medical simulation centre is an educational centre in a clinical setting. The key elements in the design of a simulation center are building form, room usage, and technology.[10] For learners to suspend disbelief during simulation scenarios, it is important to create a realistic environment. This may include incorporating aspects of the environment that are not essential to the simulation activities themselves but that play a significant role in patient safety. For instance, many reports show that patient falls and injuries occur in the hospital bathroom, so some simulation rooms are designed with bathroom spaces.[11] A successful simulation center must be within walking distance of the medical professionals who will be using it.[12]

Often, clinical and medical faculty are responsible for the day-to-day operations of simulation centers, typically in addition to other responsibilities. However, the technology that has emerged within medical simulation has become complex and can benefit from dedicated specialists. In 2014, the Society for Simulation in Healthcare introduced the Certified Healthcare Simulation Operations Specialist (CHSOS) certification. The CHSOS certification endeavors to standardize and authenticate the minimum competencies to be demonstrated by simulation center operations specialists.[13]

Debriefing and education in medical simulation

Example of a Medical Simulation

The origins of debriefing can be traced back to the military, where, upon return from a mission or war-game exercise, participants were asked to gather as a group and recount what had happened.[14] These gatherings had the primary intention of developing new strategies to use in future encounters; they also provided a learning opportunity for other members of the team who were not present at the events being debriefed.

In the field of psychology, debriefing is used in the processing of traumatic events. Here, the emphasis is on the narrative; in a facilitator-led environment, participants reconstruct what happened and review facts, share reactions, and develop a shared meaning of the events. The aim is to reduce stress, accelerate normal recovery, and assist in both the cognitive and emotional processing of the experience.[15][16]

In all instances, debriefing is the process by which people who have gone through an experience are intentionally and thoughtfully led through a discussion of that experience.[17][14] Debriefing in simulation is a critical component of learning in simulation and is necessary to facilitate change "on an individual and systematic level".[18][19]: e287  It draws from the above-mentioned forms of debriefing, but the emphasis here is on education. Debriefing in education can be described as a "facilitator-led participant discussion of events, reflection, and assimilation of activities into [participants'] cognitions [which] produce long-lasting learning".[1] More specific descriptions of debriefing can be found, such as the following in relation to debriefing in healthcare simulations, described by Cheng et al. (2014): "...a discussion between two or more individuals in which aspects of a performance are explored and analysed with the aim of gaining insights that impact the quality of future clinical practice".[20]: 658  Or another regarding debriefing in gaming, by Steinwachs (1992), "...a time to reflect on and discover together what happened during game play and what it all means."[21]: 187 

Debriefing in medical simulation

Medical simulation is often defined as, "a technique (not a technology) to replace and amplify real life experiences with guided ones, often "immersive" in nature, that evoke or replicate substantial aspects of the real world in a fully interactive fashion".[22] This definition deliberately defines simulation as a technique and not a technology, implying that simulation is greater than the technology or tools which it adopts. Also note the use of the word guided in the definition, further implying that the interactions which occur in a simulated environment are not left solely to those persons immersed in the simulation, but that a "guide" also be present. This guide may be virtual in nature, such as prompts from a computer program, or may be physically present, in the form of an instructor or teacher. The human guide is often referred to as a "facilitator".[1] It is this facilitator who guides the debriefing which occurs after a simulation scenario has been completed.

When these elements are present, the simulation is often referred to as "Instructional simulation", "Educational simulation," or "Simulation-based learning".[1] In healthcare, comparisons of simulation with debriefing versus simulation with no intervention have shown favorable and statistically significant effects for nearly all knowledge and process skill outcomes.[20] When applied in a capacity to further professional development, simulation and debriefing may be referred to as "Simulation-based training".[23]

Simulation, debriefing, and education theory

Experiential learning, which draws from prominent scholars such as John Dewey, Jean Piaget, and Carl Rogers, among others,[24] underpins simulation-based learning.[1][17][25][26] Often referred to as "learning by doing",[1] or more broadly, a "theory of experience",[27] Experiential Learning Theory states that experience plays a central role in human learning and development.[24] The theory's six principles, listed below, align with educational simulation:

  1. Engaging students in a process that enhances learning. This includes "feedback on the effectiveness of their learning efforts" (p. 194) and a focus on the process, not the outcome.
  2. Students have prior beliefs and ideas. A process which draws these beliefs and ideas out, with the intent of re-examining and re-testing them against a topic in order to accommodate new ideas, will lead to learning.
  3. Learning is a process which cycles between reflection and action, feeling and thinking. "Conflict, differences, and disagreement are what drive the learning process" (p. 194); the resolution of these is what leads to learning.
  4. Learning happens in interactions between the person and the environment which surrounds them.
  5. Learning is holistic and involves more than cognition; it integrates thinking, feeling, perceiving, and behaving.
  6. Learning is grounded in constructivist philosophy; "Learning is the process of creating knowledge".[24]: 194 

Simulation also aligns with Guided Discovery learning. Developed by Jerome Bruner in the 1960s, discovery learning also stems from the work of Jean Piaget and can be described as a learning environment where there is little to no instructor guidance.[28] Guided discovery learning, on the other hand, continues to place learners in a discovery environment, but one where an instructor is available to help guide learning via coaching, feedback, hints, or modeling.[28]

Both Experiential and Discovery Learning are based on constructivist philosophy.[24][28] Broadly, Constructivism is based on the belief that learning is an active process whereby learners make sense of new knowledge by building upon their prior experiences; each person has a unique set of experiences which frame their interpretation of information.[29]

Debriefing frameworks

While many models for debriefing exist, they all follow, at a minimum, a three-phase format.[1][17][25] Debriefing models can be divided into two categories: the "Three-Phase Debriefing Structure," and the "Multiphase Debriefing Structure".[25]

Three-Phase debriefing structure

The three conventional phases of debriefing, which serve as a benchmark for all forms of facilitator-guided, post-event debriefing conversational structures, are description, analysis, and application.[1][30][25] Frameworks which make use of the three-phase debriefing format include Debriefing with Good Judgment,[31] the 3D Model,[26] the GAS model,[32] and the Diamond debrief.[30]

Description
Also labelled as "reaction,"[33][34][31] "defusing,"[26] "gather,"[32] and "identify what happened,"[18] the description phase of debriefing sees simulation participants describing and exploring their reactions, their emotions, and the overall impact of the experience.[1][25] It is the opening phase of systematic reflection, enabled by a facilitator who poses key questions such as:

  • "How did that feel?"
  • "How did that go?"
  • "Can you take us through the scenario as it unfolded?"[32][25]

The facilitator keeps asking these questions of the learners until they feel confident that all participants have voiced their understanding of the situation.[30] The point of the description phase is to identify the impact of the experience, gain insight into what mattered to the participants throughout the simulation, and establish a shared mental model of the events which occurred.[34][25][17] A debate exists in the healthcare simulation community regarding the exploration of feelings in the descriptive phase. One camp believes that the descriptive phase should allow an opportunity for participants to "blow off steam" and release any tension which may have accumulated during the simulation scenario, so that learners can continue the debrief and subsequent reflection without pent-up emotion.[25][34][26] Others believe that this "venting" phase is not necessary; they may state this explicitly in their debriefing models, or simply omit any reference to emotions or feelings at all.[30]

Analysis
The second phase of debriefing is often referred to as "analysis,"[1][17][32][31][25] "description,"[33] or "discovering".[26] This is the phase in which the bulk of the debriefing time is spent, with a focus on participant performance, rationales, and frames.[33][30][34][18] It is meant to be a time of reflective practice on what actually occurred during the scenario and the reasons why events unfolded as they did.[31][25] The analysis phase uncovers the decision-making process behind observed actions.[26] Common questions posed, or statements made, by a facilitator during this phase include:

  • "Tell me about [insert performance/event here, i.e. teamwork] during the scenario."[25]
  • "What went well? Why?"
  • "What made things challenging?"
  • "Why do you think that happened?"

Participant performance is a key component during the analysis phase. However, performance can often be a difficult topic to broach with participants, as criticism or constructive feedback often evokes negative feelings. A framework for questioning named "Advocacy-Inquiry," or the "debriefing with good judgment" approach, aims to reduce negative experiences in medical simulation debriefing.[31]

Advocacy Inquiry. The use of advocacy-inquiry (AI) questioning is highly encouraged by nearly all authors of debriefing models.[33][34][31][26] Advocacy-inquiry consists of pairing "an assertion, observation, or statement" (advocacy) with a question (inquiry) in order to elicit the mental frameworks – or schema – of both the facilitator and the participants.[31]: 53  Phrasing questions this way makes participants aware of the facilitator's own point of view in relation to the question being posed. The use of AI is most encouraged when a facilitator has a judgment about something observed during the simulation scenario. Using AI eliminates the tone of judgment as well as the "guess what I'm thinking" dynamic which can occur when questions are posed without context.

Application
The third and final phase of three-phase debriefing structures is most commonly referred to as "application,"[1][30] or "summary".[33][34][32][31] Participants are asked to move any newly acquired insights or knowledge gained throughout the simulation experience forward to their daily activities or thought processes.[1][30][17][31][25] This includes learning which may have occurred during the previous phases in the debriefing process. Common questions posed, or statements made, by a facilitator during this phase include:

  • "What are you going to do differently in your practice tomorrow?"[30]
  • "What new insights have you gained?"
  • "What one thing will you commit to doing differently after this?"

The summary is not always a restatement of the major points visited throughout the simulation and debrief, but rather an emphasis on the points with the greatest impact on learning. The summary may be done by either the facilitator or the participants; debriefing models differ in their suggestions. In the latter case, the participants summarize what was of most value for them.[33][30][18] A summary by the facilitator consists of re-stating key learning points which arose throughout the debrief.[33][26][34]

Multi-Phase debriefing structure

While all debriefing models include the phases of the three-part debriefing structure, there are several with additional phases. These additions either explicitly call out specific features which may be included in the three-part debriefing model, such as reviewing learning objectives, or provide additional process recommendations, such as immediately re-practicing any skills involved in the original simulation scenario.[18][34] Examples of multi-phase debriefing structures include the Promoting Excellence and Reflective Learning in Simulation (PEARLS) framework,[33] TeamGAINS,[34] and Healthcare Simulation After-Action Review (AAR).[18]

Learning objectives

As with any other educational initiative, learning objectives are of paramount importance in simulation and debriefing. Without learning objectives, simulations and the subsequent debriefs are aimless, disorganized, and often dysfunctional. Most debriefing models explicitly mention stating learning objectives.[33][30][34][18][26]

The exploration of learning objectives ought to answer at least two questions: What competencies – knowledge, skills, or attitudes – are to be learned, and what specifically should be learned about them?[1] The method of debriefing chosen should align with learning objectives through evaluation of three points: performance domain – cognitive, technical, or behavioral; evidence for rationale – yes/no; and estimated length of time to address – short, moderate, or long.[33]

Learning objectives may be predetermined and included in the development of a simulation scenario, or they may be emergent as the scenario unfolds.[33][1] It can be challenging for the novice facilitator to adapt to emergent learning objectives, as the subsequent discussion may be purely exploratory in nature with no defined outcome. Alternatively, the discussion may lead to a specific area of expertise with which neither the facilitator nor the participants are familiar. In such situations, the facilitator and participants must be flexible, move on to the next objective, and follow up on the emergent topic in a later debriefing.

Environment

The debriefing environment consists of two main features: the physical setting and the psychological environment.

Physical setting

When choosing a space in which to debrief, one must consider whether the scenario which unfolded was a complex case. Complex cases usually involve heightened emotions and interdependent processes, and require more time spent debriefing. As such, it is recommended that these types of debriefings occur in a separate room from where the simulation scenario took place. This allows for a release of tension as participants move from one place to another and encounter new surroundings.[1] It is important, however, to remind participants not to begin debriefing during the walk to the new room. The momentum of the simulation leads participants to begin debriefing with one another as soon as the scenario has finished.[21] However, in order to establish a shared mental model with all participants, debriefing must occur in a fashion whereby all participants can hear one another and have a chance to respond. This is difficult to accomplish while walking down a hallway, or in any disorganized fashion.

The location of the debriefing is ideally somewhere comfortable and conducive to conversation and reflection, where chairs can be maneuvered and manipulated.[1][18] It is recommended that, during the debriefing, the facilitator(s) and participants be seated in a circle.[21] This is done so that everyone can see one another, which increases group cohesion. Furthermore, the use of a circle implies equality among the group, and decreases any sense of hierarchy which may be present.

Psychological environment

Establishing psychological safety and a safe learning environment is of utmost importance within both the simulation and the debriefing period.[33][1][31][35][25][26] Because simulation participants often find the experience stressful and intimidating, and worry about judgment from their peers and facilitator(s), safety must be established from the outset of the simulation event.[36] Psychological safety does not necessarily equate to comfort, but rather means that participants "feel safe enough to embrace being uncomfortable...without the burden of feeling that they will be shamed, humiliated, or belittled".[35]: 340

It is recommended that establishing safety begin in the pre-brief phase[35] by alerting participants to the "basic assumption." The basic assumption, derived from the Center for Medical Simulation at Harvard University (n.d.), is an agreed-upon, predetermined mental model whereby everyone involved in the simulation and debrief believes that all participants are intelligent, well-trained, want to do their best, and are participating in order to learn and promote development.[37][25] Additionally, Rudolph et al. (2014) have identified four principles to guide the formulation of a psychologically safe environment:

  1. Communicate clear expectations
  2. Establish a "fiction contract"
  3. Attend to logistic details
  4. Declare and enact a commitment to respecting learners and concern for their psychological safety[35]

Included in these principles is the notion of confidentiality. Explicitly reminding participants that their individual performance and debriefing reflections are not meant to be shared outside of the simulation event can help foster participation. Confidentiality builds trust by increasing transparency and allowing participants to practice without fear.[1][35]

Evidence and further study

There is a paucity of quantitative data regarding the effectiveness of debriefing in medical simulation,[1][20][38] despite Lederman's seminal 1992 Model for the Systematic Assessment of Debriefing.[17] Nearly every article reviewed called for objective studies of debriefing effectiveness, whether comparing the myriad options for conversational structures,[25] debriefing models,[34] or the comprehensive "5 W's": who (debriefer), what (content and methods), when (timing), where (environment), and why (theory).[38]

Currently, existing studies have critical limitations in how they are presented, research on important debriefing topics is sparse, and debriefing characteristics are incompletely reported.[38][20] Recommendations for future debriefing studies include examining:

  • Duration of debriefing
  • Educator presence
  • Educator characteristics
  • Content of debriefing
  • Structure and method of debriefing
  • Timing of debriefing[20]

or:

  • Who: debriefer number and characteristics
  • What: the purpose of the debrief, formative vs summative assessment, individual vs team debriefing, method of debriefing, content covered, mechanics, etc.
  • When: duration, post-event vs during-event vs delayed, etc.
  • Where: in-situ, separate room, hospital, learning centre, etc.
  • Why: theoretical underpinning of the debriefing model chosen and rationale
  • PICO: population, intervention, comparator, outcome[38]

Current research has found that simulation training with debriefing, when compared with no intervention, had favorable, statistically significant effects for nearly all outcomes: knowledge, process skill, time skills, product skills, behavior process, behavior time, and patient effects. When compared with other forms of instruction, simulation and debriefing showed small favorable effects for knowledge, time and process outcomes, and moderate effects for satisfaction.[20]

Types of simulations used in medical schools and teaching hospitals

There are many different types of simulations used for training purposes. Some of the best known are mannequins (referred to by the simulation company METI as Human Patient Simulators, or HPS for short) and standardized patients.

According to the AAMC survey data on the types of simulation used in medical education, medical schools lead in the use of standardized patients, while medical schools and teaching hospitals use full-scale mannequins and partial task trainers at similar rates.

Medical simulation efficiency in education

According to a study conducted by Bjorn Hoffman on the efficiency of simulation-based medical training in a high-technology health care setting, "simulation's ability to address skillful device handling as well as purposive aspects of technology provides a potential for effective and efficient learning."[39] Further support is found in the article "The role of medical simulation: an overview" by Kevin Kunkler, who states that "medical simulators can be useful tools in determining a physician's understanding and use of best practices, management of patient complications, appropriate use of instruments and tools, and overall competence in performing procedures."[40]

Training

The main purpose of medical simulation is to properly educate students in various fields through the use of high-technology simulators. According to the Institute of Medicine, an estimated 44,000 to 98,000 deaths occur annually due primarily to medical mistakes during treatment.[41] Other statistics include:

  • 225,000 deaths annually from medical error including 106,000 deaths due to "non-error adverse events of medications"[42]
  • 7,391 deaths resulted from medication errors

If 44,000 to 98,000 deaths are the direct result of medical mistakes, and the CDC reported that roughly 2.4 million people died in the United States in 1999, then the medical mistakes estimate represents approximately 1.8% to 4.1% of all deaths, respectively.[43]
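
As a rough arithmetic check of these figures (a sketch assuming the rounded CDC total of about 2.4 million deaths in 1999), the proportions follow directly:

\[
\frac{44{,}000}{2{,}400{,}000} \approx 1.8\% \qquad\text{and}\qquad \frac{98{,}000}{2{,}400{,}000} \approx 4.1\%
\]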

A figure approaching 5% of deaths related primarily to medical mistakes is widely regarded as unacceptable in the world of medicine. Approaches that can help bring this number down are therefore sought, and medical simulation has been promoted as a key tool for doing so.

The use of high-fidelity simulation for health professional education is strongly recommended by the WHO because it leads to greater acquisition, retention, and transfer of technical and non-technical skills.[44] In addition to reducing error, simulation is commonly used in medical and nursing education to prepare health professionals to perform sensitive examinations, such as breast or pelvic exams, or to assist with breastfeeding.[45][46]

References

  1. ^ a b c d e f g h i j k l m n o p q r s Fanning, Ruth M.; Gaba, David M. (2007). "The Role of Debriefing in Simulation-Based Learning". Simulation in Healthcare. 2 (2): 115–125. doi:10.1097/SIH.0b013e3180315539. PMID 19088616. S2CID 18613707.
  2. ^ a b c d e f Chakravarthy, Bharath. Academic Resident. Medical Simulation in EM Training and Beyond
  3. ^ Boosman, Frank (November 1986). "Macintosh Windows". Computer Gaming World. p. 42. Retrieved 1 November 2013.
  4. ^ Richard H. Riley (2008). Chapter 38: Society for Simulation in Healthcare by Raemer, Dan IN: Manual of Simulation in Healthcare. Oxford University Press. pp. 532–. ISBN 978-0-19-920585-1.
  5. ^ McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ (2010). "A critical review of simulation-based medical education research: 2003–2009". Medical Education. 44 (1): 50–63. doi:10.1111/j.1365-2923.2009.03547.x. PMID 20078756. S2CID 228055.
  6. ^ Struijk, Jennie (2013-04-11). "Certified Healthcare Simulation Educator (CHSE) – an update for ASPE". Association of Standardized Patient Educators News. Retrieved 2015-12-27.
  7. ^ Ahmed K, Jawad M, Abboudi M, Gavazzi A, Darzi A, Athanasiou T, Vale J, Khan MS, Dasgupta P (2011). "Effectiveness of Procedural Simulation in Urology: A Systematic Review". J. Urol. 186 (1): 26–34. doi:10.1016/j.juro.2011.02.2684. PMID 21571338.{{cite journal}}: CS1 maint: multiple names: authors list (link)
  8. ^ Milburn, J.A.; Khera, G.; Hornby, S.T.; Malone, P.S.C.; Fitzgerald, J.E.F. (2012). "Introduction, availability and role of simulation in surgical education and training: Review of current evidence and recommendations from the Association of Surgeons in Training". International Journal of Surgery. 10 (8): 393–398. doi:10.1016/j.ijsu.2012.05.005. PMID 22609475.
  9. ^ a b c Passiment, Morgan; Sacks, Heather; Huang, Grace (September 2011). "Medical Simulation in Medical Education: Results of an AAMC Survey". Association of American Medical Colleges.
  10. ^ H. Riley, Richard (2015-10-29). The Manual of Healthcare Simulation. Oxford University Press. p. 16. ISBN 978-0-19-871762-1. Retrieved 13 May 2019.
  11. ^ Dodson, Adam; Chi Stone, Vivian. "Planning a simulation center". Health Facilities Management Magazine. Retrieved 13 May 2019.
  12. ^ Eagle, Amy. "Principles for efficient simulation center layouts". Health Facilities Management Magazine. Retrieved 13 May 2019.
  13. ^ T. Gantt, Laura; Young, H. Michael (2015-12-14). Healthcare Simulation: A Guide for Operations Specialists. John Wiley & Sons. pp. 164, 5. ISBN 978-1-118-94941-2.
  14. ^ a b Pearson, Smith D (1986). "Debriefing in experience-based learning". Simulation/Games for Learning. 16: 155–172.
  15. ^ Mitchell, J.T. & Everly, G.S. (1993). Critical incident stress debriefing: An operations manual for the prevention of traumatic stress among emergency services and disaster workers. Ellicott City, MD: Chevron Publishing.
  16. ^ Dyregrov A (1989). "Caring for helpers in disaster situations: Psychological debriefing". Disaster Management. 2: 25–30.
  17. ^ a b c d e f g Lederman L. C. (1992). "Debriefing: Toward a systematic assessment of theory and practice". Simulation & Gaming. 23 (2): 145–160. doi:10.1177/1046878192232003. S2CID 145781387.
  18. ^ a b c d e f g h Sawyer T. L., Deering S. (2013). "Adaptation of the US Army's after-action review for simulation debriefing in healthcare". Simulation in Healthcare. 8 (6): 388–397. doi:10.1097/sih.0b013e31829ac85c. PMID 24096913. S2CID 35341227.
  19. ^ Dieckmann, Peter; Molin Friis, Susanne; Lippert, Anne; Østergaard, Doris (2009). "The art and science of debriefing in simulation: Ideal and practice". Medical Teacher. 31 (7): e287–e294. doi:10.1080/01421590902866218. PMID 19811136. S2CID 33283560.
  20. ^ a b c d e f Cheng, Adam; Eppich, Walter; Grant, Vincent; Sherbino, Jonathan; Zendejas, Benjamin; Cook, David A. (2014). "Debriefing for technology-enhanced simulation: A systematic review and meta-analysis". Medical Education. 48 (7): 657–666. doi:10.1111/medu.12432. PMID 24909527. S2CID 25555524.
  21. ^ a b c Steinwachs B (1992). "How to facilitate a debriefing". Simulation & Gaming. 23 (2): 186–195. doi:10.1177/1046878192232006. S2CID 60804084.
  22. ^ Lateef F (2010). "Simulation-based learning: Just like the real thing". Journal of Emergencies, Trauma, and Shock. 3 (4): 348–352. doi:10.4103/0974-2700.70743. PMC 2966567. PMID 21063557.
  23. ^ Ziv A., Wolpe P.R., Small S.D., Glick S. (2003). "Simulation-Based Medical Education: An Ethical Imperative". Academic Medicine. 78 (8): 783–787. doi:10.1097/00001888-200308000-00006. PMID 12915366. S2CID 15579985.{{cite journal}}: CS1 maint: multiple names: authors list (link)
  24. ^ a b c d Kolb A.Y., Kolb D.A. (2005). "Learning Styles and Learning Spaces: Enhancing Experiential Learning in Higher Education". Academy of Management Learning & Education. 4 (2): 193–212. doi:10.5465/amle.2005.17268566.
  25. ^ a b c d e f g h i j k l m n o Sawyer, Taylor; Eppich, Walter; Brett-Fleegler, Marisa; Grant, Vincent; Cheng, Adam (2016). "More Than One Way to Debrief". Simulation in Healthcare. 11 (3): 209–217. doi:10.1097/SIH.0000000000000148. PMID 27254527. S2CID 28994438.
  26. ^ a b c d e f g h i j Zigmont, J. J., Kappus, L. J., & Sudikoff, S. N. (2011, April). The 3D model of debriefing: defusing, discovering, and deepening. Seminars in perinatology, 35(2), 52–58. WB Saunders.
  27. ^ Dewey, J. (1938). Education and experience. New York: Simon & Schuster.
  28. ^ a b c Mayer R. E. (2004). "Should there be a three-strikes rule against pure discovery learning?". American Psychologist. 59 (1): 14–19. CiteSeerX 10.1.1.372.2476. doi:10.1037/0003-066x.59.1.14. PMID 14736316.
  29. ^ Fosnot, C. T., & Perry, R. S. (1996). Constructivism: A psychological theory of learning. Constructivism: Theory, perspectives, and practice, 2, 8–33. New York & London: Teachers College Press, Columbia University.
  30. ^ a b c d e f g h i j Jaye, Peter; Thomas, Libby; Reedy, Gabriel (2015). "'The Diamond': A structure for simulation debrief". The Clinical Teacher. 12 (3): 171–175. doi:10.1111/tct.12300. PMC 4497353. PMID 26009951.
  31. ^ a b c d e f g h i j Rudolph, J., Simon, R., Dufresne, R. & Raemer, D. (2006). There's no such thing as "nonjudgmental" debriefing: a theory and method for debriefing with good judgment. Simulation in Healthcare, 1(1), 49–55.
  32. ^ a b c d e Phrampus, P. E., & O'Donnell, J. M. (2013). Debriefing using a structured and supported approach. The comprehensive textbook of healthcare simulation, 73–84. Springer New York.
  33. ^ a b c d e f g h i j k l Eppich, Walter; Cheng, Adam (2015). "Promoting Excellence and Reflective Learning in Simulation (PEARLS)". Simulation in Healthcare. 10 (2): 106–115. doi:10.1097/SIH.0000000000000072. PMID 25710312. S2CID 11105878.
  34. ^ a b c d e f g h i j k Kolbe, Michaela; Weiss, Mona; Grote, Gudela; Knauth, Axel; Dambach, Micha; Spahn, Donat R.; Grande, Bastian (2013). "TeamGAINS: A tool for structured debriefings for simulation-based team trainings". BMJ Quality & Safety. 22 (7): 541–553. doi:10.1136/bmjqs-2012-000917. PMID 23525093. S2CID 23546356.
  35. ^ a b c d e Rudolph J.W., Raemer D.B., Simon R. (2014). "Establishing a Safe Container for Learning in Simulation: The Role of Presimulation Briefing". Simulation in Healthcare. 9 (6): 339–349. doi:10.1097/sih.0000000000000047. PMID 25188485. S2CID 34486136.{{cite journal}}: CS1 maint: multiple names: authors list (link)
  36. ^ Savoldelli, Georges L.; Naik, Viren N.; Hamstra, Stanley J.; Morgan, Pamela J. (2005). "Barriers to use of simulation-based education". Canadian Journal of Anesthesia. 52 (9): 944–950. doi:10.1007/BF03022056. PMID 16251560.
  37. ^ Rudolph, Jenny W.; Simon, Robert; Raemer, Daniel B.; Eppich, Walter J. (2008). "Debriefing as Formative Assessment: Closing Performance Gaps in Medical Education". Academic Emergency Medicine. 15 (11): 1010–1016. doi:10.1111/j.1553-2712.2008.00248.x. PMID 18945231.
  38. ^ a b c d Raemer, Daniel; Anderson, Mindi; Cheng, Adam; Fanning, Ruth; Nadkarni, Vinay; Savoldelli, Georges (2011). "Research Regarding Debriefing as Part of the Learning Process". Simulation in Healthcare. 6 (7): S52–S57. doi:10.1097/SIH.0b013e31822724d0. PMID 21817862. S2CID 25795265.
  39. ^ "Why simulation can be efficient: on the preconditions of efficient learning in complex technology based practices" (PDF). BioMedCentral. Retrieved 28 July 2014.
  40. ^ Kunkler, Kevin (2006). "The role of medical simulation: an overview". The International Journal of Medical Robotics and Computer Assisted Surgery. 2 (3): 203–210. doi:10.1002/rcs.101. PMID 17520633.
  41. ^ Institute of Medicine (US) Committee on Quality of Health Care in America; Kohn, L. T.; Corrigan, J. M.; Donaldson, M. S. (2000). To Err Is Human: Building a Safer Health System. Institute of Medicine (IOM). doi:10.17226/9728. ISBN 978-0-309-26174-6. PMID 25077248.
  42. ^ Starfield, MD, MPH, Barbara (July 26, 2000). "Is US Health Really the Best in the World?". jama.ama-assn.org.{{cite web}}: CS1 maint: multiple names: authors list (link)
  43. ^ "How Common Are Medical Mistakes". wrongdiagnosis.com. 2008. Retrieved November 30, 2008.
  44. ^ World Health Organization (2013). Transforming and scaling up health professionals' education and training: World Health Organization Guidelines 2013. World Health Organization. hdl:10665/93635. ISBN 9789241506502.
  45. ^ Dilaveri, CA; Szostek, JH; Wang, AT; Cook, DA (September 2013). "Simulation training for breast and pelvic physical examination: a systematic review and meta-analysis". BJOG: An International Journal of Obstetrics & Gynaecology. 120 (10): 1171–1182. doi:10.1111/1471-0528.12289. PMID 23750657.
  46. ^ Sadovnikova, Anna; Chuisano, Samantha A.; Ma, Kaoer; Grabowski, Aria; Stanley, Kate P.; Mitchell, Katrina B.; Eglash, Anne; Plott, Jeffrey S.; Zielinski, Ruth E.; Anderson, Olivia S. (17 February 2020). "Development and evaluation of a high-fidelity lactation simulation model for health professional breastfeeding education". International Breastfeeding Journal. 15 (1): 8. doi:10.1186/s13006-020-0254-5. PMC 7026968. PMID 32066477.