Published in Vol 8, No 3 (2020): Jul-Sep

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/18633.
Effect of Computer Debriefing on Acquisition and Retention of Learning After Screen-Based Simulation of Neonatal Resuscitation: Randomized Controlled Trial

Original Paper

1Ilumens Platform of Simulation in Healthcare, Université de Paris, Paris, France

2Department of Anesthesia and Intensive Care, American Memorial Hospital, Centre Hospitalier Universitaire de Reims, Reims, France

3Emergency Department, Lariboisière University Hospital, Paris, France

4Psychiatry Department, Monsouris Mutualiste Institute, Paris, France

5Laboratoire Adaptation Travail Individu, Université de Paris, Boulogne-Billancourt, France

6Department of Anesthesia and Intensive Care, Georges Pompidou European Hospital, Paris, France

Corresponding Author:

Daphne Michelet, MD

Ilumens Platform of Simulation in Healthcare

Université de Paris

45 rue des Saints Pères

Paris, 75006

France

Phone: 33 326832537

Email: daphnemichelet@gmail.com


Background: Debriefing is key in a simulation learning process.

Objective: This study focuses on the impact of computer debriefing on learning acquisition and retention after a screen-based simulation training on neonatal resuscitation designed for midwifery students.

Methods: Midwifery students participated in 2 screen-based simulation sessions (session 1 and session 2), separated by 2 months. They were randomized into 2 groups. Participants of the debriefing group underwent a computer debriefing focusing on technical and nontechnical skills at the end of each scenario, while the control group received no debriefing. In session 1, students participated in 2 scenarios of screen-based simulation on neonatal resuscitation. During session 2, the students participated in a third scenario. The 3 scenarios had an increasing level of difficulty, with the first representing the baseline level. Assessments included a knowledge questionnaire on neonatal resuscitation, a self-efficacy rating, and expert evaluation of technical skills as per the Neonatal Resuscitation Performance Evaluation (NRPE) score and of nontechnical skills as per the Anaesthetists’ Non-Technical Skills (ANTS) system. We compared the results of the 2 groups using the Mann-Whitney U test.

Results: A total of 28 midwifery students participated in the study. The participants from the debriefing group reached higher ANTS scores than those from the control group during session 1 (13.25 vs 9; U=47.5; P=.02). Their scores remained higher, without statistical difference during session 2 (10 vs 7.75; P=.08). The debriefing group had higher self-efficacy ratings at session 2 (3 vs 2; U=52; P=.02). When comparing the knowledge questionnaires, the significant baseline difference (13 for debriefing group vs 14.5 for control group, P=.05) disappeared at the end of session 1 and in session 2. No difference was found for the assessment of technical skills between the groups or between sessions.

Conclusions: Computer debriefing seems to improve nontechnical skills, self-efficacy, and knowledge when compared to the absence of debriefing during a screen-based simulation. This study confirms the importance of debriefing after screen-based simulation.

Trial Registration: ClinicalTrials.gov NCT03844009; https://clinicaltrials.gov/ct2/show/NCT03844009

JMIR Serious Games 2020;8(3):e18633

doi:10.2196/18633

Introduction

Neonatal resuscitation requires training. Almost 10% of newborns and 80% of preterm newborns weighing less than 1500 g will undergo resuscitation at birth, and the quality of care provided during the first minute of life is directly linked to the outcome [1-3]. Theoretical knowledge from current guidelines [4] is essential to ensure optimal neonatal resuscitation. Several technical skills, such as bag-mask ventilation, endotracheal intubation, or umbilical catheter placement, and nontechnical skills, such as situation awareness, decision making, communication, and teamwork [4,5] are also required to ensure safety and efficacy.

Since 2011, the Neonatal Resuscitation Program (NRP), developed by the American Academy of Pediatrics, has included simulation-based training. The implementation of NRP led to a decrease in neonatal and perinatal mortality [6]. Simulation training increases the trainees’ self-confidence [7], knowledge [2], and technical skills [8] and improves team behavior [9]. Simulation training has many advantages, such as the possibility to practice procedures without any risk for the patient and for trainees to commit errors and learn from those errors through the repetition of different scenarios [9].

In recent years, screen-based simulation has become increasingly prevalent. It offers many advantages: it is more affordable than high-fidelity simulation [10], transportable, and autonomous (no instructor is needed). Screen-based simulation appears to be a valid tool in simulation-based education for health professionals, ensuring the same learning efficacy as traditional learning methods [11,12]. Indeed, recent developments in computer science have allowed the creation of more realistic medical simulators to improve knowledge and to acquire nontechnical skills, know-how, and technical gestures [12,13]. A screen-based simulator (NRP eSim), designed by Laerdal Medical in collaboration with the American Academy of Pediatrics, is even included as 1 of the 6 educational components of the NRP 7th edition curriculum [14].

Debriefing is inseparable from simulation. It has been shown to improve professional practice and clinical skills [15-18]. Debriefing is a discussion between 2 or more individuals during which aspects of a performance are explored and analyzed, with the aim of gaining insights that impact the quality of future clinical practice [19]. Various efficient debriefing methods exist: postsimulation debriefing, in-simulation debriefing, verbal instructor debriefing, video-assisted instructor debriefing, self-debriefing, and multimedia debriefing (a computer text presentation with audio voice-over and videos) [15,17]. For example, Boet et al [20] showed that self-debriefing (formative self-assessment aiming to provide feedback, allowing students to reflect on their performance and subsequently improve their skills) was as effective as traditional debriefing by an instructor. As part of simulation-based education, screen-based simulation must provide debriefing. These simulators “can easily include tools and modules of various kinds to collect data transparently during play. The data can then be processed to provide material for feedback during play, as in-game debriefing, and also as part of the end-of-game debriefing” [21]. For example, after the NRP eSim training, students received automated feedback for self-reflection, highlighting the well-performed procedures, the procedures that needed improvement, and the missed procedures. This feedback is what we refer to as “computer debriefing,” often delivered after a screen-based simulation in order to remain in a virtual environment with no need for an instructor [22]. However, few evaluations of the impact of computer debriefing on acquisition and retention of learning exist.

Retention of learning has been studied extensively after various types of simulation training in health sciences [23] and in neonatal resuscitation [2,24,25]. However, the mean retention time of learning after simulation training and the optimal time interval between two training sessions remain debated [26,27]. The role of debriefing in retention of learning has already been highlighted in some high-fidelity simulation studies [15,17], but it has not been studied in the context of screen-based simulation.

The objective of this study is to evaluate the impact of a computer debriefing after a screen-based simulation session compared to no debriefing in a virtual environment with no instructor. Our endpoints are acquisition of knowledge and skills and their retention after 2 months. We hypothesized that the debriefing group would yield better scores in different evaluations (knowledge, technical skills, nontechnical skills, and self-efficacy) as compared to the control group.


Methods

This randomized controlled simulation study was performed from November 2018 to January 2019 at L’école de Sages-Femmes de Baudelocque, a midwifery school of the Université de Paris. It was approved by the CERAR (Comité Ethique sur la Recherche en Anesthésie Réanimation) (IRB 00010254-2017-008). All students signed an informed written consent. The study was registered at ClinicalTrials.gov (NCT03844009).

Participants

Volunteer participants were recruited from among fourth-year students of L’école de Sages-Femmes de Baudelocque in Paris. They all followed the same curriculum on neonatal resuscitation, corresponding to only 1 academic course. No sample size calculation was performed for this research; a convenience sample was used. We included all 28 volunteers of the fourth-year class of 35 students.

Screen-Based Simulation

The screen-based simulation—Périnatsims—was designed by Medusims. It features the 3D virtual environment of a delivery room, with a newborn installed on a neonatal resuscitation table (Figure 1). The simulation uses a point-and-click interface with a first-person point of view. In this digital simulator, learners can play the role of a midwife, an anesthetist, or a pediatrician, although all the participants of this study were midwifery students. Throughout the scenario, the learner can call a pediatrician for help. Many scenarios with different difficulty levels are available (eg, preterm birth, emergency cesarean under general anesthesia, and abruptio placentae).

Figure 1. Participant during a scenario on the left and screenshot of the interface and virtual environment of Périnatsims screen-based simulation on the right.

Design

Each participant performed individually on a laptop during 2 screen-based simulation sessions: session 1 in November 2018 and session 2 in January 2019 (Figure 1). The study design is summarized in Figure 2.

Figure 2. Experiment design.

Session 1 started with a knowledge questionnaire (Q1, baseline), followed by the briefing, a 15-minute tutorial explaining the different possible actions in the screen-based simulation. Each participant performed the first scenario (low difficulty level), which was considered the baseline level for knowledge and skills, followed by a second scenario (medium difficulty level). At the end of session 1, each participant again filled out the knowledge questionnaire (Q2) and a demographic survey, including a self-efficacy question.

Session 2 was conducted after 2 months. The simulation started with the same knowledge questionnaire (Q3) and tutorial, followed by a third scenario (high difficulty level). At the end of session 2, a last knowledge questionnaire (Q4) and the self-efficacy question were administered.

All participants performed the same 3 scenarios in the same order of increasing difficulty. The potential exposure of each participant to a real case (or training) of neonatal resuscitation during the 2-month interval was controlled and monitored.

Participants were randomized into 2 groups: a debriefing group and a control group. At the end of each scenario, participants from the debriefing group accessed a computer debriefing on technical and nontechnical skills. The technical skills assessment, based on the recommendations of the International Liaison Committee on Resuscitation (ILCOR), was presented with a color code: green (well-performed action), orange (partially performed action), and red (absent or wrong action) (Figure 3).

Figure 3. Computer debriefing of technical skills.

Debriefing of the nontechnical skills was a self-debriefing. Each nontechnical skill involved in the neonatal resuscitation [3-5,28] was explained in one sentence, and then the learner self-rated their proficiency on a scale of 1 to 5, as shown in Figure 4. In this example, the nontechnical skill of situational awareness is explained as, “Medical staff have to stay alert and focus on the resuscitation. Distractions must be avoided.” The following question is, “Do you think you had this behavior?”

Figure 4. Self-debriefing of nontechnical skills.

Participants from the control group had no debriefing until the end of the second session and the completion of every questionnaire. The sessions were recorded using a camera with participants’ written consent.

Outcomes

Comparison of Knowledge Acquisition and Retention

Knowledge was assessed using a validated questionnaire (25 single- or multiple-choice questions) based on the ILCOR recommendations [29], administered as Q1 at the beginning of session 1, Q2 at the end of session 1, Q3 at the beginning of session 2, and Q4 at the end of session 2.

Comparison of Skills Acquisition and Retention

Two independent blinded raters (an anesthetist and a human factors expert specialized in health sciences) evaluated the technical and nontechnical skills retrospectively by analyzing the video recordings.

Technical skills were assessed with the Neonatal Resuscitation Performance Evaluation (NRPE) scoring system [30], with 20 points for each scenario (eg, checked the material, dried the newborn, and initiated mask ventilation). Nontechnical skills were assessed with the Anaesthetists’ Non-Technical Skills (ANTS) observation system [31], which includes four categories: situation awareness, task management, team work, and decision making (eg, prioritizing, coordinating activities with team members, gathering information, and selecting options). ANTS is a validated tool used to assess nontechnical skills in various situations, ranging from emergencies for medical students [32] to neonatal resuscitation for midwives (with a specific modified ANTS version) [33]. ANTS scores were recorded as overall category scores on a scale of 1-4 (from poor to good performance) and as the 16-point global score defined by the ANTS system. Interrater reliability calculations were performed for both evaluations and showed good agreement between the two raters (κ=0.66; P=.01).

Comparison of Self-Efficacy Evaluation

The self-efficacy question assessed the midwives’ perception of their own performance (“How much are you confident in your capability to organize and execute a neonatal resuscitation?”) on a 6-point Likert scale ranging from “not at all confident” (scored as 0) to “very confident” (scored as 5) [34]. It was administered at the end of each session.

Statistical Analysis

Continuous data are presented as median (IQR) given the small sample size. Agreement between raters for the ANTS and NRPE scores was evaluated using percent agreement and the corresponding Cohen kappa coefficient (interrater agreement). Comparisons between groups were performed using the Mann-Whitney U test for independent samples. All tests were two-tailed, and statistical significance was set at P<.05. Statistical analyses were performed using SPSS 25.0 software (IBM Corp).
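The two statistics used throughout this study, the Mann-Whitney U test for between-group comparisons and the Cohen kappa coefficient for interrater agreement, can be computed directly. The analyses here were run in SPSS; the sketch below is an illustrative plain-Python implementation only, using made-up ratings rather than the study's raw data:

```python
def mann_whitney_u(a, b):
    """Mann-Whitney U statistic (the smaller of U1 and U2).

    Ranks are averaged over ties, as in standard rank-sum procedures.
    """
    pooled = sorted(list(a) + list(b))
    # Map each value to its average 1-based rank in the pooled sample.
    ranks, i = {}, 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        i = j
    r1 = sum(ranks[x] for x in a)           # rank sum of group a
    u1 = r1 - len(a) * (len(a) + 1) / 2
    u2 = len(a) * len(b) - u1
    return min(u1, u2)


def cohen_kappa(rater1, rater2):
    """Cohen kappa for two raters' categorical ratings (assumes pe < 1)."""
    n = len(rater1)
    p_observed = sum(x == y for x, y in zip(rater1, rater2)) / n
    categories = set(rater1) | set(rater2)
    p_expected = sum((list(rater1).count(c) / n) * (list(rater2).count(c) / n)
                     for c in categories)
    return (p_observed - p_expected) / (1 - p_expected)


# Hypothetical ANTS global scores for two groups of 4 students each.
debriefing = [13.25, 12.0, 14.0, 11.5]
control = [9.0, 7.5, 10.0, 8.25]
print(mann_whitney_u(debriefing, control))  # 0.0: the groups do not overlap

# Hypothetical category ratings (1-4) from two raters over 6 videos.
rater_a = [3, 2, 4, 3, 1, 2]
rater_b = [3, 2, 4, 2, 1, 2]
print(cohen_kappa(rater_a, rater_b))
```

In practice, library routines such as scipy.stats.mannwhitneyu and sklearn.metrics.cohen_kappa_score provide the same statistics together with the P values, which this sketch deliberately omits.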


Results

The study included 28 participants; 14 were randomly assigned to the control group and 14 to the debriefing group. The participants were fourth-year students of the 5-year midwifery curriculum in France. A majority (27/28) were women. The median (IQR) age was 22 (21-22) years. Five participants had previously followed high-fidelity simulation training and one had followed screen-based simulation training, but none had followed any training on neonatal resuscitation. No participant had witnessed or participated in a real neonatal resuscitation, or received any training, in the 2 months prior to the study.

Comparison of Knowledge Acquisition and Retention

At baseline, the control group (median 14.5; IQR 12.5-16) had better results than the debriefing group (median 12.5; IQR 11-13.75) (P=.05). This difference disappeared over time: there was no difference between the groups at the end of session 1 or during session 2. Results are presented in Table 1.

Table 1. Comparison of the knowledge questionnaires of the control and debriefing groups.

| Questionnaire | Group | Median (IQR) | U | P value |
|---|---|---|---|---|
| Q1 (baseline) | Debriefing | 12.5 (11-13.75) | 56 | .05 |
| | Control | 14.5 (12.5-16) | | |
| Q2 (end of session 1) | Debriefing | 13 (12.25-14) | 66 | .15 |
| | Control | 14 (12.25-16) | | |
| Q3 (session 2) | Debriefing | 14.5 (13-16) | 82 | .47 |
| | Control | 14 (13.25-15) | | |
| Q4 (end of session 2) | Debriefing | 14 (13.25-14.75) | 83 | .48 |
| | Control | 14 (12.5-15.75) | | |

Comparison of Skills Acquisition and Retention

Technical Skills Assessment Through the NRPE

No significant difference in the NRPE scores was observed during the experiment (Table 2).

Table 2. Comparison of the nontechnical skills, technical skills, and self-efficacy evaluation between the debriefing and control groups.

| Measure, median (IQR) | Session | Debriefing | Control | U | P value |
|---|---|---|---|---|---|
| ANTSa score (total=16 points) | Baseline | 8 (6.1-9.8) | 6.75 (5.6-7.9) | 78.5 | .38 |
| | Session 1 | 13.25 (11.1-14.4) | 9 (6.6-11.4) | 47.5 | .02 |
| | Session 2 | 10 (9.3-13.9) | 7.75 (6.5-12) | 60.5 | .08 |
| Task management (total=4 points) | Baseline | 2 (1.37-2.5) | 1.5 (1-2) | 70 | .18 |
| | Session 1 | 3 (2.5-3.62) | 2 (1.5-3) | 43 | .10 |
| | Session 2 | 2 (2-3.62) | 2 (1-3) | 78 | .34 |
| Team work (total=4 points) | Baseline | 2 (1.37-3) | 2 (1-2.5) | 80 | .39 |
| | Session 1 | 3.75 (2.87-4) | 2.25 (1.87-3.62) | 54 | .04 |
| | Session 2 | 4 (3-4) | 3.25 (1.87-4) | 65.5 | .11 |
| Situation awareness (total=4 points) | Baseline | 2 (1.5-2.5) | 2 (1.37-2.5) | 87 | .60 |
| | Session 1 | 3.25 (2.87-3.62) | 2.25 (1.5-3.62) | 60 | .08 |
| | Session 2 | 2.25 (1.87-3.62) | 1.5 (1-3) | 65.5 | .13 |
| Decision making (total=4 points) | Baseline | 1.5 (1-2.5) | 1.5 (1-2.5) | 94 | .85 |
| | Session 1 | 3 (2.37-3.5) | 2.25 (1.37-3.5) | 61 | .08 |
| | Session 2 | 2 (1.87-3.5) | 2 (1-3) | 70 | .19 |
| NRPEb score (total=20 points) | Baseline | 10 (7.3-12.3) | 10 (7.3-12.3) | 94.5 | .87 |
| | Session 1 | 9.2 (7.7-13.5) | 10 (9.2-13.1) | 87.5 | .62 |
| | Session 2 | 10.5 (8-12.8) | 10.5 (7.5-12) | 97 | .96 |
| Self-efficacy (total=5 points) | Baseline | N/A | N/A | N/A | N/A |
| | Session 1 | 2 (1-2) | 2 (1-2) | 92 | .76 |
| | Session 2 | 3 (2-3) | 2 (1-2) | 52 | .02 |

aANTS: Anaesthetists’ Non-Technical Skills.

bNRPE: Neonatal Resuscitation Performance Evaluation.

Nontechnical Skills Assessment Through the ANTS

ANTS scores were significantly higher in the debriefing group during session 1 (U=47.5; P=.02) and remained higher, without reaching significance, during session 2 (U=60.5; P=.08), while no difference was found in the baseline evaluation (scenario 1). The results, including the subcategory analysis, are presented in Table 2.

Comparison of Self-Efficacy Evaluation

A significant difference was found between the groups for session 2, with an improved self-efficacy score for the debriefing group (Table 2).


Discussion

Major Findings

This study highlights the benefit of a computer debriefing compared to no debriefing on nontechnical skills acquisition, self-efficacy, and knowledge after a screen-based simulation of neonatal resuscitation. Our hypothesis that the debriefing group would obtain better scores than the control group in the different evaluations is validated for knowledge, nontechnical skills, and self-efficacy.

The major interest of debriefing after a simulation session has already been extensively demonstrated. The review by Cheng et al [15], including 108 studies comparing debriefing and no debriefing, found positive effects of debriefing on all knowledge and skills outcomes. A major part of the theoretical benefit of screen-based simulation for training comes from debriefing, which contributes to meaningful connections between the learning experience and the real world [21]. However, our study was the first to compare computer debriefing with no debriefing and to analyze their impact on knowledge, technical skills, and nontechnical skills after a screen-based simulation.

Concerning knowledge evaluation, participants of the control group had better baseline knowledge of neonatal resuscitation than the debriefing group. Our results showed an improvement in the debriefing group’s score from the baseline level. The differences between the groups disappeared at the end of sessions, reflecting a positive effect of debriefing.

Usually, personalized debriefing after screen-based simulation addresses only technical skills: data collected from the simulation are returned as automated feedback at the end of the scenario [21]. We found no change in technical skills in our study. However, the increasing difficulty of the scenarios was designed to minimize the repetition effect on performance, as repeating the same scenario increases participants’ skills more than varying the scenarios [35]. This could mask the effect of the debriefing itself, since the required technical skills evolved with the scenarios.

In this study, we added a self-debriefing of nontechnical skills after the screen-based simulation of a neonatal resuscitation. In a review on screen-based simulation for medical education and surgical skills training, Graafland et al [36] highlighted the interest of a screen-based simulation to train nontechnical skills. Furthermore, in a review of debriefing techniques after nontechnical skills simulation training, performance seemed to improve equally with various methods of debriefing: skilled facilitator, novice instructor using a script, and self-led debrief or multimedia debriefing [37]. Our results confirm the possibility and benefit of a self-debriefing of nontechnical skills following a screen-based simulation to improve learning.

The second major finding of this study is the effect of computer debriefing on retention 2 months after the initial training. The debriefing group showed a better self-efficacy assessment than the control group, and its ANTS performance remained higher than that of the control group. The role of debriefing in retention of learning has already been underlined in some studies [25,38]. Few studies have assessed the retention of learning after screen-based simulation training; their results were rather positive when retention was evaluated up to 1 month after simulation [38] but less favorable than traditional learning methods when evaluated after 6 months [39]. Our positive results are encouraging and emphasize the role of debriefing in retention of learning, even though further studies are needed to confirm a longer-term effect.

Limitations of the Study

First, this study compared the effect of a computer debriefing with the effect of the absence of debriefing. Our objective was to stay in a virtual environment without the need for an instructor. As debriefing is a major component of simulation training, participants from the control group had access to the complete debriefing at the end of session 2. Therefore, this study only assessed the efficacy of a computer debriefing but not the superiority over other debriefing methods.

Second, the timing of the debriefing was not standardized or assessed. Participants from the debriefing group had an unlimited amount of time to consult the debriefing, which was not the case for the control group. Perhaps a free period should also have been offered to the control group to allow for a spontaneous reflective process.

Third, the nontechnical skills assessment was performed with the ANTS scoring tool, which was not originally developed and validated for the studied population. The lack of published data on the use of the ANTS scores for midwives is a limitation.

Conclusion

Computer debriefing seems to improve nontechnical skills and self-efficacy when compared to the absence of debriefing during a screen-based simulation. It also supports the progression of learners’ knowledge. This study confirms the benefit of debriefing, including computer debriefing, in screen-based simulation.

Acknowledgments

The authors would like to thank L’école de Sages-Femmes de Baudelocque, Université de Paris, and especially Ms Christele Vérot and all the students who participated in the study.

Authors' Contributions

DM participated in study design, student recruitment, simulation sessions, a posteriori analysis of the simulation sessions, data analysis and interpretation, drafting and revising the manuscript, and approved the final version. JB participated in study design, student recruitment, simulation sessions, a posteriori analysis of the simulation sessions, data analysis and interpretation, drafting and revising the manuscript, and approved the final version. JT participated in study design, a posteriori analysis of the simulation sessions, data interpretation, drafting and revising the manuscript, and approved the final version. MAP participated in a posteriori analysis of the simulation sessions and approved the final version. PC participated in data collection analysis and interpretation, drafting and revising the manuscript, and approved the final version. AT participated in data collection analysis and interpretation, drafting and revising the manuscript, and approved the final version.

Conflicts of Interest

None declared.

  1. Ryan C, Clark L, Malone A, Ahmed S. The effect of a structured neonatal resuscitation program on delivery room practices. Neonatal Netw 1999;18:25 [FREE Full text] [CrossRef] [Medline]
  2. Curran V, Aziz K, O'Young S, Bessell C. Evaluation of the effect of a computerized training simulator (ANAKIN) on the retention of neonatal resuscitation skills. Teach Learn Med 2004;16:157 [FREE Full text] [CrossRef] [Medline]
  3. Pearlman S, Zern S, Blackson T, Ciarlo J, Mackley A, Locke R. Use of neonatal simulation models to assess competency in bag-mask ventilation. J Perinatol 2016;36:242 [FREE Full text] [CrossRef] [Medline]
  4. Thomas E, Taggart B, Crandell S, Lasky R, Williams A, Love L. Teaching teamwork during the Neonatal Resuscitation Program: a randomized trial. J Perinatol 2007;27:a [FREE Full text] [CrossRef] [Medline]
  5. Thomas E, Williams A, Reichman E, Lasky R, Crandell S, Taggart W. Team training in the neonatal resuscitation program for interns: teamwork and quality of resuscitations. Pediatrics 2010;125:539 [FREE Full text] [CrossRef] [Medline]
  6. Patel A, Khatib M, Kurhe K, Bhargava S, Bang A. Impact of neonatal resuscitation trainings on neonatal and perinatal mortality: a systematic review and meta-analysis. BMJ Paediatr Open 2017;1:a [FREE Full text] [CrossRef] [Medline]
  7. Malmström B, Nohlert E, Ewald U, Widarsson M. Simulation-based team training improved the self-assessed ability of physicians, nurses and midwives to perform neonatal resuscitation. Acta Paediatr 1992:1273 [FREE Full text] [CrossRef] [Medline]
  8. Lee M, Brown L, Bender J, Machan J, Overly F. A medical simulation-based educational intervention for emergency medicine residents in neonatal resuscitation. Acad Emerg Med 2012 May;19(5):577-585 [FREE Full text] [CrossRef] [Medline]
  9. Mileder L, Urlesberger B, Szyld E, Roehr C, Schmölzer G. Simulation-based neonatal and infant resuscitation teaching: a systematic review of randomized controlled trials. Klin Padiatr 2014;226:a [FREE Full text] [CrossRef] [Medline]
  10. Haerling K. Cost-Utility Analysis of Virtual and Mannequin-Based Simulation. Simul Healthc 2018;13:33 [FREE Full text] [CrossRef] [Medline]
  11. Gentry S, Gauthier A, L'Estrade EB, Wortley D, Lilienthal A, Tudor CL. Serious Gaming and Gamification Education in Health Professions: Systematic Review. J Med Internet Res 2019;21:e12994 [FREE Full text] [CrossRef] [Medline]
  12. Barré J, Michelet D, Job A, Truchot J, Cabon P, Delgoulet C. Does Repeated Exposure to Critical Situations in a Screen-Based Simulation Improve the Self-Assessment of Non-Technical Skills in Postpartum Hemorrhage Management? Simul Gaming 2019:1046878119827324 [FREE Full text] [CrossRef]
  13. Ma M, Jain L, Anderson P. Virtual, Augmented Reality and Serious Games for Healthcare 1. Berlin Heidelberg: Springer-Verlag; 2014.
  14. Sawyer T, Umoren R, Gray M. Neonatal resuscitation: advances in training and practice. Adv Med Educ Pract 2017;8:a [FREE Full text] [CrossRef] [Medline]
  15. Cheng A, Eppich W, Grant V, Sherbino J, Zendejas B, Cook D. Debriefing for technology-enhanced simulation: a systematic review and meta-analysis. Med Educ 2014;48:a [FREE Full text] [CrossRef] [Medline]
  16. Sawyer T, Eppich W, Brett-Fleegler M, Grant V, Cheng A. More Than One Way to Debrief: A Critical Review of Healthcare Simulation Debriefing Methods. Simul Healthc 2016;11:A [FREE Full text] [CrossRef] [Medline]
  17. Levett-Jones T, Lapkin S. A systematic review of the effectiveness of simulation debriefing in health professional education. Nurse Educ Today 2014;34:e58-e63 [FREE Full text] [CrossRef] [Medline]
  18. Kolbe M, Grande B, Spahn D. Briefing and debriefing during simulation-based training and beyond: Content, structure, attitude and setting. Best Pract Res Clin Anaesthesiol 2015;29:87 [FREE Full text] [CrossRef] [Medline]
  19. Chronister C, Brown D. Comparison of Simulation Debriefing Methods. Clin Simul Nurs 2012;8:e281 [FREE Full text] [CrossRef]
  20. Boet S, Bould M, Bruppacher H, Desjardins F, Chandra D, Naik V. Looking in the mirror: self-debriefing versus instructor debriefing for simulated crises. Crit Care Med 2011;39:1377 [FREE Full text] [CrossRef] [Medline]
  21. Crookall D. Serious Games, Debriefing, and Simulation/Gaming as a Discipline. Simulation & Gaming 2011 Jan 06;41(6):898-920. [CrossRef]
  22. Pasquier P, Mérat S, Malgras B, Petit L, Queran X, Bay C, et al. A Serious Game for Massive Training and Assessment of French Soldiers Involved in Forward Combat Casualty Care (3D-SC1): Development and Deployment. JMIR Serious Games 2016 May 18;4(1):e5 [FREE Full text] [CrossRef] [Medline]
  23. Roy K, Miller M, Schmidt K, Sagy M. Pediatric residents experience a significant decline in their response capabilities to simulated life-threatening events as their training frequency in cardiopulmonary resuscitation decreases. Pediatr Crit Care Med 2011;12:e141-e144 [FREE Full text] [CrossRef] [Medline]
  24. Kaczorowski J, Levitt C, Hammond M, Outerbridge E, Grad R, Rothman A. Retention of neonatal resuscitation skills and knowledge: a randomized controlled trial. Fam Med 1998;30:11. [Medline]
  25. Kudreviciene A, Nadisauskiene R, Tameliene R, Tamelis A, Nedzelskiene I, Dobozinskas P. Initial neonatal resuscitation: skill retention after the implementation of the novel 24/7 HybridLab® learning system. J Matern-Fetal Neonatal Med 2019;32:1230 [FREE Full text] [CrossRef] [Medline]
  26. Au K, Lam D, Garg N, Chau A, Dzwonek A, Walker B, et al. Improving skills retention after advanced structured resuscitation training: A systematic review of randomized controlled trials. Resuscitation 2019 May;138:284-296. [CrossRef] [Medline]
  27. Anderson R, Sebaldt A, Lin Y, Cheng A. Optimal training frequency for acquisition and retention of high-quality CPR skills: A randomized trial. Resuscitation 2019 Feb;135:153-161. [CrossRef] [Medline]
  28. Thomas EJ. Translating teamwork behaviours from aviation to healthcare: development of behavioural markers for neonatal resuscitation. Quality and Safety in Health Care 2004 Oct 01;13(suppl_1):i57-i64 [FREE Full text] [CrossRef] [Medline]
  29. Bélondrade P, Lefort H, Bertho K, Perrochon J, Jost D, Tourtier J-P. Anaesth Crit Care Pain Med 2016:17 [FREE Full text] [CrossRef] [Medline]
  30. van DHP, van TL, van DHM, van DLJ. Assessment of neonatal resuscitation skills: a reliable and valid scoring system. Resuscitation 2006;71:a [FREE Full text] [CrossRef] [Medline]
  31. Fletcher G, Flin R, McGeorge P, Glavin R, Maran N, Patey R. Anaesthetists' Non-Technical Skills (ANTS): evaluation of a behavioural marker system. Br J Anaesth 2003;90:8. [CrossRef] [Medline]
  32. Hagemann V, Herbstreit F, Kehren C, Chittamadathil J, Wolfertz S, Dirkmann D. Does teaching non-technical skills to medical students improve those skills and simulated patient outcome? Int J Med Educ 2017;8:101 [FREE Full text] [CrossRef] [Medline]
  33. Cavicchiolo M, Cavallin F, Staffler A, Pizzol D, Matediana E, Wingi O. Decision Making and Situational Awareness in Neonatal Resuscitation in Low Resource Settings. Resuscitation 2018 [FREE Full text] [CrossRef] [Medline]
  34. Roh Y, Lee W, Chung H, Park Y. The effects of simulation-based resuscitation training on nurses' self-efficacy and satisfaction. Nurse Educ Today 2013;33:123 [FREE Full text] [CrossRef] [Medline]
  35. Tofil N, Peterson D, Wheeler J, Youngblood A, Zinkan J, Lara D. Repeated versus varied case selection in pediatric resident simulation. J Grad Med Educ 2014;6:275 [FREE Full text] [CrossRef] [Medline]
  36. Graafland M, Schraagen J, Schijven M. Systematic review of serious games for medical education and surgical skills training. Br J Surg 2012;99:1322 [FREE Full text] [CrossRef] [Medline]
  37. Garden AL, Le Fevre DM, Waddington HL, Weller JM. Debriefing after simulation-based non-technical skill training in healthcare: a systematic review of effective practice. Anaesth Intensive Care 2015 May;43(3):300-308 [FREE Full text] [CrossRef] [Medline]
  38. Rondon S, Sassi F, Furquim DAC. Computer game-based and traditional learning method: a comparison regarding students' knowledge retention. BMC Med Educ 2013;13:a [FREE Full text] [CrossRef] [Medline]
  39. Kanthan R. The impact of specially designed digital games-based learning in undergraduate pathology and medical education. Arch Pathol Lab Med 2011:135 [FREE Full text] [CrossRef] [Medline]


ANTS: Anaesthetists’ Non-Technical Skills
CERAR: Comité Ethique sur la Recherche en Anesthésie Réanimation
ILCOR: International Liaison Committee on Resuscitation
NRP: Neonatal Resuscitation Program
NRPE: Neonatal Resuscitation Performance Evaluation


Edited by G Eysenbach; submitted 09.03.20; peer-reviewed by N Chaniaud, S Jung; comments to author 29.03.20; revised version received 21.04.20; accepted 14.05.20; published 11.08.20

Copyright

©Daphne Michelet, Jessy Barre, Jennifer Truchot, Marie-Aude Piot, Philippe Cabon, Antoine Tesniere. Originally published in JMIR Serious Games (http://games.jmir.org), 11.08.2020.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Serious Games, is properly cited. The complete bibliographic information, a link to the original publication on http://games.jmir.org, as well as this copyright and license information must be included.