Published in Vol 14 (2026)

This is a member publication of Bibsam Consortium

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/79163, first published .
Game Elements in Military Trauma Care Education: Systematic Review

Review

1Department of Learning, Informatics, Management and Ethics (LIME), Karolinska Institutet, Stockholm, Sweden

2Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Kraków, Poland

3Royal Centre for Defence Medicine, Birmingham, United Kingdom

4Institute of Naval Medicine, Alverstoke, Gosport, United Kingdom

5Department of Research, Education and Development and Innovation, Södersjukhuset, Stockholm, Sweden

6Faculty of Health and Social Sciences, Department of Health and Functioning, Western Norway University of Applied Sciences, Bergen, Norway

Corresponding Author:

Natalia Stathakarou, PhD

Department of Learning, Informatics, Management and Ethics (LIME)

Karolinska Institutet

Tomtebodavägen 18A

Stockholm, 171 77

Sweden

Phone: 46 707799671

Email: natalia.stathakarou@ki.se


Background: Game elements may inform the design of both simulations and games. However, evidence on how individual game elements inform the design of military trauma training simulations and their educational purpose remains limited.

Objective: This systematic review aimed to examine which game elements are used in the design of educational simulations for military trauma management, how they are implemented, for what purpose, and what outcomes are reported related to the game elements.

Methods: This is a systematic review conducted in accordance with PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. We included qualitative, quantitative, mixed methods, and design studies describing simulation-based training for military trauma management that incorporated game elements. Studies focusing solely on assessment, noninteractive interventions, or psychological trauma were excluded. Searches were conducted in Medline (Ovid), PubMed, IEEE Xplore, ERIC, Web of Science, ACM Digital Library, and CINAHL from inception to October 14, 2025, identifying 2487 records. Screening and data extraction were performed independently by 2 reviewers. Methodological quality was assessed using the Medical Education Research Study Quality Instrument (MERSQI) and the Côté and Turgeon grid. Results were synthesized using qualitative thematic synthesis.

Results: Forty-two studies published between 1986 and 2025 were included. Most studies were conducted in the United States and included a wide range of simulation modalities and learner populations. Sixteen game elements were identified, with narrative, sensation, imposed choice, time pressure, and scoring being most prevalent. The thematic synthesis identified multiple categories describing how these game elements were implemented. Justifications for the use of game elements were rarely provided; when present, they were primarily linked to realism, emotional engagement, adaptive learning, and feedback. Elements such as badges and competition were seldom used. No study explicitly linked individual game elements to specific educational outcomes. This review is constrained by heterogeneity across studies, an imperfect fit of quality appraisal tools for some study types, and the possibility of missed studies due to search vocabulary limitations.

Conclusions: This systematic review is innovative in providing the first comprehensive synthesis of how game elements are used in military trauma simulations. Unlike previous reviews, it explicitly focuses on the pedagogical purposes of these elements. It offers an overview of the prevalence of game elements in military trauma care education and synthesizes the pedagogical rationales for their use. The lack of studies explicitly linking individual game elements to learning outcomes highlights the need for more intentional research and transparent reporting. Future studies should treat gamification as a set of targeted design choices rather than as a single overarching strategy, and explore how its motivational dimensions can be effectively leveraged in military trauma training.

International Registered Report Identifier (IRRID): RR2-10.2196/45969

JMIR Serious Games 2026;14:e79163

doi:10.2196/79163

Keywords



Gamification is the use of game elements in an endeavor to nudge participants to perform certain actions by adopting a playful attitude [1], and it is a promising approach in health care education [2]. For example, learning activities might be designed to involve solving problems under time pressure, competing or collaborating with others, and earning points or badges [3]. Gamification has been linked to effects on motivation, behavior, engagement, and learning [4]. In health care education, gamification and games have been shown to be at least as effective as other educational approaches, and in many studies, more effective for improving knowledge, skills, and satisfaction [2]. Gamification has the potential to improve learning outcomes, especially when it uses game elements that improve learning behaviors and attitudes towards learning [5]. Recent research indicates that gamification offers diverse and flexible strategies for enhancing clinical reasoning education across health care disciplines and settings [6]. In disaster education, gamification may enhance learners’ retention of knowledge, ability to cooperate, sense of presence, perceived realism, disaster awareness, decision-making ability, practical skills, and coping capacity [7].

Although gamification is known for its engaging and motivational benefits, its pedagogical value remains controversial [1,5]. This controversy arises because different gamification strategies may use various combinations of game elements, leading to diverse outcomes [8]. Interestingly, much of the literature treats gamification as a single, monolithic approach rather than acknowledging it as a collection of game elements that vary considerably in purpose and in the learning experiences they offer.

Gamification and simulation are closely related concepts, but are often also contrasted; based on the spectrum of Qin et al [9], Ricciardi and De Paolis [10] conceptualize simulations and games as 2 extremes on a spectrum. At one end of this spectrum lie classical simulators, which are often designed for skills training and prioritize realism by replicating the real world. At the other end are games developed for fun and entertainment, often situated in entirely fictional or imaginary contexts. Between these extremes lie serious games and simulation games. Serious games are developed with nonentertainment purposes in mind, combining a high degree of realism with the entertainment elements of traditional games to facilitate skills development. In contrast, simulation games often blend imaginative or fictitious environments with simulation-based mechanics, offering engagement and the potential to support learning. Examples of serious games for military trauma training include the French Military Health Service’s 3D-SC1 game to train for and assess forward combat casualty care [11,12] and the US Army’s tactical combat casualty care simulation training program, TC3Sim [13,14].

Simulations may allow learners to experience complex situations and act as they would in a real environment. They may take several forms and provide learners with feedback. High-fidelity medical simulations are educationally effective, and simulation-based education complements medical education in patient care settings [15]. Simulation environments range from field exercises to virtual patients and virtual reality and can include a variety of different modalities in place of a human casualty, such as a manikin, a simulated patient, or a cadaver.

In military trauma care, different simulation technologies are used to train a range of technical and nontechnical competencies [16,17]. A recent scoping review [18] examined the use of simulations in military medicine and found that most studies focused on physical simulation modalities, such as manikins and task trainers, whereas only a limited number used augmented or virtual reality interventions. Simulation-based training enables the replication of austere and high-stress environments, providing a safe context for learners to practice trauma management, make decisions under pressure, and learn from errors without compromising patient safety [19]. Kubala and Warnick [20] found that knowing exactly what to expect in combat reduces fear and stress.

Military trauma care is characterized by austere environments, tactical demands, and limited medical resources that fundamentally distinguish it from civilian practice. In deployed settings, medical personnel often operate with minimal equipment and may provide care alone or under hostile conditions far from hospital facilities [21,22]. These circumstances require rapid decision-making, prioritization, and coordination under pressure, frequently with incomplete information, which increases the risk of preventable harm [23]. Military trauma also differs from civilian trauma in its organizational structures, triage systems, and treatment approaches, as well as in the nature of injuries: while civilian trauma commonly involves blunt injuries or low-velocity gunshot wounds, military injuries are often caused by blasts and high-velocity weapons [22-27]. Consequently, treatment protocols developed for civilian settings do not always translate effectively to the battlefield. Recent studies [28] emphasize the often-overlooked conditions of truly austere environments in research and education, indicating a need for trauma simulations to integrate realism and austerity. Yet, most military medical personnel receive training in civilian settings, which may not fully prepare them for managing trauma in hostile settings.

Game elements can be used in both simulations and games, serving different purposes [29]. For instance, a recent study [30] integrated game elements such as varying difficulty levels and scoring within the design of virtual patients to make them both more realistic and engaging as learning activities. Previous work analyzing game elements in education has led to frameworks and taxonomies supporting the design and evaluation of gamification in learning environments [3,31]. Extending this work, a list of game elements with the potential to support education and training in military trauma care was synthesized [30].

Incorporating game elements into simulation-based education in the field of military trauma training has the potential to increase motivation, enhance learners’ confidence, and support personalized learning [30]. Yet, literature about how individual game elements can inform the design of military trauma training simulations and influence learning outcomes is underexplored.

This systematic review aims to systematically examine the use of game elements in the design of military trauma training simulations in order to retrieve and synthesize international evidence on their design, educational purpose, and reported outcomes in trauma management training. By doing so, the systematic review seeks to inform simulation design practices and identify directions for future research. The research questions guiding this systematic review are:

  1. What game elements are used in the design of educational simulations in the context of military trauma management?
  2. How are the identified game elements used?
  3. What is the purpose of using game elements in the design of educational simulations in military trauma management?
  4. What outcomes are reported related to the game elements?

Protocol and Registration

A systematic review protocol for this study has been published in JMIR Research Protocols (PMID: 37682596) [32]. No changes were made to the planned synthesis methods after protocol publication. The results are reported in accordance with the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines [33]. The PRISMA checklist, the PRISMA-S checklist, the PRISMA for Abstracts checklist, and SWiM (Synthesis Without Meta-analysis) reporting items are provided in Multimedia Appendices 1-4, respectively.

Inclusion and Exclusion Criteria

We included both qualitative and quantitative empirical and design studies that addressed different types of simulation interventions, which incorporated game elements. Game elements were identified using published gamification frameworks in a deductive-inductive manner [3,30,31]. We included studies that incorporated game elements designed to enhance the education and training of military trauma management. The included simulation interventions had a clear educational purpose within this context. In this systematic review, the included gamified simulations had some degree of interactivity, allowing the scenarios to unfold in response to the learners’ actions. Studies in which participants passively observed a scenario (eg, a video clip) were excluded. Studies using interventions primarily for patient education, rehabilitation, teleconference, treatment, and decision-support were considered outside the scope of this review. We excluded studies where the training only focused on individual body parts. We also excluded studies in which gamification was used solely for assessment, unless the assessment served an educational purpose. We included studies focusing on military trauma management and excluded those addressing psychological trauma, because it belongs to a different educational context and learner group. We included studies that had a link to military medicine, even if the learner population that received the simulation was linked to a civilian context. Game elements only taking place outside the simulation, such as scoring conducted by instructors who observed or assessed participants after a simulation session, were also excluded. Several simulations incorporated teams to develop team-relevant competencies, and a “team” is recognized as a game element in the Maheu-Cadotte framework [3]. 
We only included studies that introduced teamwork with a gamification purpose and excluded studies where teamwork was present merely because it was inherently a part of the simulation, such as when practicing communication skills.

Search Methods for Identification of Studies

We searched the following databases and search engines: Medline (Ovid), PubMed, IEEE Xplore, ERIC, Web of Science, ACM Digital Library, and CINAHL, with the help of librarians at the University Library of Karolinska Institutet. We included all articles regardless of publication language. All articles published up to October 14, 2025, were retrieved. With the help of the librarians, we conducted a search of citations and references in CitationChaser. Each database was searched independently, and no multi-database search function was used. No limits or filters were applied. The search strategies did not undergo a formal peer review, although they were developed by 2 experienced librarians and iteratively tested across several rounds. No study registries were searched. No additional online or print sources were purposefully browsed (eg, tables of contents, conference proceedings, or websites) beyond the citation and reference searching described above. Search strategies were developed specifically for this review in collaboration with librarians and were not adapted or reused from previous literature reviews. The search strategy is included in Multimedia Appendix 5 [34].

Selection of Studies

We imported all identified references to the Rayyan open-source web system [35]. Full-text versions of the studies whose abstracts met the inclusion criteria were retrieved via the university library. We did not contact authors or other stakeholders to obtain additional studies or data. Two researchers independently assessed the identified studies based on the inclusion and exclusion criteria. Any disagreements were resolved through discussion between the 2 reviewers. If no agreement could be reached, a third researcher was consulted. The selection process was represented using a PRISMA flow diagram [33], including the actual number of studies included and excluded at each stage of screening.

Data Extraction and Management

The data extraction sheet and initial coding frame for identifying game elements and outcomes were predefined and published in the registered review protocol [32]. To ensure consistency and a shared understanding of the coding approach, all authors jointly piloted the extraction process on 3 studies during the protocol development stage. This pilot served as a coder calibration exercise to refine code definitions and decision rules.

Throughout the review, the first author was paired with each coauthor during data extraction and coding to maintain alignment and consistency in interpretation. Two researchers independently extracted and managed the data for each included study using the finalized structured form. Any discrepancies were discussed until consensus was reached, and a third author was consulted when necessary.

Data Analysis, Synthesis, and Reporting

A qualitative data analysis combined with thematic synthesis [36] was conducted to answer the research questions. Data synthesis was performed using structured Excel spreadsheets; no qualitative analysis software was used. The simulations described in the included studies were classified according to categories proposed in earlier literature [16,17]. Virtual patients were categorized following the classification framework introduced by Kononowicz et al [37]. To answer the first research question, the reviewers compared extracted data to ensure consistency in coding and interpretation of game elements across studies. The data extraction process was guided by a predefined list of game elements based on existing frameworks [3,30,31].

To address the second research question, we synthesized the data further by grouping realizations of each game element. This involved reviewing how each element was operationalized across studies and clustering similar patterns into subcategories. This synthesis allowed us to develop categories that describe the implementation of each game element. The process was iterative and collaborative, followed by frequent discussion between all authors to refine the categories, ensure consistency, resolve discrepancies, and reach a consensus.

To answer the third research question, a thematic synthesis approach was applied. Paired researchers worked independently to review each study and extract passages of text that provided a rationale, justification, or implied educational purpose for the use of specific game elements. These included both explicit explanations and implicit meanings inferred from the study context. Subsequently, we clustered similar meanings into broader themes that captured the underlying educational or experiential intentions behind the use of the identified game elements. This involved a process of thematic grouping that connected the extracted rationales with the corresponding game elements identified earlier. The mapping was conducted by 2 researchers and discussed by all authors. Themes were constructed inductively from the data. Each theme was then linked to one or more game elements.

To address the fourth research question, we collected information on the reported educational or performance outcomes of the simulations attributed to individual game elements as described in each study. Data extraction focused on outcomes related to knowledge, skills, and attitudes, specifically engagement and confidence, as well as secondary outcomes such as accessibility and cost-effectiveness. Although outcomes were recorded alongside the included studies, no formal synthesis systematically linking specific game elements to particular outcomes was possible. This was because the included studies reported outcomes as a combined effect of all game elements rather than separately for each element, leaving no explicit connections between individual game elements and measured results.

No standardized outcome metrics or effect-size transformations were applied. Given heterogeneous outcome reporting and the absence of comparable effect estimates, findings were synthesized using qualitative thematic synthesis, focusing on the implementation and pedagogical purpose of game elements. All included studies contributed to the synthesis; no studies were selected or excluded from the synthesis based on study design or quality assessment, which was used descriptively to support interpretation rather than to weight or filter findings.

Findings were presented using structured tables and narrative summaries. Tables reported key study characteristics (eg, study design, simulation type, learner population) and the presence and implementation of individual game elements; studies were organized descriptively rather than ordered by effect size or risk of bias, as no comparable effect estimates were available.

Quality Appraisal

The quality of the included studies was assessed using 2 established instruments. For quantitative and mixed methods studies, we applied the Medical Education Research Study Quality Instrument (MERSQI) [38]. For research that reported results obtained exclusively through qualitative methods, such as interviews or focus groups, we used the quality appraisal grid developed by Côté and Turgeon [39]. Studies focusing solely on the design and development of simulations were not included in these quality assessments as they fall outside the intended scope of both MERSQI and the Côté and Turgeon grid [39]. Methodological quality was appraised descriptively to support interpretation of the evidence base. Certainty of the synthesis findings was not formally assessed using a grading framework.

Ethical Considerations

This systematic review does not involve processing of sensitive personal data and therefore ethical approval is not required according to the Swedish Ethical Review Act.


Included Studies

In total, 42 studies were included. Figure 1 presents the PRISMA flowchart showcasing the search and inclusion process. If an article was excluded for multiple reasons, only the first applicable exclusion criterion, as defined in the published review protocol [32], was recorded.

Figure 1. PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) diagram.

Article and Study Characteristics

The studies covered a variety of simulation types used for training purposes, including nondigital games, virtual patients, virtual patient games, live simulations, standardized patients, manikins, and virtual and mixed realities. Most of the studies were conducted in the United States, followed by studies originating from Europe. The learner populations involved were diverse, ranging from hospital decision-makers, surgeons, soldiers, and combat medics to medical students and emergency response teams. Table 1 provides information about the general characteristics of the studies included, specifically the type of simulation, the country in which the study was conducted, and the learner population.

Table 1. Included studies, type of simulation, and learner population.
Lead author and year | Simulation type | Learner population | Country
Achatz et al (2020) [40] | Nondigital game/board game | Hospital decision makers | Germany
Arora et al (2014) [41] | Human standardized patient | Surgeons, anesthesiologists, physicians, and nurses | United Kingdom
Badler et al (1996) [42] | Virtual patient game | Combat medics | United States
Baird et al (2020) [43] | Virtual patient - high fidelity software simulation | Thermal injury treatment providers | United States
Beaven et al (2021) [44] | Cadaver and high fidelity manikins | National Guard members, doctors, nurses, physicians, operating department practitioners, medical students | United Kingdom
Brown et al (2016) [45] | Virtual and mixed realities | Combat medics | United States
Chi et al (1996) [46] | Virtual patient game | Combat medics | United States
Chi et al (1997) [47] | Virtual patient game | Combat medics | United States
Cohen et al (2013) [48] | Virtual patient game | Clinicians practicing trauma leadership; clinical major incident coordinator/silver commander | United Kingdom
Couperus et al (2019) [49] | Virtual and mixed realities | Emergency military physicians | United States
Couperus et al (2020) [50] | Virtual and mixed realities | Emergency military physicians | United States
DeFalco et al (2017) [13] | Virtual patient game | Combat medics | United States
de Lesquen et al (2022) [51] | Virtual patient game | Emergency doctors | France
de Lesquen et al (2023) [52] | Virtual patient game | Prehospital physicians | France
Du et al (2022) [53] | Virtual and mixed realities | Military medical students | China
Freeman et al (2001) [54] | Virtual and mixed realities | Navy medical providers (paramedics) | United States
Goolsby et al (2014) [55] | Immersive virtual environment | Military medical students | United States
Hemman (2005) [56] | Virtual patient - high fidelity software simulation | Combat medics | United States
Henderson et al (1986) [57] | Virtual patient - high fidelity software simulation | Medical students | United States
Henderson et al (2020) [58] | Virtual patient game | Combat medics | United States
Kyle et al (2004) [59] | High fidelity manikins and human standardized patient | Physicians, nurses, paramedics, professional scientists, military officers, lawyers, career politicians, consultants from nongovernmental organizations, administrators, intelligence officers, and logistic personnel | United States
Lombardo et al (2022) [60] | Virtual and mixed realities | Emergency medicine residents, attendings, medical students, physician assistants, army medics, and nurses | United States
Lu et al (2023) [61] | Virtual and mixed realities | Combat medics and military surgeons | China
Lennquist Montán et al (2014) [62] | Card game in a live exercise | Physicians, nurses, ambulance/paramedics, military staff, administrators, and collaborating agencies (rescue services, the police) | Sweden
Netzer et al (2015) [63] | High fidelity manikins | Navy emergency medical teams, military physicians | Israel
Pasquier et al (2016) [11] | Virtual patient game | Soldiers, combat medics | France
Planchon et al (2018) [12] | Virtual patient game | Soldiers, combat medics | France
Qin et al (2024) [64] | Virtual and mixed realities | Combat medics, nurses | Israel
Rabotin et al (2023) [65] | Virtual and mixed realities | Paramedics, physicians | Israel
Satava and Jones (1996) [66] | Virtual and mixed realities and wearables | Combat medics | United States
Sonesson et al (2023) [67] | Interactive patient scenarios | Military trauma teams | Sweden
Sotomayor (2008) [68] | Virtual patient game | Combat medics | United States
Sotomayor (2010) [14] | Virtual patient game | Combat medics | United States
Stansfield et al (1998) [69] | Virtual and mixed realities | Combat medics | United States
Stathakarou et al (2024) [30] | Interactive patient scenarios | Combat medics | Sweden
Stone (2005) [70] | Virtual patient game | Military trauma surgeons | United Kingdom
Stone (2011) [71] | Virtual patient game | Military trauma surgeons | United Kingdom
Stone et al (2017) [72] | Virtual and mixed realities | Medical Emergency Response Teams (MERTs) | United Kingdom
Tretyak et al (2025) [73] | Virtual and mixed realities | Medical personnel and trainees involved in tactical emergency or combat casualty care | Austria
Wier et al (2017) [74] | Immersive virtual environment | Medical Emergency Response Teams (MERTs: physicians, nurses, medics) | United States
Willy et al (1998) [75] | Virtual patient - high fidelity software simulation | Military physicians | Germany
Zhu et al (2024) [76] | Virtual patient game | Mobile medical logistics teams: background in medicine, nursing, logistics | China

Game Elements Identified in the Design of Military Trauma Simulations

Figure 2 presents a synthesis of the 16 game elements identified in the design of military trauma simulations across the included studies. Multimedia Appendix 6 [11-14,30,40-76] provides a detailed categorization of the game elements identified in each of the 42 studies. The definitions of the game elements were derived from previous literature [3,30,31]; narrative refers to the structured sequence of events and decisions shaping the learner’s experience, while sensation captures the use of visual or auditory stimuli to enhance immersion. Imposed choice describes situations in which learners must select one option to progress, whereas time pressure requires actions or decisions under urgency or within a limited timeframe. Scoring provides quantitative feedback with points, and hints refer to clues that support learners without revealing the correct answer outright. Challenge encompasses mechanisms designed to test abilities, and difficulty adaptation refers to dynamically adjusting the complexity of tasks. Avatars function as digital representations of learners or patients, and randomness introduces unpredictable aspects in the scenario. Performance tables present detailed summaries of accomplishments across tasks, and collaboration encourages learners to work together toward shared objectives. Content unlocking restricts access to new material until specific criteria are met, progression visualizes the learner’s development over time, competition supports comparisons with mechanisms such as leaderboards, and badges serve as symbolic markers of achievement for completing tasks. The definitions of the game elements are summarized in Multimedia Appendix 7.

Figure 2. Synthesis of game elements identified in the design of military trauma simulations.

Application of Game Elements in Military Trauma Simulations

Table 2 provides a thematic synthesis of how game elements were used in the included studies, outlining different categories of how game elements can be used to inform educational simulations in trauma care.

Table 2. Thematic synthesis of how the game elements were used in the included studies.
Game elementApplication of game elements in military trauma simulations
Narrative
  • An interactive scenario that unfolds on the basis of the learners’ decisions [11-14,30,40-76]
Sensation
  • High environmental fidelity by replicating battlefield conditions [41,44,59]
  • Virtual fidelity in the digital environment to depict austerity, such as realistic portrayals of injury or digitally recreated battlefield environments [11-14,30,42,43,45-54,57,60-62,64,66,68-71,73,75,76]
  • Mixed reality and interaction with objects in both physical and virtual environments [55,65,72,74]
Imposed choice
Time pressure
  • Scenarios incorporating explicit time constraints [11,40,70-72,74]
  • Tasks requiring immediate action to simulate urgency [30,48,50-52,55,59,73]
  • Display of visible timers or auditory cues such as ticking clocks to reinforce urgency [45,51,54,57,64,75,76]
  • Delayed decisions negatively affecting patient outcomes, potentially leading to patient death [13,48,50,62,69-71]
  • Timely decision-making contributing positively to performance metrics [42,65]
Scoring
  • Quantifying learner performance based on patient health outcomes, such as trauma scores or health points [40,42,46,47,62,69]
  • Final scoring mechanisms summarizing overall performance or decision quality at the end of the scenario [11-13,30,41,45,51,52,60,61,63,65,70,76]
  • Real-time scoring: continuous assessment during gameplay [30,40,45,49,50,64]
  • Penalty-based scoring: deducting points for incorrect decisions [65]
  • Competitive scoring formats allowing ranking or comparison between participants [57]
  • Scoring linked to resource management, rewarding efficient allocation [57]
Hints
  • Human instructor–guided feedback during simulation sessions [42,44,55,59]
  • Computerized real-time audio feedback [45]
  • Simulated patient cues through physiological responses, appearance, or facial expressions [13,42,48-50,64,66,69,70,75]
  • Computerized feedback via virtual patient interactions or dialogue boxes [30,56,57,68,76]
  • Virtual instructor or colleague intervening when poor decisions occur [11,57]
  • Feedback on inappropriate or excessive resource use [57]
  • Contextual hints through simulated live TV news updates in real time [59]
Challenge
  • Managing complications and unforeseen events [40,43,48,63]
  • Exposure to distracting sounds or visual elements [44,45,72,73]
  • Performing difficult triage decisions, prioritizing treatable patients over those unlikely to survive [61]
  • Decision-making under conflicting or incomplete information [40,59]
  • Identifying concealed or initially nonobvious injuries [30,66]
  • Providing care in austere or unfamiliar environments [30,41,42,44,63,66,69,72,73]
  • Encountering unwinnable cases with inevitable failure outcomes [13,30,58]
  • Experiencing learner death when safety precautions are neglected [30,68]
Difficulty adaptation
  • The “game master” can adjust the scenario’s difficulty based on participant performance or situational needs [40,62,63]
  • Learners are able to select the initial difficulty level before the scenario [11,45]
  • Gradually increasing or varying levels of challenge [11,13,30,45,50,51,54,57,58,67,75]
  • Dynamic difficulty in the scenario based on participants’ performance [42,63,65]
Avatar
  • Patient cards representing virtual patients [40]
  • Learner avatars enabling player interaction through a virtual self or character [42,43,48,54]
  • Customizable learner avatars [45]
  • Avatars mirroring learners’ physical movements in real time [46,47,66,69]
  • Patient avatars representing casualties within the simulation [51-53,73]
Randomness
  • Unexpected resource problems: missing equipment, availability of staff, hospital resources [40,62]
  • Scenario variability in incident type, number of casualties, environmental austerity, weather conditions, and patient injuries or physiology [42,43,45,50-52,62,64]
  • Identical treatments do not always produce the same outcomes [57]
Performance tables
  • User performance summaries comparing actions against predefined “gold standard” treatment procedures [45]
  • Event timing and adherence to trauma resuscitation protocols [65]
  • Interactive display of casualty vital signs and key events with a complete performance log [45]
  • Performance grids integrating scoring and structured debriefing [11,51,52,64,70,76]
Collaboration
  • Multi-avatar collaboration between human players within the same simulation environment [42,45]
  • Collaborative tasks designed to enhance understanding of different professional roles [45,53]
  • Collaboration with virtual or artificial intelligence–driven team members and avatars [50,53,57,60,72]
Content unlocking
  • Completion of required treatment steps to progress within the scenario [45]
  • Selection of specific choices or actions to advance gameplay [30,40]
  • Demonstration of proficiency or task mastery to unlock subsequent levels [11]
Progress
  • Sequential management and triage flow of patients throughout the scenario [40]
  • Grid-based visualization of patient health and status [51]
  • Rescue progress bar displaying ongoing operations and cumulative training score in real time [76]
Competition
  • Scores used to compare and rank learners, enabling performance competition among peers [11,57]
Badges
  • Bronze, silver, and gold medal graduation according to the scoring system, integrating time and actions delivered [11]

Purposes of Using Game Elements in the Design of Simulation in the Included Studies

While most studies did not attribute the use of game elements to a specific educational purpose, several provided explicit justifications for specific elements. Table 3 summarizes the 9 identified themes explaining why specific game elements were integrated into the simulation design. Supporting data excerpts from the included studies are provided in Multimedia Appendix 8 [11,13,30,40,41,43,45-47,52-54,57,60,62,63,65,69,73-75].

Table 3. Justification themes of using game elements in the included studies.
Theme | Game element
Realism and emotional engagement | Narrative [30,41,43,45,51,57,63]; Sensation [11,30,41,43,45,46,51,57,63]; Time pressure [30,40,73]; Avatar [45,51]; Randomness [45,57]; Challenge [45,73]; Imposed choice [30,47,62,75]; Collaboration [53]
Adaptive learning and feedback | Difficulty adaptation [45,51,60,65]; Scoring [11,47,51,60]; Performance tables [11,45,51]
Affective learning | Challenge [12]
Learner agency | Imposed choice [12]
Challenge the learners | Difficulty adaptation [54]
Risk-free experiential learning | Narrative and Sensation [54]; Time pressure [69]
Motivation and engagement | Competition and Scoring [11]
Situational awareness | Challenge [69]
Emotional regulation and overcoming anxiety | Challenge and Sensation [74]

Reported Outcomes Associated With Game Elements

None of the included studies correlated individual game elements with the reported outcomes. Instead, the effects on knowledge, skills, and attitudes were attributed to the impact of the simulations as a whole.

Quality Appraisal of Studies

We evaluated 22 studies with MERSQI, with scores ranging from 5 to 15.5 and a median of 9.5 (Table 4). The most common methodological limitations included reliance on a single cohort without a comparison group, the use of nonvalidated evaluation instruments, and a focus on outcomes such as satisfaction or basic knowledge and skill acquisition. Because our inclusion criteria did not restrict studies to pre-post designs, not all MERSQI items were applicable, which may have contributed to lower scores in some studies. Two studies that used qualitative data collection methods were appraised with the Côté and Turgeon grid [39] (Table 5). The remaining 18 design and development studies [11,42,43,45-47,49-51,54,57,58,66,69-72,75] were excluded from formal quality appraisal because their descriptive nature falls outside the scope of both instruments. The evaluation score of each study is reported in Tables 4 and 5.

Table 4. Quality appraisal for studies using the Medical Education Research Study Quality Instrument.
Study | Study design (max=3) | Sampling (max=3) | Data type (max=3) | Validity (max=3) | Data analysis (max=3) | Outcomes (max=3) | Score (max=18)
Achatz et al (2020) [40] | 1 | 1.5 | 1 | 0 | 2 | 1 | 6.5
Arora et al (2014) [41] | 1 | 2 | 1 | 1 | 3 | 1.5 | 9.5
Beaven et al (2021) [44] | 1.5 | 2 | 3 | 1 | 3 | 1.5 | 12
Cohen et al (2013) [48] | 1 | 0.5 | 3 | 2 | 3 | 1.5 | 11
DeFalco et al (2017) [13] | 3 | 1 | 1 | 1 | 3 | 1.5 | 10.5
de Lesquen et al (2023) [52] | 2 | 2 | 3 | 1 | 3 | 1.5 | 12.5
Du et al (2022) [53] | 3 | 2 | 1 | 0 | 2 | 1.5 | 9.5
Goolsby et al (2014) [55] | 1.5 | 2 | 1 | 0 | 2 | 1 | 7.5
Hemman (2005) [56] | 1.5 | 2 | 1 | 0 | 3 | 1.5 | 8.5
Kyle et al (2004) [59] | 1 | 0 | 1 | 0 | 2 | 1 | 5
Lombardo et al (2022) [60] | 1 | 0.5 | 1 | 0 | 2 | 1 | 5.5
Lu et al (2023) [61] | 1 | 1.5 | 1 | 0 | 2 | 1 | 6.5
Lennquist Montán et al (2014) [62] | 1 | 3 | 1 | 0 | 2 | 1 | 8
Netzer et al (2015) [63] | 1.5 | 2 | 3 | 2 | 3 | 1.5 | 13
Planchon et al (2018) [12] | 3 | 2 | 3 | 2 | 3 | 1.5 | 14.5
Qin et al (2024) [64] | 1.5 | 0.5 | 3 | 0 | 3 | 1.5 | 9.5
Rabotin et al (2023) [65] | 1 | 0.5 | 1 | 0 | 2 | 1 | 5.5
Sonesson et al (2023) [67] | 2 | 0.5 | 1 | 0 | 2 | 1.5 | 7
Sotomayor (2008) [68] | 3 | 2 | 3 | 2 | 3 | 1.5 | 14.5
Sotomayor (2010) [14] | 3 | 0.5 | 3 | 1 | 3 | 1.5 | 12
Wier et al (2017) [74] | 2 | 2 | 3 | 3 | 3 | 1.5 | 14.5
Zhu et al (2024) [76] | 3 | 2 | 3 | 3 | 3 | 1.5 | 15.5
Table 5. Quality appraisal for studies using the Côté and Turgeon grid [39].
Study | Introduction (max=2) | Methods (max=5) | Results (max=2) | Discussion (max=2) | Conclusion (max=1) | Total (max=12)
Stathakarou et al (2024) [30] | 2 | 5 | 2 | 2 | 1 | 12
Tretyak et al (2025) [73] | 2 | 4 | 2 | 2 | 1 | 11

Principal Findings

In this systematic review, we investigated the use of game elements in simulations for military trauma management. The results provide insights into how game elements are commonly applied and summarize the justifications for their use. However, no study explicitly linked individual game elements to specific learning or performance outcomes.

In health care education, gamification and games have been shown to be at least as effective as other educational approaches, and in many studies, more effective for improving knowledge, skills, and satisfaction [62]. Gamification has the potential to improve learning outcomes, especially when it uses game elements that improve learning behaviors and attitudes towards learning [63]. However, many of the studies included in previous reviews are of low quality, lack a sufficient focus on specific game elements [62], and have methodological limitations [5].

Although games and simulations have been used in military training for centuries [77], research on how individual game elements can inform the design of military trauma training simulations and influence learning outcomes is underexplored. The question of linking specific game elements to outcomes has been raised in gamification studies [8,78] but received little attention in military trauma contexts. Even if the link between game elements and learning outcomes were established, designers must account for the fact that individual elements can be implemented in multiple ways, leading to variation in results.

Clarification studies, which explore how underlying mechanisms account for the observed effects [79], are uncommon in gamified learning research [5] and are largely absent in military trauma contexts. Such studies could provide insights into the mechanisms involved in gamified learning, inform the design of gamified learning approaches [4,5], and contribute to the understanding of the relationship between design principles and learning outcomes, such as clinical reasoning [80].

As early as 1957, Caillois [81] described characteristics that distinguish games from other forms of activity, including fun, uncertainty, detachment from real life, nonproductivity, the presence of specific rules, and fictitious settings. Garris et al [82] proposed 6 dimensions that distinguish games from traditional simulations, including fantasy, mystery, sensory stimuli, rules and goals, controls, and challenge.

In this systematic review, narrative and sensation, as the most frequently used game elements in the reviewed simulations, contribute to the fictitious settings by fostering immersion and enabling environmental recreation. Narrative was frequently combined with sensation, the use of sensory stimuli, such as visual and auditory cues, to enhance immersion, often reflecting the austerity and constraints of military trauma contexts. In the application of digital technology, multimedia has been shown to enhance learning outcomes such as satisfaction, achievement, motivation, and attention [83]. In live simulation exercises, sensation was often conveyed by replicating battlefield conditions. This was achieved by conducting simulations outside the classroom or in rare environments, such as simulated deployed hospitals or ships, and by incorporating battlefield sounds and stress-inducing elements to recreate austerity [41,44,59].

Difficulty adaptation was identified in 16/42 included studies. The level of difficulty is a critical factor in learning; it should remain below the maximum capacity of learners’ knowledge to maintain motivation and facilitate effective learning. When difficulty exceeds learners’ capabilities, performance and engagement may decrease [84]. In the included studies, difficulty adaptation was implemented in various ways, for instance, in a classroom, with an instructor adjusting the level to the learners’ performance [40,62,63] or by allowing the learners to select different levels prior to the start of the game [11,45].

“Challenge” as a game element was identified in several studies. This differs from the level of difficulty, which corresponds to the level of knowledge and the skills the learners need to apply to perform the task, as challenge often involves dealing with complications and unforeseen circumstances [40,43,48,63]. Examples include the requirement to make difficult triage decisions (eg, prioritizing treatable patients over those unlikely to survive) [61], decision-making with conflicting information [40,59], distractors such as background noise and visual disturbances [44,45,72,73], or emotionally charged scenarios and unwinnable cases [13,30,58].

Simulations also frequently included imposed choice, which presented users with predefined options or decision menus. Additionally, scoring and time pressure were commonly used to provide feedback or assess learner performance and to simulate the urgency of trauma care, respectively. Scoring was included in a total of 24/42 studies. In several cases, it was based on learners' decision-making performance, providing them with performance-based feedback [11-13,30,41,45,51,52,60,61,63,65,70,76]. A recently published study [85] proposes that, to support serious gaming for disaster preparedness, learners should be given enough opportunities to fail and try again, albeit without direct feedback after failure. While repetitive training enables learners to refine their problem-solving strategies over time, a principle in line with Ericsson et al's [86] theory of deliberate practice, feedback combined with scoring may enable reflection in military medicine [30].

In total, 26/42 studies included time pressure, which was introduced both directly and indirectly within several simulations. For example, in some cases, time pressure was made explicit through the display of a ticking clock on the screen [45,51,54,57,64,75,76], whereas in others it was conveyed more implicitly by prompting learners to make rapid decisions under conditions where delays could negatively affect patient outcomes and performance scores.

While game elements such as narrative, sensation, and time pressure that simulate the contextual austerity of the military environment were often present, game elements commonly associated with the playful and entertaining side of gamification were underrepresented. For example, badges appeared only in the form of medals in Pasquier et al [11]. Games have a long history in military medical training, yet the emphasis has perhaps traditionally been placed on their potential to mimic an austere environment rather than on entertainment. It is possible that the playful dimensions of gamification were intentionally de-emphasized to maintain the seriousness of military trauma training. Alternatively, such elements may have been included in the simulations but not explicitly described in the study reports.

Competition was identified only in 2 studies [11,57], and in both cases, it was accompanied by scoring. Competition has been widely discussed in educational theory for its potential to enhance motivation and engagement. Malone and Lepper [87] emphasized competition as a mechanism that fosters intrinsic motivation by providing learners with clear goals and immediate feedback, often driving individuals to achieve better outcomes when compared with others. Van Eck and Dempsey [84] extended this idea, suggesting that competition may stimulate both intrinsic and extrinsic motivation: “For learners who are extrinsically motivated by social standing and recognition, competition against other individuals may serve to increase their efforts and perseverance in the instructional game to gain standing among their peers. Learners who are intrinsically motivated may likewise compete against their own score to see how much better they can do” [84]. In the same study, competition is linked to the concept of challenge.

Johnson and Johnson [88] broadened the concept by emphasizing collaborative competition, where individuals or teams compete in a way that supports group cohesion and shared learning objectives. This form of competition minimizes potential negative effects, such as stress or discouragement. Collaboration was discussed in several studies where training team skills for trauma care was the purpose of the intervention. In this review, however, we only included collaboration as a game element when it extended beyond the general goal of team training and was implemented in a distinct or designed manner. Examples included multi-avatar collaboration between human players within the same simulation environment [42,45], collaborative tasks designed to enhance understanding of different professional roles [45,53], and collaboration with virtual or artificial intelligence–driven team members and avatars [50,53,57,60,72].

Understanding the motivations behind the use of game elements is challenging, particularly in studies where design intentions are not made explicit and game elements are not formally acknowledged. In the studies where an explicit rationale was provided, linking game elements to justifications of their purpose resulted in the identification of 9 themes: realism and emotional engagement [11,30,40,41,43,45-47,51,53,57,62,63,73,75], adaptive learning and feedback [11,45,47,51,60,65], affective learning [12], learner agency [12], challenge the learners [54], risk-free experiential learning [54,69], motivation and engagement [11], situational awareness [69], and emotional regulation and overcoming anxiety [74].

When examining the stated purpose behind the use of game elements, we found that elements such as narrative and sensation were most frequently justified in terms of their contribution to realism and immersion. In contrast, most other game elements were either not discussed at all or not explicitly linked to any specific educational function. Interestingly, even in studies that mentioned motivational aims in their background sections, game elements were rarely linked to motivation or engagement. For instance, Achatz et al [40], who used a broad range of game elements, stated in their background: “The didactic approach and the course structure ensure the interest, motivation, and progress of the target group, namely experienced clinical decision makers.” Despite this statement, when we examined the use and justification of game elements in that study, time pressure was not presented as a motivational design choice but rather as a mechanism to simulate real-time conditions. We identified only one study [11] linking competition and scoring to motivation and engagement: “Furthermore, through the processes of scoring and gamification applied in 3D-SC1, the trainee is motivated to improve his personal experience. He also shares his scores with his peers in a competitive and engaging challenge”.

When examining the stated purpose of the simulation itself more broadly, rather than at the level of individual game elements, we found that in the few studies explicitly referring to gamified simulations or serious games, the primary goals were often to attract and sustain learners’ interest, and to enhance motivation and engagement. For instance, Planchon et al [12] emphasized the motivational and entertaining aspects of serious games, noting: “SGs are similar to video games in that they are engaging, rewarding, and fun. However, at the same time, they can also be used to educate or train.” In contrast, studies that did not explicitly use the terms game or gamification tended to incorporate game elements to enhance realism and immersion, rather than playfulness or entertainment. These elements were often integrated to replicate the conditions and constraints of real-world environments. For example, Beaven et al [44] noted: “Our aim was to deliver a highly realistic, immersive simulation training experience, teaching both technical and nontechnical skills necessary for the management of war injuries in the austere environment of a far forward surgical facility. Recreating the physical and psychological work environment in a realistic way was desirable in order to encourage people to behave as they would in real life. By fostering this real-life behavior, the participants are better able to imagine the authentic scenario, and training becomes more immersive as a result.” In such cases, game elements appeared to function to enhance realism and authenticity rather than as mechanisms to directly foster learner motivation or engagement. Other studies, such as Sotomayor [14], noted that games: “appeal to the younger generation that has been exposed to their use since early age. Motivation is a big factor observed within the training audience”. However, even in such cases, the justification remained general and did not extend to a discussion of how specific game elements serve defined educational purposes and support motivation.

Realism in simulation encompasses physical, conceptual, and emotional fidelity, each contributing to the authenticity of the learning experience [89]. Advances in artificial intelligence have further expanded the potential for realism by enabling more dynamic and immersive environments [30,90]. On the other hand, some research points out a cognitive bias towards highly realistic and technically advanced learning tools; this review included 2 studies [70,71] which noted that while multimedia special effects may appeal to avid gamers, they can also distract serious users and impair performance. Effective multimedia materials require careful attention not only to managing cognitive load, but also to ensuring that the chosen media formats support, rather than hinder, learning [91].

None of the included studies explicitly linked individual game elements to reported learning outcomes. When the outcomes of the simulations were considered more broadly, most studies described positive educational effects. This observation aligns with findings from a recent systematic review on gamification in disaster education [92], which concluded that gamification could enhance competencies such as emergency response, decision-making, and teamwork in disaster nursing education, and can support learning engagement through game elements such as cooperation, competition, scoring, and scenario-based activities. However, that review also did not analyze the direct relationship between specific game elements and particular learning outcomes.

Design Implications

The findings of this systematic review suggest that game elements should be treated as targeted design choices, selected to serve a clearly stated pedagogical intention, rather than added as generic “gamification.” A practical starting point is to define the pedagogical intention a simulation is meant to serve, and then to implement the corresponding game elements as a coherent configuration. For simulations aiming to strengthen realism and emotional engagement in austere trauma contexts, narrative and sensation could be used to create a realistic and immersive experience, intentionally reinforced through time pressure, imposed choice, and selected forms of challenge, randomness, avatar use, or collaboration to recreate the uncertainty, constraints, and team demands that characterize deployed care. These game elements were previously identified as contributing to realistic tactical experiences for civilian and military trauma care [30] and align with the game elements reported in Table 3.

When the intention is adaptive learning and feedback, limited evidence from this review suggests the use of game elements such as difficulty adaptation, scoring, and performance tables. In the studies included in this review, motivation and engagement were rarely justified at the game element level and, when they were, this was linked to competition and scoring [11]. However, when deciding on what game elements to use, educators and designers might want to consider that time pressure in simulation might not correspond to the time required in the field, and some of the elements, such as scoring and competition, might be perceived as misaligned with the learning goals [93]. Therefore, a good practice might be to present such elements as optional and configurable features [93].

Finally, because none of the included studies linked individual game elements to outcomes, educators, designers, and researchers are invited to consider the hypothesized mechanism associated with each chosen element and to align evaluation with those mechanisms. In practice, this means moving beyond whole-simulation outcome claims and using designs and measures that can test element-level contributions, thereby improving understanding of how specific elements shape decision-making and the learning experience.

Methodological Considerations and Limitations

This review was designed and conducted as a systematic literature review, following a published protocol [32] and transparent, reproducible procedures for searching, screening, extraction, synthesis, and quality appraisal [94]. The purpose was to retrieve international evidence of the impact of game elements in trauma management training and inform design practices and future research. Although the first research questions involved descriptive accounting of design features, which might be considered a hallmark of a scoping review [94,95], the subsequent thematic syntheses produced a structured and critically appraised summary of the use and educational justification for game elements in trauma management simulation reported in the literature. However, we acknowledge that the borderlines between different types of systematic reviews are often blurred in practice [94-96].

A striking finding was that none of the included studies correlated individual game elements with the reported outcomes. Instead, outcomes were generally attributed to the simulation as a whole. Consequently, evidence cannot be synthesized quantitatively on the level of specific game elements’ effectiveness. This should be interpreted as a finding about the state of the literature, rather than a limitation of the systematic review methodology.

When initiating this review, the original intention was to identify studies that explicitly addressed gamification in the context of military trauma training. Acknowledging that the literature on this topic may be sparse, the research questions were adapted to instead explore the use of game elements in the design of simulations in this context. This approach was based on the understanding that even in the absence of an explicitly acknowledged gamification strategy, game elements may still be purposefully or unintentionally embedded in the instructional design of simulations, potentially influencing the learning outcomes. As anticipated, direct comparisons between gamified and nongamified conditions or otherwise isolated reported effects of specific game elements were not observed.

We conducted a quality assessment of the included quantitative, qualitative, and mixed methods studies using 2 established tools: the MERSQI [38] and the Côté and Turgeon grid [39]. Design and development studies were not assessed using these instruments because they fall outside their intended scope. The appraisal was used descriptively to characterize methodological features and limitations across the included studies and to support interpretation of the evidence base, rather than to weight the thematic synthesis. This approach is consistent with review typologies describing how quality assessment may be used to mediate interpretation in heterogeneous evidence syntheses rather than determine theme inclusion or exclusion [96]. While some studies scored low on these metrics, this should not necessarily be interpreted as a reflection of poor study design. Rather, it may highlight an imperfect match between the quality appraisal tools and the purpose of some of the included studies.

The substantial heterogeneity across the included studies represents another limitation. The wide variation in study designs, simulation types, and learner populations poses significant challenges for synthesizing the data. Heterogeneity was examined descriptively by comparing study designs, simulation modalities, learner populations, and the implementation and stated purpose of game elements, rather than by analyzing heterogeneity of effect estimates. Additionally, although we collaborated with professional librarians to develop a comprehensive search strategy and conducted a reference and citation search, it is possible that we might have missed relevant studies since several of the game elements identified in this review were not part of the original search vocabulary.

Finally, because this review is based on secondary data, we were limited to what was reported by the study authors. In many cases, the motivations or intentions behind the use of game elements may have existed but were not documented. A more comprehensive understanding could have been achieved through interviews or direct engagement with simulation designers.

Conclusion

This is the first comprehensive synthesis of which game elements are applied in military trauma simulations and how, providing a structured evidence base for more intentional and theory-informed design of educational technologies used in high-stakes medical training. Unlike previous reviews, it explicitly focuses on the pedagogical purposes of these elements. It offers an overview of the prevalence of game elements in military trauma care education and synthesizes the pedagogical rationales for their use. While some elements, such as narrative, sensation, and time pressure, were often used in a way that mimics the austerity of the military trauma setting, game elements like badges and competition were underrepresented. Across the reviewed studies, game elements were typically not justified in terms of their pedagogical function. When justifications were provided, they were most often linked to environmental fidelity and immersion, followed by intentions to provide adaptive learning and feedback. None of the studies included in this review correlated individual game elements with the reported learning outcomes. These findings highlight the need for more intentional research on gamification design and transparent reporting, in which the educational purpose of each game element is clearly articulated. Future studies should treat gamification as a set of targeted design choices rather than as a single overarching strategy and further explore how its playful and motivational dimensions can be effectively leveraged in military trauma training to support motivation and learning.

Acknowledgments

The authors would like to acknowledge Narcisa Hannerz, Jonas Pettersson, and Anja Vikingson at Karolinska Institutet Biblioteket (university library) for their support in designing the search strategy, conducting the reference and citation searches, and retrieving the articles. The authors declare the use of generative artificial intelligence in the research and writing process. According to the GAIDeT (Generative AI Delegation Taxonomy) [97], the following tasks were assisted by generative artificial intelligence tools under full human supervision: proofreading and editing. The generative artificial intelligence tool used was GPT-4o. Responsibility for the final manuscript lies entirely with the authors. Generative artificial intelligence tools are not listed as authors and do not bear responsibility for the final outcomes.

Funding

This systematic review was financially supported by the Swedish Armed Forces. The funder was not involved in the study design, data collection, analysis, interpretation, or the writing of the manuscript.

Data Availability

The data sets generated or analyzed during this systematic review are available from the corresponding author on reasonable request.

Authors' Contributions

NS designed the systematic review with guidance from AAK and KK. NS conducted the drafting of the manuscript, which was reviewed by all coauthors, who provided suggestions and edits. Screening, data extraction, and synthesis were carried out by all authors, with each author working in pairs together with NS.

Conflicts of Interest

None declared.

Multimedia Appendix 1

PRISMA 2020 checklist.

DOCX File, 273 KB

Multimedia Appendix 2

PRISMA-S checklist.

DOCX File, 17 KB

Multimedia Appendix 3

PRISMA 2020 abstract checklist.

PDF File (Adobe PDF File), 122 KB

Multimedia Appendix 4

SWiM checklist.

DOCX File, 21 KB

Multimedia Appendix 5

Search strategy.

PDF File (Adobe PDF File), 279 KB

Multimedia Appendix 6

Game elements per study.

PDF File (Adobe PDF File), 103 KB

Multimedia Appendix 7

Definition of game elements.

PDF File (Adobe PDF File), 39 KB

Multimedia Appendix 8

Justification of game elements.

PDF File (Adobe PDF File), 94 KB

  1. Pfeiffer A, Bezzina S, König N, Kriglstein S. Beyond classical gamification: in- and around-game gamification for education. 2020. Presented at: ECEL 19th European Conference on e-Learning; October 29-30, 2020:28-30; Virtual (Originally Berlin, Germany).
  2. Gentry SV, Gauthier A, L'Estrade Ehrstrom B, Wortley D, Lilienthal A, Tudor Car L, et al. Serious gaming and gamification education in health professions: systematic review. J Med Internet Res. 2019;21(3):e12994. [FREE Full text] [CrossRef] [Medline]
  3. Maheu-Cadotte M, Cossette S, Dubé V, Fontaine G, Mailhot T, Lavoie P, et al. Effectiveness of serious games and impact of design elements on engagement and educational outcomes in healthcare professionals and students: a systematic review and meta-analysis protocol. BMJ Open. 2018;8(3):e019871. [FREE Full text] [CrossRef] [Medline]
  4. Krath J, Schürmann L, von Korflesch HF. Revealing the theoretical basis of gamification: a systematic review and analysis of theory in research on gamification, serious games and game-based learning. Comput Hum Behav. 2021;125:106963. [CrossRef]
  5. van Gaalen AEJ, Brouwer J, Schönrock-Adema J, Bouwkamp-Timmer T, Jaarsma ADC, Georgiadis JR. Gamification of health professions education: a systematic review. Adv Health Sci Educ Theory Pract. 2021;26(2):683-711. [FREE Full text] [CrossRef] [Medline]
  6. Lee CY, Lee C-H, Lai H-Y, Chen P-J, Chen M-M, Yau S-Y. Emerging trends in gamification for clinical reasoning education: a scoping review. BMC Med Educ. 2025;25(1):435. [FREE Full text] [CrossRef] [Medline]
  7. Bai S, Zeng H, Zhong Q, Shen Y, Cao L, He M. Application of gamification teaching in disaster education: scoping review. JMIR Serious Games. 2024;12:e64939. [FREE Full text] [CrossRef] [Medline]
  8. Mazarakis A, Bräuer P. Gamification is working, but which one exactly? results from an experiment with four game design elements. Int J Hum Comput Interact. 2022;39(3):612-627. [CrossRef]
  9. Qin J, Chui Y-P, Pang W-M, Choi K-S, Heng P-A. Learning blood management in orthopedic surgery through gameplay. IEEE Comput Graph Appl. 2010;30(2):45-57. [CrossRef] [Medline]
  10. Ricciardi F, De Paolis LT. A comprehensive review of serious games in health professions. Int J Comput Games Technol. 2014;2014:1-11. [CrossRef]
  11. Pasquier P, Mérat S, Malgras B, Petit L, Queran X, Bay C, et al. A serious game for massive training and assessment of French soldiers involved in forward combat casualty care (3D-SC1): development and deployment. JMIR Serious Games. May 18, 2016;4(1):e5. [FREE Full text] [CrossRef] [Medline]
  12. Planchon J, Vacher A, Comblet J, Rabatel E, Darses F, Mignon A, et al. Serious game training improves performance in combat life-saving interventions. Injury. 2018;49(1):86-92. [CrossRef] [Medline]
  13. DeFalco JA, Rowe JP, Paquette L, Georgoulas-Sherry V, Brawner K, Mott BW, et al. Detecting and addressing frustration in a serious game for military training. Int J Artif Intell Educ. 2017;28(2):152-193. [CrossRef]
  14. Sotomayor TM. Teaching tactical combat casualty care using the TC3 sim game-based simulation: a study to measure training effectiveness. Stud Health Technol Inform. 2010;154:176-179. [Medline]
  15. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27(1):10-28. [CrossRef] [Medline]
  16. Quick JA. Simulation training in trauma. Mo Med. 2018;115(5):447-450. [FREE Full text] [Medline]
  17. Stathakarou N, Sonesson L, Lundberg L, Boffard KD, Kononowicz AA, Karlgren K. Teams managing civilian and military complex trauma: what are the competencies required in austere environments and the potential of simulation technology to address them? Health Informatics J. 2021;27(4):14604582211052253. [FREE Full text] [CrossRef] [Medline]
  18. Caffery SJ, Ferrari BD, Hackett MG. Military medical simulations-scoping review. Mil Med. 2025;190(3-4):e554-e560. [CrossRef] [Medline]
  19. Laporta AJ, Hoang T, Moloff A. From trauma in austere environments to combat or medical school: how blended hyper-realism in the real and virtual worlds can better prepare surgeons. Stud Health Technol Inform. 2014;196:233-237. [CrossRef]
  20. Kubala AL, Warnick WL. A Review of Selected Literature on Stresses Affecting Soldiers in Combat. Alexandria (VA). US Army Research Institute for the Behavioral and Social Sciences; 1979.
  21. Gerhardt RT, Mabry RL, De Lorenzo RA, Butler FK. Fundamentals of Combat Casualty Care. Texas. Borden Institute, US Army Medical Department; 2012.
  22. Leitch RA, Moses GR, Magee H. Simulation and the future of military medicine. Mil Med. 2002;167(4):350-354. [FREE Full text] [CrossRef]
  23. Gruen RL, Jurkovich GJ, McIntyre LK, Foy HM, Maier RV. Patterns of errors contributing to trauma mortality: lessons learned from 2,594 deaths. Ann Surg. 2006;244(3):371-380. [CrossRef] [Medline]
  24. Khorram-Manesh A, Lönroth H, Rotter P, Wilhelmsson M, Aremyr J, Berner A, et al. Non-medical aspects of civilian-military collaboration in management of major incidents. Eur J Trauma Emerg Surg. 2017;43(5):595-603. [FREE Full text] [CrossRef] [Medline]
  25. Khorram-Manesh A. Facilitators and constrainers of civilian-military collaboration: the Swedish perspectives. Eur J Trauma Emerg Surg. 2020;46(3):649-656. [FREE Full text] [CrossRef] [Medline]
  26. Haverkamp FJC, van Leest TAJ, Muhrbeck M, Hoencamp R, Wladis A, Tan ECTH. Self-perceived preparedness and training needs of healthcare personnel on humanitarian mission: a pre- and post-deployment survey. World J Emerg Surg. 2022;17(1):14. [FREE Full text] [CrossRef] [Medline]
  27. Givens M, Holcomb JB. Red line the red line: optimizing emergency medicine physicians and surgeons collaborative roles on trauma teams. J Trauma Acute Care Surg. 2024;97(2S Suppl 1):S27-S30. [CrossRef] [Medline]
  28. Anagnostou E, Michas A, Giannou C. Practicing military medicine in truly austere environments: what to expect, how to prepare, when to improvise. Mil Med. 2020;185(5-6):e656-e661. [CrossRef] [Medline]
  29. Ellaway RH. A conceptual framework of game-informed principles for health professions education. Adv Simul (Lond). 2016;1:28. [FREE Full text] [CrossRef] [Medline]
  30. Stathakarou N, Kononowicz AA, Mattsson E, Karlgren K. Gamification in the design of virtual patients for Swedish military medics to support trauma training: interaction analysis and semistructured interview study. JMIR Serious Games. 2024;12:e63390. [FREE Full text] [CrossRef] [Medline]
  31. Toda AM, Klock ACT, Oliveira W, Palomino PT, Rodrigues L, Shi L, et al. Analysing gamification elements in educational environments using an existing gamification taxonomy. Smart Learn Environ. 2019;6(1):1-17. [CrossRef]
  32. Stathakarou N, Kononowicz AA, Swain C, Karlgren K. Game elements in the design of simulations in military trauma management training: protocol for a systematic review. JMIR Res Protoc. 2023;12:e45969. [FREE Full text] [CrossRef] [Medline]
  33. Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372:n71. [FREE Full text] [CrossRef] [Medline]
  34. Bramer WM, Giustini D, de Jonge GB, Holland L, Bekhuis T. De-duplication of database search results for systematic reviews in EndNote. J Med Libr Assoc. Jul 2016;104(3):240-243. [FREE Full text] [CrossRef] [Medline]
  35. Ouzzani M, Hammady H, Fedorowicz Z, Elmagarmid A. Rayyan-a web and mobile app for systematic reviews. Syst Rev. 2016;5(1):210. [FREE Full text] [CrossRef] [Medline]
  36. Thomas J, Harden A. Methods for the thematic synthesis of qualitative research in systematic reviews. BMC Med Res Methodol. 2008;8:45. [FREE Full text] [CrossRef] [Medline]
  37. Kononowicz A, Zary N, Edelbring S, Corral J, Hege I. Virtual patients--what are we talking about? a framework to classify the meanings of the term in healthcare education. BMC Med Educ. 2015;15:11. [FREE Full text] [CrossRef] [Medline]
  38. Cook DA, Reed DA. Appraising the quality of medical education research methods: the Medical Education Research Study Quality Instrument and the Newcastle-Ottawa Scale-Education. Acad Med. 2015;90(8):1067-1076. [CrossRef] [Medline]
  39. Côté L, Turgeon J. Appraising qualitative research articles in medicine and medical education. Med Teach. 2005;27(1):71-75. [CrossRef] [Medline]
  40. Achatz G, Friemert B, Trentzsch H, Hofmann M, Blätzinger M, Hinz-Bauer S, et al. Deployment, Disaster, Tactical Surgery Working Group of the German Trauma Society. Terror and disaster surgical care: training experienced trauma surgeons in decision making for a MASCAL situation with a tabletop simulation game. Eur J Trauma Emerg Surg. 2020;46(4):717-724. [CrossRef] [Medline]
  41. Arora S, Cox C, Davies S, Kassab E, Mahoney P, Sharma E, et al. Towards the next frontier for simulation-based training: full-hospital simulation across the entire patient pathway. Ann Surg. 2014;260(2):252-258. [CrossRef] [Medline]
  42. Badler NI, Clarke JR, Hollick M, Kokkevis E, Metaxas D, Bindiwanavale R. MediSim: simulated medical corpsmen and casualties for medical forces planning and training. 1995. Presented at: Proceedings of the National Forum: Military Telemedicine On-Line Today Research, Practice, and Opportunities; 1995 March 27-29:21-28; McLean, VA, USA. [CrossRef]
  43. Baird A, Serio-Melvin M, Hackett M, Clover M, McDaniel M, Rowland M, et al. BurnCare tablet trainer to enhance burn injury care and treatment. BMC Emerg Med. 2020;20(1):84. [FREE Full text] [CrossRef] [Medline]
  44. Beaven A, Griffin D, James H. Highly realistic cadaveric trauma simulation of the multiply injured battlefield casualty: an international, multidisciplinary exercise in far-forward surgical management. Injury. 2021;52(5):1183-1189. [CrossRef] [Medline]
  45. Brown R, McIlwain S, Willson B, Hackett M. Enhancing combat medic training with 3D virtual environments. IEEE; 2016. Presented at: IEEE International Conference on Serious Games and Applications for Health (SeGAH); 2016 May 11-13:1-7; Orlando, FL, USA. [CrossRef]
  46. Chi DM, Clarke JR, Webber BL, Badler NI. Casualty modeling for real-time medical training. Presence (Camb). 1996;5(4):359-366. [CrossRef] [Medline]
  47. Chi DM, Kokkevis E, Ogunyemi O, Bindiganavale R, Hollick MJ, Clarke JR. Simulated casualties and medics for emergency training. ScholarlyCommons. 1997. URL: https://repository.upenn.edu/hms/25 [accessed 2026-02-19]
  48. Cohen D, Sevdalis N, Taylor D, Kerr K, Heys M, Willett K, et al. Emergency preparedness in the 21st century: training and preparation modules in virtual environments. Resuscitation. 2013;84(1):78-84. [CrossRef] [Medline]
  49. Couperus K. Virtual reality trauma simulation: an immersive method to enhance medical personnel training and readiness. Ann Emerg Med. 2019;74(4):S30. [CrossRef]
  50. Couperus K, Young S, Walsh R, Kang C, Skinner C, Essendrop R, et al. Immersive virtual reality medical simulation: autonomous trauma training simulator. Cureus. 2020;12(5):e8062. [FREE Full text] [CrossRef] [Medline]
  51. de Lesquen H, de LVB, Cotte J, Avaro JP. Immersive virtual reality to learn triage: a serious game to be prepared to prehospital massive casualty management. JMIR Preprints. [CrossRef]
  52. de Lesquen H, Paris R, Fournier M, Cotte J, Vacher A, Schlienger D, et al. Toward a serious game to help future military doctors face mass casualty incidents. J Spec Oper Med. 2023;23(2):88-93. [CrossRef] [Medline]
  53. Du W, Zhong X, Jia Y, Jiang R, Yang H, Ye Z, et al. A novel scenario-based, mixed-reality platform for training nontechnical skills of battlefield first aid: prospective interventional study. JMIR Serious Games. 2022;10(4):e40727. [FREE Full text] [CrossRef] [Medline]
  54. Freeman KM, Thompson SF, Allely EB, Sobel AL, Stansfield SA, Pugh WM. A virtual reality patient simulation system for teaching emergency response skills to U.S. Navy medical providers. Prehosp Disaster Med. 2001;16(1):3-8. [CrossRef] [Medline]
  55. Goolsby C, Vest R, Goodwin T. New wide area virtual environment (WAVE) medical education. Mil Med. 2014;179(1):38-41. [CrossRef] [Medline]
  56. Hemman EA. Improving combat medic learning using a personal computer-based virtual training simulator. Mil Med. 2005;170(9):723-727. [CrossRef] [Medline]
  57. Henderson JV, Pruett RK, Galper AR, Copes WS. Interactive videodisc to teach combat trauma life support. J Med Syst. 1986;10(3):271-276. [CrossRef] [Medline]
  58. Henderson N, Rowe J, Paquette L, Baker R, Lester J. Improving affect detection in game-based learning with multimodal data fusion. Artif Intell Educ. 2020;12163:228-239. [CrossRef]
  59. Kyle RR, Via DK, Lowy RJ, Madsen JM, Marty AM, Mongan PD. A multidisciplinary approach to teach responses to weapons of mass destruction and terrorism using combined simulation modalities. J Clin Anesth. 2004;16(2):152-158. [CrossRef] [Medline]
  60. Lombardo R, Walther N, Young S, Gorbatkin C, Sletten Z, Kang C, et al. Ready medic one: a feasibility study of a semi-autonomous virtual reality trauma simulator. Front Virtual Real. 2022;2. [CrossRef]
  61. Lu J, Leng A, Zhou Y, Zhou W, Luo J, Chen X, et al. An innovative virtual reality training tool for the pre-hospital treatment of cranialmaxillofacial trauma. Comput Assist Surg (Abingdon). 2023;28(1):2189047. [FREE Full text] [CrossRef] [Medline]
  62. Lennquist Montán K, Hreckovski B, Dobson B, Örtenwall P, Montán C, Khorram-Manesh A, et al. Development and evaluation of a new simulation model for interactive training of the medical response to major incidents and disasters. Eur J Trauma Emerg Surg. 2014;40(4):429-443. [CrossRef] [Medline]
  63. Netzer I, Weiss A, Hoppenstein D. Naval casualty management training using human patient simulators. Disaster Mil Med. 2015;1:9. [FREE Full text] [CrossRef] [Medline]
  64. Qin L, You Z, Liu B, Luo C. Development of virtual reality training system for combat musculoskeletal trauma care. Simulation. 2024;101(1):3-11. [CrossRef]
  65. Rabotin A, Glick Y, Gelman R, Ketko I, Taran B, Fink N, et al. Practicing emergency medicine in the metaverse: a novel mixed reality casualty care training platform. Surg Innov. 2023;30(5):586-594. [CrossRef] [Medline]
  66. Satava RM, Jones SB. An integrated medical virtual reality program. The military application. IEEE Eng Med Biol Mag. 1996;15(2):94-97, 104. [CrossRef]
  67. Sonesson L, Boffard KD, Örtenwall P, Vekzsler P. Determining the educational impact of virtual patients on trauma team training during a multinational, large-scale civil military simulation exercise. J Trauma Acute Care Surg. 2023;95(2S Suppl 1):S99-S105. [FREE Full text] [CrossRef] [Medline]
  68. Sotomayor T. Evaluating Tactical Combat Casualty Care Training Treatments Effects on Combat Medic Trainees in Light of Select Human Descripti [Dissertation]. Orlando (FL). University of Central Florida; 2008.
  69. Stansfield S, Shawver D, Sobel A. MediSim: a prototype VR system for training medical first responders. IEEE; 1998. Presented at: Proceedings of IEEE 1998 Virtual Reality Annual International Symposium (Cat. No.98CB36180); 1998 March 14-18:198-205; Atlanta, GA, USA. [CrossRef]
  70. Stone RJ. Serious gaming - virtual reality’s saviour? 2005. Presented at: Proceedings of Virtual Systems and Multi-Media (VSMM) Conference; 2005 October 3-7:773-786; Ghent, Belgium.
  71. Stone RJ. The (human) science of medical virtual learning environments. Philos Trans R Soc Lond B Biol Sci. 2011;366(1562):276-285. [FREE Full text] [CrossRef] [Medline]
  72. Stone RJ, Guest R, Mahoney P, Lamb D, Gibson C. A 'mixed reality' simulator concept for future medical emergency response team training. J R Army Med Corps. 2017;163(4):280-287. [FREE Full text] [CrossRef] [Medline]
  73. Tretyak VE. TacMedVR: immersive VR training for tactical medicine—evaluating interaction and stress response. IEEE; 2025. Presented at: 11th International Conference on Virtual Reality (ICVR); 2025 July 09-11:345-350; Wageningen, Netherlands. [CrossRef]
  74. Wier GS, Tree R, Nusr R. Training effectiveness of a wide area virtual environment in medical simulation. Simul Healthc. 2017;12(1):28-40. [CrossRef] [Medline]
  75. Willy C, Sterk J, Schwarz W, Gerngross H. Computer-assisted training program for simulation of triage, resuscitation, and evacuation of casualties. Mil Med. 1998;163(4):234-238. [CrossRef]
  76. Zhu S, Li Z, Sun Y, Kong L, Yin M, Yong Q, et al. A serious game for enhancing rescue reasoning skills in tactical combat casualty care: development and deployment study. JMIR Form Res. 2024;8:e50817. [FREE Full text] [CrossRef] [Medline]
  77. Smith R. The long history of gaming in military training. Simul Gaming. 2009;41(1):6-19. [CrossRef]
  78. Wilson KA, Bedwell WL, Lazzara EH, Salas E, Burke CS, Estock JL, et al. Relationships between game attributes and learning outcomes. Simul Gaming. 2008;40(2):217-266. [CrossRef]
  79. Cook DA, Bordage G, Schmidt HG. Description, justification and clarification: a framework for classifying the purposes of research in medical education. Med Educ. 2008;42(2):128-133. [CrossRef] [Medline]
  80. Maheu-Cadotte M, Cossette S, Dubé V, Fontaine G, Deschênes M, Lapierre A, et al. Differentiating the design principles of virtual simulations and serious games to enhance nurses' clinical reasoning. Clin Simul Nurs. 2020;49:19-23. [CrossRef]
  81. Caillois R. Man, play and games. Urbana (IL). University of Illinois Press; 2001.
  82. Garris R, Ahlers R, Driskell JE. Games, motivation, and learning: a research and practice model. Simul Gaming. 2002;33(4):441-467. [CrossRef]
  83. Mayer RE, Dow GT, Mayer S. Multimedia learning in an interactive self-explaining environment: what works in the design of agent-based microworlds? J Educ Psychol. 2003;95(4):806-812. [CrossRef]
  84. Van Eck R, Dempsey J. The effect of competition and contextualized advisement on the transfer of mathematics skills a computer-based instructional simulation game. Educ Technol Res Dev. 2002;50(3):23-41. [CrossRef]
  85. Zhou T, Loh CS. The effects of fully and partially in-game guidance on players' declarative and procedural knowledge with a disaster preparedness serious game. In: Research Anthology on Game Design, Development, Usage, and Social Impact. New York. IGI Global; 2023:1818-1834.
  86. Ericsson KA, Krampe RT, Tesch-Römer C. The role of deliberate practice in the acquisition of expert performance. Psychol Rev. 1993;100(3):363-406. [CrossRef]
  87. Malone TW, Lepper MR. Making learning fun: a taxonomy of intrinsic motivations for learning. In: Snow RE, Farr MJ, editors. Aptitude, Learning, and Instruction. Vol 3: Conative and Affective Process Analyses. Hillsdale (NJ). Lawrence Erlbaum Associates; 1987:223-253.
  88. Johnson DW, Johnson RT. Learning Together and Alone: Cooperative, Competitive, and Individualistic Learning. Boston (MA). Allyn & Bacon; 1994.
  89. Choi W, Dyens O, Chan T, Schijven M, Lajoie S, Mancini ME, et al. Engagement and learning in simulation: recommendations of the simnovate engaged learning domain group. BMJ Simul Technol Enhanc Learn. 2017;3(Suppl 1):S23-S32. [CrossRef]
  90. Gray M, Baird A, Sawyer T, James J, DeBroux T, Bartlett M, et al. Increasing realism and variety of virtual patient dialogues for prenatal counseling education through a novel application of CHATGPT: exploratory observational study. JMIR Med Educ. 2024;10:e50705. [FREE Full text] [CrossRef] [Medline]
  91. Chang CC, Liang C, Chou PN, Lin GY. Is game-based learning better in flow experience and various types of cognitive load than non-game-based learning? perspective from multimedia and media richness. Comput Hum Behav. 2017;71:218-227. [CrossRef]
  92. Bai S, Zeng H, Zhong Q, Cao L, He M. Effectiveness of gamified teaching in disaster nursing education for health care workers: systematic review. J Med Internet Res. 2025;27:e74955. [FREE Full text] [CrossRef] [Medline]
  93. Stathakarou N, Kononowicz AA, Harjani M, Reshetukha D, Mattsson E, Karlgren K. Exploring the potential of gamified virtual patients for military trauma care training: a systematic text condensation analysis. Injury. 2026:113020. [FREE Full text] [CrossRef] [Medline]
  94. Smith SA, Duncan AA. Systematic and scoping reviews: a comparison and overview. Semin Vasc Surg. 2022;35(4):464-469. [CrossRef] [Medline]
  95. Munn Z, Peters MDJ, Stern C, Tufanaru C, McArthur A, Aromataris E. Systematic review or scoping review? guidance for authors when choosing between a systematic or scoping review approach. BMC Med Res Methodol. 2018;18(1):143. [FREE Full text] [CrossRef] [Medline]
  96. Grant MJ, Booth A. A typology of reviews: an analysis of 14 review types and associated methodologies. Health Info Libr J. 2009;26(2):91-108. [FREE Full text] [CrossRef] [Medline]
  97. Suchikova Y, Tsybuliak N, Teixeira da Silva JA, Nazarovets S. GAIDeT (generative AI delegation taxonomy): a taxonomy for humans to delegate tasks to generative artificial intelligence in scientific research and publishing. Account Res. 2025:1-27. [CrossRef] [Medline]


MERSQI: Medical Education Research Study Quality Instrument
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
SWiM: Synthesis Without Meta-analysis


Edited by S Brini; submitted 16.Jun.2025; peer-reviewed by TT-O Kwok, L Zhu; comments to author 06.Oct.2025; accepted 14.Feb.2026; published 17.Mar.2026.

Copyright

©Natalia Stathakarou, Andrzej A Kononowicz, Maxine G Harjani, Cara Swain, Klas Karlgren. Originally published in JMIR Serious Games (https://games.jmir.org), 17.Mar.2026.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Serious Games, is properly cited. The complete bibliographic information, a link to the original publication on https://games.jmir.org, as well as this copyright and license information must be included.