Original Paper
Abstract
Background: The high cost and low availability of virtual reality simulators in surgical specialty training programs in low- and middle-income countries make it necessary to develop and obtain sources of validity for new models of low-cost portable simulators that enable ubiquitous learning of psychomotor skills in minimally invasive surgery.
Objective: The aim of this study was to obtain validity evidence for relationships to other variables, internal structure, and consequences of testing for the task scores of a new low-cost portable simulator mediated by gestures for learning basic psychomotor skills in minimally invasive surgery. This new simulator is called SIMISGEST-VR (Simulator of Minimally Invasive Surgery mediated by Gestures - Virtual Reality).
Methods: In this prospective observational validity study, the authors looked for multiple sources of evidence (known group construct validity, prior videogaming experience, internal structure, test-retest reliability, and consequences of testing) for the proposed SIMISGEST-VR tasks. Undergraduate students (n=100, reference group), surgical residents (n=20), and experts in minimally invasive surgery (n=28) took part in the study. After answering a demographic questionnaire and watching a video of the tasks to be performed, they individually repeated each task 10 times with each hand. The simulator provided concurrent, immediate, and terminal feedback and obtained the task metrics (time and score). From the reference group, 29 undergraduate students were randomly selected to perform the tasks 6 months later in order to determine test-retest reliability.
Results: Multiple sources of evidence support the validity of the scores and metrics used to train and teach basic psychomotor skills for minimally invasive surgery on a new low-cost portable simulator that uses gesture-mediated interaction technology: strong intrarater reliability and internal consistency, considerable evidence for the hypothesized consequences of testing, and partial confirmation of relations to other variables.
Conclusions: The results obtained provided multiple sources of evidence to validate SIMISGEST-VR tasks aimed at training novices with no prior experience and enabling them to learn basic psychomotor skills for minimally invasive surgery.
doi:10.2196/19723
Introduction
Background
The advent of minimally invasive surgery in the mid-1980s [ ] led to an increase in the number of iatrogenic bile duct injuries when many surgeons worldwide switched from open surgery to minimally invasive surgery without any prior training [ - ]. As a result, simulation has become a valuable tool for learning motor skills for minimally invasive surgery, and many studies have demonstrated both its usefulness and the fact that the skills learned can be transferred to the operating theatre [ - ].

The first virtual reality (VR) simulator for minimally invasive surgery training was MIST-VR (Minimally Invasive Surgery Training - Virtual Reality) [ ]. Evidence for construct validity was established in 1998 [ ], evidence for predictive validity was obtained in 2002 [ , ], and evidence for concurrent validity was subsequently demonstrated as well [ - ].

Recent years have seen the development of low-cost gesture-based touchless devices that can interact with 3D virtual environments, among them Kinect (Microsoft Corp), the Leap Motion Controller (Leap Motion Inc), and the Myo armband (Thalmic Labs) [ ].

The Leap Motion Controller was launched in May 2012. It is based on the principle of infrared optical tracking, which detects the positions of fine objects such as fingertips or pen tips in a Cartesian coordinate system; its interaction zone is an inverted cone of approximately 0.23 m³, and it has a motion detection range between 20 mm and 600 mm [ , ] (a minimal geometric sketch of this interaction zone is given at the end of this subsection). The device measures 76 mm × 30 mm × 13 mm and weighs 45 g. It has 3 infrared emitters and 2 infrared cameras that capture the movements generated within the interaction zone [ , ]. The manufacturer reports an accuracy of 0.01 mm for fingertip detection, although one independent study measured an accuracy of 0.7 mm [ ]. Although the Leap Motion Controller is designed mainly to detect hand motions, it can also track objects such as pencils and laparoscopic surgical forceps [ - ].

The Leap Motion Controller has been used for the manipulation of medical images in interventional radiology and image-guided surgery, in settings where there is a risk of contamination through contact (autopsy rooms, for example), for touchless control (of operating theatre lights and tables), and for simulation (minimally invasive surgery and robotic surgery). Various authors have used the Leap Motion Controller to develop simulators that track hand or instrument movements [ , - ]. A paper by Lahanas et al [ ] describes using the Leap Motion Controller to simulate 3 tasks: camera navigation, instrument navigation, and bimanual operation; 28 expert surgeons and 21 reference individuals took part in the study. The experts significantly outperformed the novices in all assessment metrics for instrument navigation and bimanual operation.

Simulators for learning skills for minimally invasive surgery can be classified into 3 types: traditional box trainers, augmented reality (hybrid) simulators, and VR simulators [ , ]. Simulation has become a valuable tool for learning basic motor skills in surgery, but access to simulators remains problematic, especially in low- and middle-income countries, because of their high cost. This makes it necessary to develop and validate the metrics and scores of low-cost portable simulators [ - ].

The aim of this study was to evaluate a simulation instrument, SIMISGEST-VR (Simulator of Minimally Invasive Surgery mediated by Gestures - Virtual Reality), and to document the sources of validity evidence for its task scores: relations to other variables, internal structure, consequences of testing, and response process.
Hypotheses
To that end, 3 hypotheses were formulated:
Hypothesis 1: Validity Evidence for Relations to Other Variables
The first hypothesis aims to demonstrate that the test scores discriminate between a reference group (no prior experience), surgical residents (less experienced), and surgeons (more experienced), showing that the experts already possess the basic psychomotor skills being measured. It also posits that videogaming experience is correlated with better performance in the simulator tasks, regardless of the level of training and experience.
Hypothesis 2: Evidence for Internal Structure
The intrarater test-retest hypothesis assumes that, if a reference individual is not exposed to simulators in the period between the 2 complete simulator exercises, there will be no significant differences in performance between the first and second exercises.
Hypothesis 3: Evidence for Consequences of Testing
Regarding evidence for consequences of testing, learning in the reference group will be demonstrated by improvements in the metrics and the final score when comparing the first and tenth attempts in each task.
Methods
Study Design
This was a prospective observational validity study. The current unified standard considers that all validity is construct validity and, as such, requires evidence from 5 sources [ - ].

Content evidence includes a description of the steps taken to ensure that test content reflects, in a relevant way, the construct or characteristic being measured. The results obtained from the survey assessing fidelity to the criterion and content-related validity evidence for SIMISGEST-VR showed that all 30 participants felt that most aspects of the simulator were adequately realistic and that it could be used as a tool for teaching basic psychomotor skills in laparoscopic surgery (Likert score: range 4.07-4.73). The sources of content-related validity evidence showed that our simulator was a reliable training tool and that the tasks enabled learning of the basic psychomotor skills required in minimally invasive surgery (Likert score: range 4.28-4.67) [ ].

Evidence for relations to other variables refers to the statistical association between the test scores and other characteristics or external measures that have theoretical relations to them, such as level of training, level of experience, prior videogaming experience, and scores on other, already validated instruments. One of the most common correlations is known group construct validity (ie, the correlation between performance scores and level of training and experience) [ ]. Relations may be positive (convergent or predictive) or negative (divergent or discriminant), depending on the constructs being measured [ ]. This study explored the relations between performance scores and level of training, experience, and prior videogaming experience.

Evidence for internal structure includes data that evaluate the relations between the individual items of the assessment and how they correlate with the construct. It includes measures of reliability, reproducibility, and factor analysis. Reliability is a necessary but insufficient condition for validity [ ]. Intrarater reliability was obtained using the test-retest method, which assesses the stability of responses over time [ ]. Test-retest reliability was explored through blinded rerating after an interval of 6 months in the reference group. The randomly selected participants were asked whether they had had additional experience of using simulators during that period [ ]; the answer was "no" in all cases. The data produced by this second test were not taken into account in the evidence for the construct validity study. Worster and Haines [ ] noted that there is no published recommendation for the proportion of data that should be checked, but that 10% is common; in this study, 29% of the reference individuals were included in the test-retest study. The demonstration of reliability is mandatory before an evaluation can be shown to be valid [ ].

Evidence for consequences refers to the impact, benefit, or harm of the assessment itself and the resulting decisions and actions. Yet simply demonstrating consequences, even significant and impressive ones, does not constitute validity evidence unless investigators explicitly demonstrate that these consequences have an impact on score interpretation (validity) [ , ]. Evidence for consequences falls within a spectrum ranging from high-stakes examinations, such as licensing examinations, to low-stakes examinations, such as self-assessment used for formative feedback alone [ ]. In our case, we hoped to obtain evidence demonstrating that the reference group had achieved the learning curve.

Evidence for response process includes theoretical and empirical analyses evaluating the extent to which the assessors' and respondents' responses are aligned with the construct. It includes an evaluation of safety, of quality control, and of the actors' thoughts and actions during the assessment. The response process also includes the accuracy of data collection and entry into the database [ ]. This type of evidence can be difficult to demonstrate because the data are often qualitative [ ].

Participants and Simulator Test Methodology
Participating in this study were expert surgeons in minimally invasive surgery (n=28) from a range of surgical specialties, each of whom had performed more than 100 procedures; surgical residents (n=20) in a range of surgical specialties from the University of Caldas (in Manizales, Colombia), each of whom had performed fewer than 50 procedures (basic training: n=15; advanced training: n=5); and medical undergraduate students (n=100) from the University of Caldas and the University of Manizales who had no experience performing minimally invasive surgical procedures. The expert surgeons worked in the following specialties: general surgery, 8 (28.5%); pediatric surgery, 5 (17.8%); neurosurgery, 4 (14.2%); colorectal surgery, 3 (10.7%); orthopedic surgery, 3 (10.7%); gynecological surgery, 2 (7.1%); urological surgery, 1 (3.5%); thoracic surgery, 1 (3.5%); and vascular surgery, 1 (3.5%).
All participants completed a questionnaire providing demographic data (see the demographic questionnaire form in the supplementary material) and information about their dominant hand, level of training, level of minimally invasive surgery skills, prior training with simulators, and experience with videogaming or VR devices.

After the instructor had given basic instructions about using the simulator and had shown a video of each task to be performed, the study participants performed 10 repetitions of tasks 1, 2, 4, 5, and 6 with each hand. Task 3 was repeated 10 times in total because both hands were considered dominant. The instructor did not give additional feedback, but the simulator provided concurrent feedback (visual and auditory feedback while performing each task), immediate feedback (the results in terms of time, accuracy, and errors, displayed at the end of each task), and terminal feedback (a performance curve and the final score). The participants were able to rewatch the demonstration videos at any time. For the test-retest reliability study, 29 participants were randomly selected from the reference group. They repeated the entire exercise 6 months after the first one; none were exposed to any type of simulator during that period. One of the authors (FAL) supervised and photographically documented each exercise.
SIMISGEST-VR
SIMISGEST-VR was developed using design-based research [ - ]. A previously published article [ ] describes in detail the development of the device and a study assessing fidelity to the criterion and content-related validity evidence.

Virtual Environment
The virtual environment consisted of the following modules: registration to collect users’ demographic data and a tutorial to show demonstration videos of the tasks to be performed.
SIMISGEST-VR supports 6 tasks, each of which corresponds to a surgical equivalent (see the table below). The tasks were adapted from MIST-VR (Mentice Inc) [ , , ]. MIST-VR is the simulator on which the highest number of validation studies have been conducted, and these studies have demonstrated, on multiple occasions, that the skills learned on it can be transferred to the operating theatre [ , , - ].

Except for task 3, all tasks offered the option of configuring the dominant hand during the exercise; task 3 required the simultaneous use of both hands, and therefore both played a dominant role. Given its level of difficulty, this task was performed last in all cases. The online virtual environment ran on Windows (Microsoft Corp) and macOS (Apple Inc) platforms. A minimal data-structure sketch of this task configuration follows the table below.
| Task number | Task name | Description | Surgical equivalent | Learning objective |
| --- | --- | --- | --- | --- |
| Task 1 | Grip and placement | Take the sphere with one hand and move it to a new location within the workspace | Gripping and retraction of tissue to a given position, placement of clips and hemostasis, use of extractor bags | Visual-spatial perception and eye-hand coordination |
| Task 2 | Transfer and placement of an object | Take the sphere, transfer it to another instrument, and place it inside a hollow cylinder | Transfer of a needle between a clamp and a needle holder | Visual-spatial perception, eye-hand coordination, and use of both hands in a complementary manner |
| Task 3 | Cross | Instruments travel along a surface in a 3D cylinder | Small intestine exploration | Coordinated use of both the dominant and nondominant hands and ambidexterity |
| Task 4 | Removal and reinsertion of instruments | Removal of instruments from the operative site and reinsertion | One instrument stabilizes one organ while the other is removed from the field and reintroduced | Visual-spatial perception, use of both hands in a complementary manner, and depth perception |
| Task 5 | Diathermy | Cauterize a series of targets located in a fixed sphere | Cauterize a bleeding blood vessel | Visual-spatial perception, time of diathermy, and accuracy of movements |
| Task 6 | Target manipulation and diathermy | Take the sphere with the instrument, place it inside a virtual space represented by a cube, and cauterize a series of targets with the other hand | Present and set a target to cauterize | Visual-spatial perception, time of diathermy, and accuracy of movements |
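As referenced above, the following is a minimal sketch, in Python, of how the task repertoire and its dominant-hand configuration could be represented; the class and field names are illustrative assumptions, not the simulator's actual internals.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TaskConfig:
    """Illustrative task descriptor; the names are assumptions,
    not the actual SIMISGEST-VR internals."""
    number: int
    name: str
    bimanual: bool           # task 3 uses both hands simultaneously
    configurable_hand: bool  # all tasks except task 3
    repetitions_per_hand: int

TASKS = [
    TaskConfig(1, "Grip and placement", False, True, 10),
    TaskConfig(2, "Transfer and placement of an object", False, True, 10),
    TaskConfig(3, "Cross", True, False, 10),  # 10 repetitions in total
    TaskConfig(4, "Removal and reinsertion of instruments", False, True, 10),
    TaskConfig(5, "Diathermy", False, True, 10),
    TaskConfig(6, "Target manipulation and diathermy", False, True, 10),
]

# Task 3 is performed last in all cases because of its difficulty;
# a stable sort on the bimanual flag preserves the order of the rest.
session_order = sorted(TASKS, key=lambda t: t.bimanual)
```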
Metrics
The metrics were established using 5 parameters: time (velocity), efficiency of movement for the right and left hands [ , ], economy of diathermy, error and accuracy (penalty) [ , ], and the final score.
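The exact scoring formula was defined in a previously published article [ ]; the sketch below, in Python, only illustrates how such parameters could be combined into a final score, with the weights, the normalization, and the aggregation rule all assumed rather than taken from SIMISGEST-VR.

```python
def final_score(time_s, path_len_mm, ideal_path_mm, errors,
                excess_diathermy_s, max_time_s=120.0):
    """Illustrative composite score in [0, 100]; the weights and the
    aggregation are assumptions, not the published SIMISGEST-VR formula."""
    speed = max(0.0, 1.0 - time_s / max_time_s)          # time (velocity)
    economy = min(1.0, ideal_path_mm / path_len_mm)      # efficiency of movement
    penalty = 0.05 * errors + 0.02 * excess_diathermy_s  # error/accuracy penalty
    raw = 0.5 * speed + 0.5 * economy - penalty
    return round(100.0 * max(0.0, min(1.0, raw)), 1)

# Example: a 45-second trial with moderate path efficiency and 1 error.
print(final_score(45.0, 950.0, 700.0, errors=1, excess_diathermy_s=0.0))
```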
Feedback

Feedback is essential [ ]. Training on a simulator should have 3 purposes: to improve performance, to make performance consistent, and to reduce the number of errors [ ]. Haptic sensation and concurrent feedback were simulated using sound signals, color changes in the objects, and movement of the object whenever an undue collision occurred between the different components of the environment or an error occurred during the task. For SIMISGEST-VR, we adopted 3 types of feedback: concurrent feedback, provided while the task is being performed; immediate feedback, provided at the end of each task, when the system reports the presence or absence of errors, efficiency, and the time taken; and terminal feedback, provided at the end of each training session, when the system displays a series of graphs and tables that show performance over time [ - ]. The data generated by the program were stored on an SQL (structured query language) database engine integrated into the simulation software.
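As a concrete illustration of this embedded storage, here is a minimal sketch using Python's built-in sqlite3 module; the table layout and column names are assumptions for illustration, not the simulator's actual schema.

```python
import sqlite3

# Illustrative embedded store for per-trial metrics; the schema is an
# assumption, not the actual SIMISGEST-VR database layout.
conn = sqlite3.connect("simisgest_results.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS trial_results (
           participant_id INTEGER,
           task_number    INTEGER,
           trial_number   INTEGER,   -- 1..10 per hand
           hand           TEXT,      -- 'left', 'right', or 'both'
           time_s         REAL,
           errors         INTEGER,
           score          REAL
       )"""
)

def record_trial(participant_id, task, trial, hand, time_s, errors, score):
    """Persist one trial's immediate-feedback metrics."""
    conn.execute(
        "INSERT INTO trial_results VALUES (?, ?, ?, ?, ?, ?, ?)",
        (participant_id, task, trial, hand, time_s, errors, score),
    )
    conn.commit()

record_trial(1, 5, 1, "right", 38.2, errors=0, score=71.5)
```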
Hardware

Two simulated laparoscopic forceps, produced with 3D printers, were used; these minimally invasive surgery forceps did not need to be functional. The final device, with all its components assembled, is shown in the accompanying figure, which shows the fixing pad (1) for the Leap Motion Controller and the mounting support devices (3) for the minimally invasive surgery laparoscopic forceps (2), which allow simulation of the fulcrum effect; the Leap Motion Controller (4), responsible for detecting the movements of the instruments; and the computer, which, by means of the software programs running on it, administers the virtual environment and the metrics and provides feedback and the final performance score on the screen (5), where the 3D virtual environment is displayed. To perform the test, a 13-inch MacBook Pro (Apple Inc) was used; it served as the screen, ran the 3D virtual environment, and stored the metrics data.
Data Analysis
Continuous data are presented in a frequency distribution table as means and standard deviations. The Shapiro-Wilk test was used to assess normality. Categorical data are also presented in a frequency distribution table. Since the metrics data were not normally distributed, nonparametric tests were used to assess the hypotheses. For hypothesis 1, the differences between novices and experts in the scores and the time taken to perform the first trial in each task were compared using the Wilcoxon signed-rank test. Among the novices, the final scores of the tenth trial in each task were compared by prior videogaming experience using the Kolmogorov-Smirnov test. To assess hypothesis 2, internal consistency was calculated using Cronbach α. In addition, test-retest reliability was assessed by comparing the tenth trial in each task, performed initially and repeated 6 months later, using the Spearman correlation coefficient. To assess hypothesis 3, the scores and the time taken in the first and last trials in each task were compared by level of training using the Wilcoxon signed-rank test. In addition, excess diathermy in the first and last trials in tasks 5 and 6 was compared by level of training using the Wilcoxon signed-rank test. The level of statistical significance was set at P<.05. Statistical analysis was performed using Stata (version 15.0; StataCorp LLC).
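For readers who wish to reproduce this style of analysis outside Stata, the following is a hedged sketch in Python using SciPy; the arrays contain placeholder values rather than study data, and the Cronbach α function implements the standard item-variance formula.

```python
import numpy as np
from scipy import stats

# Placeholder data only; the study's actual measurements are not reproduced.
first_trial = np.array([48.9, 55.4, 62.1, 66.0, 70.3])
tenth_trial = np.array([63.7, 68.2, 74.5, 79.9, 84.1])

# First vs tenth trial within a group (paired, nonparametric).
w_stat, w_p = stats.wilcoxon(first_trial, tenth_trial)

# Tenth-trial scores by prior videogaming experience (two-sample).
gamers = np.array([71.0, 80.2, 68.5, 77.4])
nongamers = np.array([60.1, 65.3, 58.8, 62.9])
ks_stat, ks_p = stats.ks_2samp(gamers, nongamers)

# Test-retest stability: initial exercise vs repeat 6 months later.
initial = np.array([63.7, 68.2, 74.5, 79.9, 84.1])
retest = np.array([61.0, 70.1, 72.8, 78.5, 85.0])
rho, rho_p = stats.spearmanr(initial, retest)

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach alpha for an (observations x items) score matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_variances / total_variance)
```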
Results
Demographic Profile
Regarding prior experience with simulators, 35% (7/20) of the surgical residents and 36% (10/28) of the surgeons surveyed said they did not have any. Among the surgical residents, only 15% (3/20) had experience with VR simulators, and none had any experience with hybrid ones.
When videogaming experience was assessed, the low percentage of frequent gaming (daily or weekly) was striking: only 28% (28/100) in the reference group, 20% (4/20) among surgical residents, and 14% (4/28) among experts (see the table below).

| Variable | Reference group (n=100) | Surgical residents (n=20) | Surgeons (n=28) |
| --- | --- | --- | --- |
| Gender, n (%) | | | |
| Female | 47 (47) | 10 (50) | 4 (15) |
| Male | 53 (53) | 10 (50) | 24 (86) |
| Age | 23.5 (0.28) | 28.4 (0.54) | 47 (2.12) |
| Dominant hand, n (%) | | | |
| Right | 89 (89) | 19 (95) | 27 (96) |
| Left | 11 (11) | 1 (5) | 1 (4) |
| Experience with simulators, n (%) | | | |
| Yes | 1 (1) | 13 (65) | 18 (64) |
| No | 99 (99) | 7 (35) | 10 (36) |
| Type of simulator, n (%) | | | |
| Not applicable | 99 (99) | 7 (35) | 10 (36) |
| Virtual reality | 1 (1) | 3 (15) | 7 (25) |
| Physical | 0 (0) | 10 (50) | 10 (36) |
| Hybrid/augmented reality | 0 (0) | 0 (0) | 1 (4) |
| Videogaming experience, n (%) | | | |
| Yes | 72 (72) | 15 (75) | 14 (50) |
| No | 28 (28) | 5 (25) | 14 (50) |
| Videogaming frequency, n (%) | | | |
| Not applicable | 26 (26) | 4 (20) | 14 (50) |
| Daily | 1 (1) | 1 (5) | 0 (0) |
| Weekly | 27 (27) | 3 (15) | 4 (14) |
| Monthly | 6 (6) | 3 (15) | 3 (11) |
| Occasionally | 40 (40) | 9 (45) | 7 (25) |
| Minimally invasive surgery experience, n (%) | | | |
| None | 37 (37) | 3 (15) | 0 (0) |
| Basic camera manipulation | 63 (63) | 6 (30) | 0 (0) |
| Basic operator level | 0 (0) | 11 (55) | 10 (36) |
| Intermediate operator level | 0 (0) | 0 (0) | 10 (36) |
| Advanced operator level | 0 (0) | 0 (0) | 8 (29) |
Validity Hypothesis 1: Relations to Other Variables
To explore validity evidence for relations to other variables, we compared the SIMISGEST-VR test scores across experience levels (known group construct validity). No statistically significant differences were found in the scores of the first trial in each task between novices and experts; however, the times taken to perform tasks 3 (P=.006) and 6 (P=.02) were significantly lower for the experts than for the reference group (see the table below). Performance in task 5 was better for novices who had prior videogaming experience (P=.01). When time was considered as a metric in task 3, a statistically significant difference (P=.006) was found between the reference group and the experts in performing the first trial.

| Metric and task | P value |
| --- | --- |
| Score | |
| Task 1 | .58 |
| Task 2 | .13 |
| Task 3 | .33 |
| Task 4 | .18 |
| Task 5 | .77 |
| Task 6 | .27 |
| Time | |
| Task 1 | .53 |
| Task 2 | .34 |
| Task 3 | .006 |
| Task 4 | .26 |
| Task 5 | .28 |
| Task 6 | .02 |
Validity Hypothesis 2: Internal Structure
The items demonstrated high internal consistency (Cronbach α=.81). Regarding the final scores in all the tasks, no statistically significant differences were found between the first exercises and those performed 6 months later by the randomly selected participants from the reference group; when time was assessed as a metric, statistically significant differences were found for tasks 4 (trial 10: P=.048) and 6 (trial 10: P=.03). This demonstrates full evidence for the internal structure and test-retest reliability with respect to the score, and partial evidence with respect to time as a metric (see the table below).

| Metric, task, and trial comparison | Spearman correlation coefficient | P value |
| --- | --- | --- |
| Score (initial vs 6 months later) | | |
| Task 1, trial 1 | 0.200 | .23 |
| Task 1, trial 10 | –0.294 | .12 |
| Task 2, trial 1 | 0.0846 | .66 |
| Task 2, trial 10 | –0.256 | .18 |
| Task 3, trial 1 | –0.036 | .85 |
| Task 3, trial 10 | 0.150 | .44 |
| Task 4, trial 1 | 0.120 | .53 |
| Task 4, trial 10 | 0.338 | .07 |
| Task 5, trial 1 | 0.341 | .07 |
| Task 5, trial 10 | 0.035 | .85 |
| Task 6, trial 1 | –0.321 | .09 |
| Task 6, trial 10 | –0.030 | .88 |
| Time (initial vs 6 months later) | | |
| Task 1, trial 1 | –0.005 | .98 |
| Task 1, trial 10 | 0.243 | .20 |
| Task 2, trial 1 | –0.082 | .67 |
| Task 2, trial 10 | –0.216 | .26 |
| Task 3, trial 1 | 0.121 | .53 |
| Task 3, trial 10 | 0.359 | .06 |
| Task 4, trial 1 | –0.141 | .46 |
| Task 4, trial 10 | 0.370 | .048 |
| Task 5, trial 1 | 0.271 | .16 |
| Task 5, trial 10 | 0.330 | .08 |
| Task 6, trial 1 | 0.097 | .62 |
| Task 6, trial 10 | 0.412 | .03 |
Validity Hypothesis 3: Consequences of Testing
Among the reference group, statistically significant differences were found in the scores and in the time taken to perform each task between the first and tenth trials. Among the experts, statistically significant differences were found in the scores in tasks 1 (P<.001), 3 (P=.03), and 4 (P=.01), and in the time taken to perform each task. These findings demonstrate a learning curve (see the table below).

| Metric and task | Comparison trials | Novices, P value | Experts, P value |
| --- | --- | --- | --- |
| Score | | | |
| Task 1 | 1 vs 10 | <.001 | <.001 |
| Task 2 | 1 vs 10 | .002 | .052 |
| Task 3 | 1 vs 10 | .002 | .03 |
| Task 4 | 1 vs 10 | <.001 | .01 |
| Task 5 | 1 vs 10 | <.001 | .31 |
| Task 6 | 1 vs 10 | .003 | .63 |
| Time | | | |
| Task 1 | 1 vs 10 | <.001 | <.001 |
| Task 2 | 1 vs 10 | <.001 | <.001 |
| Task 3 | 1 vs 10 | <.001 | <.001 |
| Task 4 | 1 vs 10 | <.001 | <.001 |
| Task 5 | 1 vs 10 | <.001 | <.001 |
| Task 6 | 1 vs 10 | <.001 | <.001 |
In task 5, the reference group made significantly fewer excess diathermy errors in the tenth trial than in the first trial (P=.003), which is evidence of a learning curve (see the table below).

| Task and group | Trial 1, mean (95% CI) | Trial 10, mean (95% CI) | P value |
| --- | --- | --- | --- |
| Task 5 | | | |
| Novices | 0.205 (0.127, 0.282) | 0.090 (0.031, 0.148) | .003 |
| Experts | 0.125 (0.039, 0.210) | 0.089 (0.013, 0.164) | .56 |
| Task 6 | | | |
| Novices | 0.200 (0.105, 0.294) | 0.205 (0.120, 0.289) | .90 |
| Experts | 0.250 (0.049, 0.450) | 0.107 (0.010, 0.203) | .40 |
Response Process Validity
Study participants had the opportunity to observe each task in advance by watching a video, and they received basic instruction. The only feedback the participants received was from the simulator; they did not receive any other feedback from the instructor. Each of the 177 tests performed (148 initial tests and 29 test-retests) was supervised by the same person (FAL). Photographic documentation of every person performing the tasks was obtained. The final performance scores were defined in advance using the formula described in another study [ ]. The exercise results were stored in a lightweight SQL database embedded within the simulator app after each test.

SIMISGEST-VR Simulator Cost
The Leap Motion Controller costs approximately US $130, and the hardware elements cost approximately US $70. By comparison, LapSim Essence (Surgical Science Sweden AB) is a portable VR simulator that enables people to learn basic skills; it does not include haptics and is not available for sale, but it can be hired for 6 months at an approximate price of US $5500. To date, there are no publications about the validity of the tasks that this device proposes.
Discussion
General
The aim of this study was to evaluate a simulation instrument—SIMISGEST-VR—and to document the sources of validity evidence for task scores, relations to other variables, internal structure, consequences of testing, and response process.
Technology-enhanced simulation is defined as an "educational tool or device with which the learner physically interacts to mimic an aspect of clinical care for the purpose of teaching or assessment" [ , ]. The use of simulators for learning basic psychomotor skills in minimally invasive surgery has been supported by multiple systematic reviews [ , , - ].

In the current state-of-the-art conceptual framework, validity is defined as the appropriate interpretation or use of test results and therefore applies only to the scores or interpretations in a specific context. The commonly used term valid instrument is inaccurate, and validity must be established for each intended interpretation [ , , ]. Thus, when an evaluation instrument is said to be "valid" or to have been "validated," it is essential to take into account the learning context, the performance context, the domain content, and the exigency of the decisions taken on the basis of the test results [ ].

Validation refers to the process of collecting validity evidence to evaluate the appropriateness of the interpretations, uses, and decisions based on assessment results [ ]. Validation is, therefore, a process and not an endpoint, and it involves gathering evidence and taking decisions based on the interpretation of the data obtained. In our case, validation required a series of experiments designed to provide evidence that the scores measured in SIMISGEST-VR reflected the technical skills they purported to measure [ ].

The first step in any validity evaluation is to clearly define the construct. The construct we focused on validating was training and learning basic psychomotor skills in minimally invasive surgery using a low-cost portable simulator called SIMISGEST-VR. Several systematic reviews [ , - ] have found that basic psychomotor skills can be learned on low-cost simulation models; however, low-cost simulators are usually box trainers made from cardboard boxes [ ], plastic crates [ ], or folding portable boxes [ ], or boxes that require the use of laparoscopic equipment [ , ] or even an iPad [ ]. There are no low-cost VR simulators on the market.

An important finding from this study was the high percentage of surgical residents and surgeons who had no experience with simulators, and the very low percentage of surgical residents who had experience with hybrid and VR simulators. This finding can be explained by the high cost of this type of simulator, which, in many countries, prevents the creation of simulation centers for learning basic psychomotor skills in minimally invasive surgery, and it constitutes an argument in favor of exploring the development of low-cost portable VR simulators such as SIMISGEST-VR. Uccelli et al [ ] demonstrated comparable outcomes between supervised simulator practice and unstructured free simulator access without mentoring, showing that "take home" simulation is both viable and economically beneficial.
It is currently considered that a comparison between reference individuals and experts does not constitute an important validity argument [ , ]. However, it is the type of evidence for relations to other variables that is most often reported in the literature [ ]. The SIMISGEST-VR tasks were unable to demonstrate any difference in performance scores between the reference group and the experts. A statistically significant difference was found between these 2 groups only in the time taken to complete tasks 3 (P=.006) and 6 (P=.02), which were the most complex.

Although some studies support the hypothesis that videogaming experience has a positive impact on minimally invasive surgery performance [ - ], in this study a significant difference was found for the reference group only in task 5 (diathermy; P=.003); in the other tasks, prior experience did not have any impact on performance. The demographic characterization made it clear that frequent videogaming (daily or weekly) was uncommon in all population groups, which can explain the absence of an impact on performance. The lack of evidence for relations to other variables in this study can also be explained by the simplicity and ease of the proposed tasks.
Validity Hypothesis 2: Internal Structure
The items demonstrated high internal consistency (Cronbach α=.81). A test should not be used if its Cronbach α is <.7, and it should not be used for important decisions about an individual unless its Cronbach α is >.9 [ , - ]. In our case, therefore, the result supports the use of the SIMISGEST-VR tasks as a self-assessment test for formative assessment [ ].

Test-retest reliability is the correlation between scores on a test administered more than once to a homogeneous group of test takers at 2 different times (temporal stability); the longer the interval, the less likely it is that a person will remember the simulator tasks and, therefore, the greater the test-retest threat will be [ - ]. In this study, the second exercise was performed 6 months after the first, and the results obtained demonstrate significant evidence for the temporal stability of scores in the 6 tasks. When the metric used was time, similar results were obtained in all tasks except tasks 4 (P=.048) and 6 (P=.03) when comparing the tenth trial.

Validity Hypothesis 3: Evidence for Consequences of Testing
The most important finding of this study is that the reference group learned in all the SIMISGEST-VR tasks. Excess diathermy error, defined as a contact time longer than 2 seconds from the moment of initial contact, fell significantly (P=.003) between the first and tenth trials for task 5 in the reference group, which also constitutes evidence for a learning curve. The expert group achieved a learning curve in all the tasks when time was taken as the metric, and in tasks 1 (P<.001), 3 (P=.03), and 4 (P=.01) when the final test score was taken into account. We therefore consider that the SIMISGEST-VR tasks can be used to enable novices without any prior experience to learn basic psychomotor skills in minimally invasive surgery.
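As a concrete reading of this definition, the following is a minimal sketch in Python; the 2-second threshold comes from the definition above, whereas counting each over-threshold contact as one error is our assumption rather than the published scoring rule.

```python
def excess_diathermy(contact_durations_s, threshold_s=2.0):
    """Count excess diathermy errors given per-contact durations in
    seconds; the 2-second threshold follows the definition above, and
    treating each over-threshold contact as one error is an assumption."""
    return sum(1 for d in contact_durations_s if d > threshold_s)

# Example: three diathermy contacts, one longer than 2 seconds.
print(excess_diathermy([0.8, 2.6, 1.1]))  # 1
```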
This study has several strengths. The reference group sample included 100 students from 2 faculties of medicine, one public and one private; surgeons from a range of specialties; and surgical residents in general surgery and obstetric-gynecologic surgery. Physical simulators require the presence of a specialized tutor, a scarce and high-cost human resource, whereas VR simulators provide metrics and automatic feedback and allow the physical presence of a tutor to be dispensed with. At times of a pandemic such as COVID-19, this concept of education via VR takes on considerable significance because it avoids the need for learners to travel to simulation laboratories and, therefore, avoids close contact between students and instructors. This study also has limitations. Although the size of the reference group was large, a larger expert group would have been desirable. The sample was one of availability (a convenience sample); as such, participants with minimal surgical experience were overrepresented relative to those with extensive experience, such as senior surgical residents and surgeons. The low number of senior residents prevented significant results from being obtained when comparing them with the other groups. Another limitation of the data analysis is that no statistical analysis was performed before the trial to evaluate the proper sample size or to calibrate the Likert scale.
Conclusions
This study has provided evidence to support the use of SIMISGEST-VR as a low-cost portable tool for enabling novices without any prior experience to learn basic psychomotor skills in minimally invasive surgery. The tasks for learning basic motor skills in minimally invasive surgery demonstrated high internal consistency and high test-retest reliability among the reference group when the task scores were assessed. The expert group also achieved a learning curve in all the tasks when the time metric was assessed. In this study, we were able to demonstrate partial evidence for relations to other variables and strong evidence for internal structure and consequences of testing.
Future work streams include the creation of different levels of difficulty in the tasks. We also intend to develop an app that can be downloaded online, which contains the full training program. Finally, we hope to develop simulation models using the Leap Motion Controller and other gesture-recognition devices such as the Myo armband.
Authors' Contributions
All authors contributed substantially to the study conception and design, data analysis, interpretation of the findings, and manuscript drafting. FÁL participated in the collection and assembly of data. FA participated in the data analysis and statistical modeling. FS-R is the guarantor of the paper. All authors read, revised, and approved the final manuscript.
Conflicts of Interest
None declared.
Form for the demographic questionnaire.
DOCX File, 15 KB

References
- Litynski GS. Profiles in laparoscopy: Mouret, Dubois, and Perissat: the laparoscopic breakthrough in Europe (1987-1988). JSLS 1999;3(2):163-167 [FREE Full text] [Medline]
- Yamashita Y, Kurohiji T, Kakegawa T. Evaluation of two training programs for laparoscopic cholecystectomy: incidence of major complications. World J Surg 1994;18(2):279-85; discussion 285. [CrossRef] [Medline]
- Archer SB, Brown DW, Smith CD, Branum GD, Hunter JG. Bile duct injury during laparoscopic cholecystectomy: results of a national survey. Ann Surg 2001 Oct;234(4):549-58; discussion 558. [CrossRef] [Medline]
- Berci G. Complications of laparoscopic cholecystectomy. Surg Endosc 1998 Apr;12(4):291-293. [CrossRef] [Medline]
- Rogers DA, Elstein AS, Bordage G. Improving continuing medical education for surgical techniques: applying the lessons learned in the first decade of minimal access surgery. Ann Surg 2001 Feb;233(2):159-166. [CrossRef] [Medline]
- Hugh TB. New strategies to prevent laparoscopic bile duct injury--surgeons can learn from pilots. Surgery 2002 Nov;132(5):826-835. [CrossRef] [Medline]
- MacFadyen BV, Vecchio R, Ricardo AE, Mathis CR. Bile duct injury after laparoscopic cholecystectomy. the United States experience. Surg Endosc 1998 Apr 10;12(4):315-321. [CrossRef] [Medline]
- Wolfe BM, Gardiner B, Frey CF. Laparoscopic cholecystectomy: a remarkable development. JAMA 2015 Oct 06;314(13):1406. [CrossRef] [Medline]
- Seymour NE, Gallagher AG, Roman SA, O'Brien MK, Bansal VK, Andersen DK, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg 2002 Oct;236(4):458-63; discussion 463. [CrossRef] [Medline]
- Gallagher AG, Richie K, McClure N, McGuigan J. Objective psychomotor skills assessment of experienced, junior, and novice laparoscopists with virtual reality. World J Surg 2001 Nov;25(11):1478-1483. [CrossRef] [Medline]
- Gallagher AG, Smith CD, Bowers SP, Seymour NE, Pearson A, McNatt S, et al. Psychomotor skills assessment in practicing surgeons experienced in performing advanced laparoscopic procedures. J Am Coll Surg 2003 Sep;197(3):479-488. [CrossRef] [Medline]
- Korndorffer JR, Dunne JB, Sierra R, Stefanidis D, Touchard CL, Scott DJ. Simulator training for laparoscopic suturing using performance goals translates to the operating room. J Am Coll Surg 2005 Jul;201(1):23-29. [CrossRef] [Medline]
- Aggarwal R, Ward J, Balasundaram I, Sains P, Athanasiou T, Darzi A. Proving the effectiveness of virtual reality simulation for training in laparoscopic surgery. Ann Surg 2007 Nov;246(5):771-779. [CrossRef] [Medline]
- Sturm LP, Windsor JA, Cosman PH, Cregan P, Hewett PJ, Maddern GJ. A systematic review of skills transfer after surgical simulation training. Ann Surg 2008 Aug;248(2):166-179. [CrossRef] [Medline]
- Seymour NE. VR to OR: a review of the evidence that virtual reality simulation improves operating room performance. World J Surg 2008 Feb 3;32(2):182-188. [CrossRef] [Medline]
- Zendejas B, Brydges R, Hamstra SJ, Cook DA. State of the evidence on simulation-based training for laparoscopic surgery: a systematic review. Ann Surg 2013 Apr;257(4):586-593. [CrossRef] [Medline]
- Dawe SR, Pena GN, Windsor JA, Broeders JAJL, Cregan PC, Hewett PJ, et al. Systematic review of skills transfer after surgical simulation-based training. Br J Surg 2014 Aug;101(9):1063-1076. [CrossRef] [Medline]
- Hyltander A, Liljegren E, Rhodin P, Lönroth H. The transfer of basic skills learned in a laparoscopic simulator to the operating room. Surg Endosc 2002 Sep;16(9):1324-1328. [CrossRef] [Medline]
- Youngblood PL, Srivastava S, Curet M, Heinrichs WL, Dev P, Wren SM. Comparison of training on two laparoscopic simulators and assessment of skills transfer to surgical performance. J Am Coll Surg 2005 Apr;200(4):546-551. [CrossRef] [Medline]
- Sutton C, McCloy R, Middlebrook A, Chater P, Wilson M, Stone R. MIST VR. A laparoscopic surgery procedures trainer and evaluator. Stud Health Technol Inform 1997;39:598-607. [Medline]
- Ahlberg G, Heikkinen T, Iselius L, Leijonmarck C, Rutqvist J, Arvidsson D. Does training in a virtual reality simulator improve surgical performance? Surg Endosc 2002 Jan 12;16(1):126-129. [CrossRef] [Medline]
- Debes AJ, Aggarwal R, Balasundaram I, Jacobsen MB. A tale of two trainers: virtual reality versus a video trainer for acquisition of basic laparoscopic skills. Am J Surg 2010 Jun;199(6):840-845. [CrossRef] [Medline]
- Kothari SN, Kaplan BJ, DeMaria EJ, Broderick TJ, Merrell RC. Training in laparoscopic suturing skills using a new computer-based virtual reality simulator (MIST-VR) provides results comparable to those with an established pelvic trainer system. J Laparoendosc Adv Surg Tech A 2002 Jun;12(3):167-173. [CrossRef] [Medline]
- Torkington J, Smith S, Rees B, Darzi A. Skill transfer from virtual reality to a real laparoscopic task. Surg Endosc 2001 Oct 15;15(10):1076-1079. [CrossRef] [Medline]
- Torkington J, Smith S, Rees B, Darzi A. The role of the basic surgical skills course in the acquisition and retention of laparoscopic skill. Surg Endosc 2001 Oct 15;15(10):1071-1075. [CrossRef] [Medline]
- Alvarez-Lopez F, Maina MF, Saigí-Rubió F. Use of commercial off-the-shelf devices for the detection of manual gestures in surgery: systematic literature review. J Med Internet Res 2019 Apr 14;21(5):e11925 [FREE Full text] [CrossRef] [Medline]
- Ogura T, Sato M, Ishida Y, Hayashi N, Doi K. Development of a novel method for manipulation of angiographic images by use of a motion sensor in operating rooms. Radiol Phys Technol 2014 Jul;7(2):228-234. [CrossRef] [Medline]
- Mauser S, Burgert O. Touch-free, gesture-based control of medical devices and software based on the leap motion controller. Stud Health Technol Inform 2014;196:265-270. [Medline]
- Bachmann D, Weichert F, Rinkenauer G. Evaluation of the leap motion controller as a new contact-free pointing device. Sensors (Basel) 2014 Dec 24;15(1):214-233 [FREE Full text] [CrossRef] [Medline]
- Weichert F, Bachmann D, Rudak B, Fisseler D. Analysis of the accuracy and robustness of the leap motion controller. Sensors (Basel) 2013 May 14;13(5):6380-6393 [FREE Full text] [CrossRef] [Medline]
- Guna J, Jakus G, Pogačnik M, Tomažič S, Sodnik J. An analysis of the precision and reliability of the leap motion sensor and its suitability for static and dynamic tracking. Sensors (Basel) 2014 Feb;14(2):3702-3720 [FREE Full text] [CrossRef] [Medline]
- Alvarez-Lopez F, Maina MF, Saigí-Rubió F. Natural User Interfaces: Is It a Solution to Accomplish Ubiquitous Training in Minimally Invasive Surgery? Surg Innov 2016 Aug 09;23(4):429-430. [CrossRef] [Medline]
- Oropesa I, de Jong T, Sánchez-González P, Dankelman J, Gómez E. Feasibility of tracking laparoscopic instruments in a box trainer using a Leap Motion Controller. Measurement 2016 Feb;80:115-124. [CrossRef]
- Beck P. Accurate three-dimensional instrument positioning. Free Patents Online. 2016. URL: http://www.freepatentsonline.com/20160354152.pdf [accessed 2020-10-15]
- Lahanas V, Loukas C, Georgiou K, Lababidi H, Al-Jaroudi D. Virtual reality-based assessment of basic laparoscopic skills using the Leap Motion controller. Surg Endosc 2017 Dec 2;31(12):5012-5023. [CrossRef] [Medline]
- Travaglini T, Swaney P, Weaver K, Webster R. Initial experiments with the leap motion as a user interface in robotic endonasal surgery. Robot Mechatron (2015) 2016;37:171-179 [FREE Full text] [CrossRef] [Medline]
- Juanes JA, Gómez JJ, Peguero PD, Ruisoto P. Digital environment for movement control in surgical skill training. J Med Syst 2016 Jun 18;40(6):133. [CrossRef] [Medline]
- Partridge RW, Brown FS, Brennan PM, Hennessey IAM, Hughes MA. The LEAPTM gesture interface device and take-home laparoscopic simulators: a study of construct and concurrent validity. Surg Innov 2016 Feb 14;23(1):70-77. [CrossRef] [Medline]
- Wright T, de Ribaupierre S, Eagleson R. Design and evaluation of an augmented reality simulator using leap motion. Healthc Technol Lett 2017 Oct;4(5):210-215 [FREE Full text] [CrossRef] [Medline]
- Botden SMBI, Jakimowicz JJ. What is going on in augmented reality simulation in laparoscopic surgery? Surg Endosc 2009 Aug;23(8):1693-1700 [FREE Full text] [CrossRef] [Medline]
- Papanikolaou IG. Assessment of medical simulators as a training programme for current surgical education. Hellenic J Surg 2013 Aug 9;85(4):240-248. [CrossRef]
- Hennessey IA, Hewett P. Construct, concurrent, and content validity of the eoSim laparoscopic simulator. J Laparoendosc Adv Surg Tech A 2013 Oct;23(10):855-860. [CrossRef] [Medline]
- Hruby GW, Sprenkle PC, Abdelshehid C, Clayman RV, McDougall EM, Landman J. The EZ Trainer: validation of a portable and inexpensive simulator for training basic laparoscopic skills. J Urol 2008 Feb;179(2):662-666. [CrossRef] [Medline]
- Uccelli J, Kahol K, Ashby A, Smith M, Ferrara J. The validity of take-home surgical simulators to enhance resident technical skill proficiency. Am J Surg 2011 Mar;201(3):315-9; discussion 319. [CrossRef] [Medline]
- American Educational Research Association. Estándares para pruebas educativas y psicológicas [Standards for educational and psychological testing]. JSTOR; 2018. URL: www.jstor.org/stable/j.ctvr43hg2 [accessed 2020-10-17]
- American Educational Research Association, American Psychological Association, National Council on Measurement in Education. Standards for Educational and Psychological Testing. Washington, DC: American Educational Research Association; 2014.
- Downing SM. Validity: on meaningful interpretation of assessment data. Med Educ 2003 Sep;37(9):830-837. [CrossRef] [Medline]
- Cook DA, Beckman TJ. Current concepts in validity and reliability for psychometric instruments: theory and application. Am J Med 2006 Feb;119(2):166.e7-166.16. [CrossRef] [Medline]
- Cook DA, Zendejas B, Hamstra SJ, Hatala R, Brydges R. What counts as validity evidence? Examples and prevalence in a systematic review of simulation-based assessment. Adv Health Sci Educ Theory Pract 2014 May;19(2):233-250. [CrossRef] [Medline]
- Messick S. Validity of psychological assessment: Validation of inferences from persons' responses and performances as scientific inquiry into score meaning. American Psychologist 1995 Sep;50(9):741-749. [CrossRef]
- Messick S. Meaning and values in test validation: the science and ethics of assessment. Educational Researcher 2016 Jul;18(2):5-11. [CrossRef]
- Cook DA, Hatala R. Validation of educational assessments: a primer for simulation and beyond. Adv Simul (Lond) 2016;1:31 [FREE Full text] [CrossRef] [Medline]
- Alvarez-Lopez F, Maina MF, Saigí-Rubió F. Use of a low-cost portable 3d virtual reality gesture-mediated simulator for training and learning basic psychomotor skills in minimally invasive surgery: development and content validity study. J Med Internet Res 2020 Jul 14;22(7):e17491 [FREE Full text] [CrossRef] [Medline]
- Ghaderi I, Manji F, Park YS, Juul D, Ott M, Harris I, et al. Technical skills assessment toolbox: a review using the unitary framework of validity. Ann Surg 2015 Feb;261(2):251-262. [CrossRef] [Medline]
- Beckman TJ, Cook DA, Mandrekar JN. What is the validity evidence for assessments of clinical teaching? J Gen Intern Med 2005 Dec;20(12):1159-1164. [CrossRef] [Medline]
- Sweet R, Hananel D, Lawrenz F. A unified approach to validation, reliability, and education study design for surgical technical skills training. Arch Surg 2010 Feb;145(2):197-201. [CrossRef] [Medline]
- Gallagher AG, Ritter EM, Satava RM. Fundamental principles of validation, and reliability: rigorous science for the assessment of surgical education and training. Surg Endosc 2003 Oct 1;17(10):1525-1529. [CrossRef] [Medline]
- Worster A, Haines T. Advanced statistics: understanding medical record review (MRR) studies. Acad Emerg Med 2004 Feb;11(2):187-192. [Medline]
- Warner B. The sciences of the artificial. Journal of the Operational Research Society 2017 Dec 19;20(4):509-510 [FREE Full text] [CrossRef]
- Manson N. Is operations research really research? ORiON 2006 Dec 01;22(2). [CrossRef]
- Dresch A, Pacheco-Lacerda D, Valle-Antunes J. Design science research. In: A Method for Science and Technology Advancement. Switzerland: Springer International Publishing; 2015.
- Lacerda DP, Dresch A, Proença A, Antunes Júnior JAV. Design Science Research: método de pesquisa para a engenharia de produção. Gest. Prod 2013 Nov 26;20(4):741-761. [CrossRef]
- Hevner A, Chatterjee S. Design research in information systems. In: Theory and Practice. Boston, MA: Springer US; 2010.
- Wilson M, Middlebrook A, Sutton C, Stone R, McCloy R. MIST VR: a virtual reality trainer for laparoscopic surgery assesses performance. Ann R Coll Surg Engl 1997 Nov;79(6):403-404 [FREE Full text] [Medline]
- Ali M, Mowery Y, Kaplan B, DeMaria E. Training the novice in laparoscopy. More challenge is better. Surg Endosc 2002 Dec 1;16(12):1732-1736. [CrossRef] [Medline]
- Grantcharov TP, Rosenberg J, Pahle E, Funch-Jensen P. Virtual reality computer simulation. Surg Endosc 2001 Mar;15(3):242-244. [CrossRef] [Medline]
- Hamilton E, Scott D, Fleming J, Rege R, Laycock R, Bergen P, et al. Comparison of video trainer and virtual reality training systems on acquisition of laparoscopic skills. Surg Endosc 2002 Mar;16(3):406-411. [CrossRef] [Medline]
- Grantcharov TP, Kristiansen VB, Bendix J, Bardram L, Rosenberg J, Funch-Jensen P. Randomized clinical trial of virtual reality simulation for laparoscopic skills training. Br J Surg 2004 Feb 29;91(2):146-150. [CrossRef] [Medline]
- Gonzalez R, Bowers SP, Smith CD, Ramshaw BJ. Does setting specific goals and providing feedback during training result in better acquisition of laparoscopic skills? Am Surg 2004 Jan;70(1):35-39. [Medline]
- McClusky DA, Gallagher AG, Ritter E, Lederman AB, Van Sickle KR, Baghai M, et al. Virtual reality training improves junior residents’ operating room performance: Results of a prospective, randomized, double-blinded study of the complete laparoscopic cholecystectomy. Journal of the American College of Surgeons 2004 Sep;199(3):73. [CrossRef]
- Grantcharov TP, Funch-Jensen P. Can everyone achieve proficiency with the laparoscopic technique? Learning curve patterns in technical skills acquisition. Am J Surg 2009 Apr;197(4):447-449. [CrossRef] [Medline]
- Gallagher AG, Seymour NE, Jordan-Black J, Bunting BP, McGlade K, Satava RM. Prospective, Randomized Assessment of Transfer of Training (ToT) and Transfer Effectiveness Ratio (TER) of Virtual Reality Simulation Training for Laparoscopic Skill Acquisition. Annals of Surgery 2013;257(6):1025-1031. [CrossRef]
- Badash I, Burtt K, Solorzano CA, Carey JN. Innovations in surgery simulation: a review of past, current and future techniques. Ann Transl Med 2016 Dec;4(23):453-453 [FREE Full text] [CrossRef] [Medline]
- Taffinder N, Sutton C, Fishwick RJ, McManus IC, Darzi A. Validation of virtual reality to teach and assess psychomotor skills in laparoscopic surgery: results from randomised controlled studies using the MIST VR laparoscopic simulator. Stud Health Technol Inform 1998;50:124-130. [Medline]
- Chaudhry A, Sutton C, Wood J, Stone R, McCloy R. Learning rate for laparoscopic surgical skills on MIST VR, a virtual reality simulator: quality of human-computer interface. Ann R Coll Surg Engl 1999 Jul;81(4):281-286 [FREE Full text] [Medline]
- Stylopoulos N, Vosburgh KG. Assessing technical skill in surgery and endoscopy: a set of metrics and an algorithm (C-PASS) to assess skills in surgical and endoscopic procedures. Surg Innov 2007 Jun;14(2):113-121. [CrossRef] [Medline]
- Hatala R, Cook DA, Zendejas B, Hamstra SJ, Brydges R. Feedback for simulation-based procedural skills training: a meta-analysis and critical narrative synthesis. Adv Health Sci Educ Theory Pract 2014 May;19(2):251-272. [CrossRef] [Medline]
- Gallagher A, Satava R. Virtual reality as a metric for the assessment of laparoscopic psychomotor skills. Learning curves and reliability measures. Surg Endosc 2002 Dec 1;16(12):1746-1752. [CrossRef] [Medline]
- Archer J. State of the science in health professional education: effective feedback. Med Educ 2010 Jan;44(1):101-108. [CrossRef] [Medline]
- Walsh CM, Ling SC, Wang CS, Carnahan H. Concurrent versus terminal feedback: it may be better to wait. Academic Medicine 2009;84(Supplement):S54-S57. [CrossRef] [Medline]
- Grantcharov TP, Schulze S, Kristiansen VB. The impact of objective assessment and constructive feedback on improvement of laparoscopic performance in the operating room. Surg Endosc 2007 Dec 13;21(12):2240-2243. [CrossRef] [Medline]
- Stefanidis D, Korndorffer JR, Heniford BT, Scott DJ. Limited feedback and video tutorials optimize learning and resource utilization during laparoscopic simulator training. Surgery 2007 Aug;142(2):202-206. [CrossRef] [Medline]
- Cook DA, Brydges R, Zendejas B, Hamstra SJ, Hatala R. Technology-enhanced simulation to assess health professionals: a systematic review of validity evidence, research methods, and reporting quality. Acad Med 2013 Jun;88(6):872-883. [CrossRef] [Medline]
- Cook DA, Hatala R, Brydges R, Zendejas B, Szostek JH, Wang AT, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA 2011 Sep 07;306(9):978-988. [CrossRef] [Medline]
- Haque S, Srinivasan S. A meta-analysis of the training effectiveness of virtual reality surgical simulators. IEEE Trans Inf Technol Biomed 2006 Jan;10(1):51-58. [CrossRef] [Medline]
- Brydges R, Hatala R, Zendejas B, Erwin PJ, Cook DA. Linking simulation-based educational assessments and patient-related outcomes: a systematic review and meta-analysis. Acad Med 2015 Feb;90(2):246-256. [CrossRef] [Medline]
- Nagendran M, Gurusamy KS, Aggarwal R, Loizidou M, Davidson BR. Virtual reality training for surgical trainees in laparoscopic surgery. Cochrane Database Syst Rev 2013 Aug 27(8):CD006575 [FREE Full text] [CrossRef] [Medline]
- Thomas MP. The role of simulation in the development of technical competence during surgical training: a literature review. Int J Med Educ 2013 Mar 16;4:48-58. [CrossRef]
- Dawe SR, Windsor JA, Broeders JA, Cregan PC, Hewett PJ, Maddern GJ. A systematic review of surgical skills transfer after simulation-based training: laparoscopic cholecystectomy and endoscopy. Ann Surg 2014 Feb;259(2):236-248. [CrossRef] [Medline]
- Gurusamy K, Aggarwal R, Palanivelu L, Davidson BR. Systematic review of randomized controlled trials on the effectiveness of virtual reality training for laparoscopic surgery. Br J Surg 2008 Sep;95(9):1088-1097. [CrossRef] [Medline]
- Andreatta P, Gruppen L. Conceptualising and classifying validity evidence for simulation. Med Educ 2009 Nov;43(11):1028-1035. [CrossRef] [Medline]
- Fried GM, Feldman LS, Vassiliou MC, Fraser SA, Stanbridge D, Ghitulescu G, et al. Proving the value of simulation in laparoscopic surgery. Ann Surg 2004 Sep;240(3):518-25; discussion 525. [CrossRef] [Medline]
- Li MM, George J. A systematic review of low-cost laparoscopic simulators. Surg Endosc 2017 Jan;31(1):38-48 [FREE Full text] [CrossRef] [Medline]
- Nguyen T, Braga LH, Hoogenes J, Matsumoto ED. Commercial video laparoscopic trainers versus less expensive, simple laparoscopic trainers: a systematic review and meta-analysis. J Urol 2013 Sep;190(3):894-899. [CrossRef] [Medline]
- Nagendran M, Toon C, Davidson B, Gurusamy K. Laparoscopic surgical box model training for surgical trainees with no prior laparoscopic experience. Cochrane Database Syst Rev 2014 Jan 17(1):CD010479. [CrossRef] [Medline]
- Diesen DL, Erhunmwunsee L, Bennett KM, Ben-David K, Yurcisin B, Ceppa EP, et al. Effectiveness of laparoscopic computer simulator versus usage of box trainer for endoscopic surgery training of novices. J Surg Educ 2011 Jul;68(4):282-289. [CrossRef] [Medline]
- Blacker AJ. How to build your own laparoscopic trainer. J Endourol 2005 Jul;19(6):748-752. [CrossRef] [Medline]
- Smith MD, Norris JM, Kishikova L, Smith DP. Laparoscopic simulation for all: two affordable, upgradable, and easy-to-build laparoscopic trainers. J Surg Educ 2013 Mar;70(2):217-223. [CrossRef] [Medline]
- Nakamura LY, Martin GL, Fox JC, Andrews PE, Humphreys M, Castle EP. Comparing the portable laparoscopic trainer with a standardized trainer in surgically naïve subjects. J Endourol 2012 Jan;26(1):67-72. [CrossRef] [Medline]
- Ricchiuti D, Ralat DA, Evancho-Chapman M, Wyneski H, Cerone J, Wegryn JD. A simple cost-effective design for construction of a laparoscopic trainer. J Endourol 2005 Oct;19(8):1000-2; discussion 1002. [CrossRef] [Medline]
- Mughal M. A cheap laparoscopic surgery trainer. Ann R Coll Surg Engl 1992 Jul;74(4):256-257 [FREE Full text] [Medline]
- Ruparel RK, Brahmbhatt RD, Dove JC, Hutchinson RC, Stauffer JA, Bowers SP, et al. "iTrainers"--novel and inexpensive alternatives to traditional laparoscopic box trainers. Urology 2014 Jan;83(1):116-120. [CrossRef] [Medline]
- Cook DA. Much ado about differences: why expert-novice comparisons add little to the validity argument. Adv Health Sci Educ Theory Pract 2015 Aug;20(3):829-834. [CrossRef] [Medline]
- Zendejas B, Ruparel RK, Cook DA. Validity evidence for the Fundamentals of Laparoscopic Surgery (FLS) program as an assessment tool: a systematic review. Surg Endosc 2016 Feb;30(2):512-520. [CrossRef] [Medline]
- Borgersen NJ, Naur TMH, Sørensen SMD, Bjerrum F, Konge L, Subhi Y, et al. Gathering validity evidence for surgical simulation: a systematic review. Ann Surg 2018 Jun;267(6):1063-1068. [CrossRef] [Medline]
- Rosser JC, Lynch PJ, Cuddihy L, Gentile DA, Klonsky J, Merrell R. The impact of video games on training surgeons in the 21st century. Arch Surg 2007 Feb 01;142(2):181-6; discusssion 186. [CrossRef] [Medline]
- van Dongen KW, Verleisdonk EMM, Schijven MP, Broeders IAMJ. Will the Playstation generation become better endoscopic surgeons? Surg Endosc 2011 Jul 17;25(7):2275-2280 [FREE Full text] [CrossRef] [Medline]
- Grantcharov TP, Bardram L, Funch-Jensen P, Rosenberg J. Impact of hand dominance, gender, and experience with computer games on performance in virtual reality laparoscopy. Surg Endosc 2003 Jul 1;17(7):1082-1085. [CrossRef] [Medline]
- Boyle E, Kennedy A, Traynor O, Hill ADK. Training surgical skills using nonsurgical tasks--can Nintendo Wii™ improve surgical performance? J Surg Educ 2011;68(2):148-154. [CrossRef] [Medline]
- Harper JD, Kaiser S, Ebrahimi K, Lamberton GR, Hadley HR, Ruckle HC, et al. Prior video game exposure does not enhance robotic surgical performance. J Endourol 2007 Oct;21(10):1207-1210. [CrossRef] [Medline]
- Glaser AY, Hall CB, Uribe JI, Fried MP. The effects of previously acquired skills on sinus surgery simulator performance. Otolaryngol Head Neck Surg 2005 Oct;133(4):525-530. [CrossRef] [Medline]
- Madan AK, Frantzides CT, Park WC, Tebbit CL, Kumari NVA, O'Leary PJ. Predicting baseline laparoscopic surgery skills. Surg Endosc 2005 Jan;19(1):101-104. [CrossRef] [Medline]
- Kennedy A, Boyle E, Traynor O, Walsh T, Hill A. Video gaming enhances psychomotor skills but not visuospatial and perceptual abilities in surgical trainees. J Surg Educ 2011 Sep;68(5):414-420. [CrossRef] [Medline]
- Wanzel KR, Ward M, Reznick RK. Teaching the surgical craft: from selection to certification. Curr Probl Surg 2002 Jun;39(6):573-659. [CrossRef] [Medline]
- Cronbach LJ, Meehl PE. Construct validity in psychological tests. Psychological Bulletin 1955;52(4):281-302. [CrossRef]
- Feldman LS, Sherman V, Fried GM. Using simulators to assess laparoscopic competence: ready for widespread use? Surgery 2004 Jan;135(1):28-42. [CrossRef] [Medline]
- Straub D, Gefen D. Validation Guidelines for IS Positivist Research. CAIS 2004;13(1):427. [CrossRef]
- McDougall EM. Validation of surgical simulators. J Endourol 2007 Mar;21(3):244-247. [CrossRef] [Medline]
- Kowalewski K, Hendrie JD, Schmidt MW, Garrow CR, Bruckner T, Proctor T, et al. Development and validation of a sensor- and expert model-based training system for laparoscopic surgery: the iSurgeon. Surg Endosc 2017 May;31(5):2155-2165. [CrossRef] [Medline]
Abbreviations
MIST-VR: Minimally Invasive Surgery Training - Virtual Reality
SIMISGEST-VR: Simulator of Minimally Invasive Surgery mediated by Gestures - Virtual Reality
VR: virtual reality
Edited by G Eysenbach; submitted 29.04.20; peer-reviewed by S Jung, M Adly, WK Ming; comments to author 19.07.20; revised version received 06.09.20; accepted 02.10.20; published 27.10.20
Copyright©Fernando Alvarez-Lopez, Marcelo Fabián Maina, Fernando Arango, Francesc Saigí-Rubió. Originally published in JMIR Serious Games (http://games.jmir.org), 27.10.2020.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Serious Games, is properly cited. The complete bibliographic information, a link to the original publication on http://games.jmir.org, as well as this copyright and license information must be included.