Open access peer-reviewed chapter

Virtual and Augmented Reality in Medical Education

Written By

Panteleimon Pantelidis, Angeliki Chorti, Ioanna Papagiouvanni, Georgios Paparoidamis, Christos Drosos, Thrasyvoulos Panagiotakopoulos, Georgios Lales and Michail Sideris

Submitted: 21 May 2017 Reviewed: 26 October 2017 Published: 20 December 2017

DOI: 10.5772/intechopen.71963

From the Edited Volume

Medical and Surgical Education - Past, Present and Future

Edited by Georgios Tsoulfas


Abstract

Virtual reality (VR) and augmented reality (AR) are two contemporary simulation models that are currently transforming medical education. VR provides a dynamic, three-dimensional (3D) view of structures and the ability for the user to interact with them. Recent technological advances in haptics, display systems, and motion detection give the user a realistic and interactive experience, making VR ideal for training in hands-on procedures. Consequently, surgical and other interventional procedures are the main fields of VR application. AR provides the ability to project virtual information and structures over physical objects, thus enhancing or altering the real environment. The integration of AR applications into the teaching of anatomical structures and physiological mechanisms appears to be beneficial. Studies have sought to demonstrate the validity and educational effect of many VR and AR applications, in many different areas, employed via various hardware platforms. Some even propose a curriculum that integrates these methods. This chapter provides a brief history of VR and AR in medicine, as well as the principles and standards of their function. Finally, the studies that show the effect of implementing these methods in different fields of medical training are summarized and presented.

Keywords

  • virtual reality
  • augmented reality
  • medical education
  • simulation
  • advanced learning

1. Introduction

Virtual reality (VR) and augmented reality (AR) are the current trends in medical education. VR is the virtual construction of an artificial world. The key element of VR is the high level of the user's immersion in the virtual environment, namely, the high fidelity of the simulated structures and the ability to interact with them in a realistic manner. This has become available only recently, as it requires high standards in several technologies, including advanced haptic devices with force-feedback capabilities (bidirectional stimuli), high-resolution audio-visual effects, motion detection technology, and high-performance processing power to transmit and process all this information with near-zero latency.

The first virtual system in medicine was introduced in 1965 by Robert Mann, in order to provide a new training environment for orthopedics. In the late 1980s, the head-mounted display (HMD) was introduced as a wearable device for VR visualizations in medicine [1]. The first pioneering applications in medical education, involving hands-on procedures, appeared over a decade later [2, 3]. In the 1990s, many adverse effects from the use of VR were reported, including nausea, dizziness, temporarily impaired vision, and a diminished sense of presence, even after 20 min of use. These adverse effects were attributed to technical defects, such as lag time and the inability of the human eye to fixate in depth on "artificially distant" 3D objects [4, 5]. Although these effects are minor and subside quickly nowadays, the risk of learning inappropriate handling moves, when low-fidelity or badly simulated models are used and particularly under unsupervised teaching, still remains [6].

Although AR and VR share many technical aspects, AR differs from VR in that its goal is not to construct a fully artificial environment but to overlay computer-generated images onto images of the real world [7]. Therefore, it uses devices that keep the physical view of the surrounding environment visible but enhance it with virtual images. Tablets, mobile phones, AR glasses, and other optimized devices can serve as hardware for running AR applications. Historically, the development of AR began in the 1960s, but the term "augmented reality" was coined in 1990. The increase in the use of AR came along with the technological advances that made it available and useful. Nowadays, AR is used widely in the clinical setting, providing extra information for the clinician during interventional procedures (CT/MRI guidance, visualization of paths), but it is also applied in the educational world. Anatomy, with the 3D visualization of structures that are hard to comprehend, and physiology, with the representation of mechanisms in 4D (the dimensions of space and time), are the main areas of interest [8].


2. Technical aspects

VR and AR come with certain hardware requirements in order to retain high standards of simulation. The computerized 3D images and audio have to be realistic and able to simulate both real and abstract structures. The motion of the user should be detected with high precision so that the visual field (size, shape, angle of objects) and auditory stimuli (volume, sound balance) can change accordingly. The user must be able to affect the virtual environment but also to be affected by it via haptic feedback stimuli. Haptic devices, such as joysticks, gloves, and other specialized tools, serve these haptic interaction needs. High-performance computing power is needed to process the huge amount of information produced with low, unnoticeable latency. All of these contribute to the key element of VR and AR: immersion of the user in the environment. Moreover, a fast and reliable Internet connection is needed to support the latest trend in the VR field, namely, VR forums where users meet and interact with each other online in a common virtual environment [1, 3, 9].

Many VR settings involve an HMD, a wearable device that offers visualization of the constructed virtual reality at a wide angle and prevents external visual stimuli from disturbing the virtual experience. Although the first HMDs had a restricted resolution of 800 × 600 pixels and a narrow viewing angle of 30°, current HMDs display fully high-definition images, with angles that can reach 360°. Some of them are wireless and may incorporate position and motion detection systems for eye and head movements. Although HMDs play a significant role in many VR settings, they are not mandatory for all medical applications. The simulation of many procedures, such as laparoscopic or endoscopic ones, requires only the display of the virtual images on a computer screen [10]. Sound quality is also important for a complete VR experience. Volume and balance adjustments according to the user's movements are significant elements of the VR environment [1].

Input devices comprise all the devices that transmit the user's stimuli to the VR system. Gloves, joysticks, or other specialized tools (laparoscopic or endoscopic instruments) are usually employed. A special subcategory of necessary input devices for VR is sensors and trackers, which identify the position and direction of the user in space as an input stimulus, allowing interaction. They employ various technologies, such as lasers, infrared radiation, and mechanical detectors, and can detect the properties of the whole body, the head, or even subtle gestures and movements of the hands, an ability that is extremely useful in simulating medical procedures. The recent technological improvement of these devices that made the VR setting suitable for medical training is tactile feedback: adjusted force is exerted via the handling instruments, depending on the hand movements. Thus, the operator has a more realistic, bidirectional haptic experience when performing a virtual surgical or other interventional process [10].

Most AR hardware uses glasses that project virtual 3D images onto the real environment. While Google Glass and Epson Smart Glasses were the pioneering devices, newer devices, such as HoloLens (Microsoft), combine AR glasses with a system of tracking cameras and sensors. Tablets are an alternative platform suitable for AR applications: while the camera captures the physical world, virtual structures are added on the screen [10].


3. Quality control and educational impact

VR and AR simulators must meet quality standards to be suitable for medical education. However, even if these standards are met, the educational benefit is still not guaranteed a priori. Many published studies test VR and AR simulators in specific educational settings (concerning different medical fields and stages of training) in order to examine their validity, the transferability of the taught skills to the real world, the acceleration of the learning curve, and the retention period of the skills. Once these parameters are determined, the essential next step is to establish a curriculum that integrates this evidence in a productive and cost-efficient manner.

High-resolution images, quality sound, haptic input and feedback devices, and high processing power are the parameters required to create a realistic VR/AR environment. Moreover, the structures of the organs and tissues must be of high fidelity, and the changes in their shape, size, and angle of view must correspond completely to the user's moves and handlings. Otherwise, there is a risk that the moves and skills learnt will be inappropriate in real situations [6]. ISO criteria have also been established. They include an initial assessment of the educational needs and the establishment of a technical solution, which requires in-depth knowledge of the procedures or mechanisms to be simulated. After concept generation, the setting must be validated to examine whether it meets the educational expectations. Moreover, technical experts, medical experts, representatives of end-user groups, human factors experts, and designers should collaborate in an interdisciplinary team that designs, evaluates, and upgrades the VR/AR products [11, 12].

Furthermore, VR and AR systems meeting these standards have to undergo post-market validation studies, in order to examine whether they provide the educational effect they are expected to. In terms of medical education, the studies examine the following types of validity:

1. Face validity: the end users give subjective feedback on the quality and realism of the setting.

2. Content validity: again defined subjectively by the users' comments, but with emphasis on the content of the simulation (procedures, fidelity of organs, etc.).

3. Construct validity: this employs objective statistical methods that aim to correlate the actual skill level of the participants with their performance in the VR setting. Usually, two or more groups that differ only in skill level perform in the VR/AR setting and are evaluated, either by external human evaluators or, if feasible, by the VR system itself. A correlation means that the VR setting can effectively discriminate between novices and experienced users. This is extremely useful when VR/AR settings are used as assessment tools for the progress of trainees, as they provide an objective evaluation that may not require human evaluators.

4. Concurrent validity: another objective validity control, which aims to correlate performance in the VR/AR setting with existing evaluation tools. This practically checks whether the VR/AR evaluation results point in the same direction as other evaluation tools considered gold standards.

5. Predictive validity: the last but probably most useful objective validity control. Two homogeneous cohorts of trainees are formed. Only one receives a specific VR/AR training with predefined parameters (such as the simulator model, the procedure/topic learnt, the time, the frequency and difficulty level of the procedure, and the presence of a supervisor). The other cohort receives training with differently optimized parameters (perhaps another VR/AR simulator or even a completely different simulation method) or receives no training at all and serves as a control group. Subsequently, the skills of both cohorts are evaluated on real patients or in other simulation settings considered gold standards with proven discriminating ability. If a correlation exists, the VR setting has an educational effect that can indeed be transferred to real patients. Some studies evaluate the effect of the training after a long time period to examine the retention of the skills [8, 13].
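Construct validity testing of this kind is typically a rank-based comparison of two groups. The sketch below computes a Mann-Whitney-style U statistic by hand on hypothetical novice and expert simulator scores; all names and numbers are invented for illustration, and a real study would also apply a significance test rather than only inspecting the raw statistic.

```python
# Construct validity sketch: can a simulator score separate novices from experts?
# All scores below are hypothetical, not taken from any study in this chapter.

def mann_whitney_u(group_a, group_b):
    """Rank-based U statistic: counts the pairs in which a member of
    group_a outscores a member of group_b (ties count as 0.5)."""
    u = 0.0
    for a in group_a:
        for b in group_b:
            if a > b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u

novices = [42, 55, 48, 51, 60]   # simulator scores (0-100), hypothetical
experts = [74, 85, 80, 90, 78]   # simulator scores (0-100), hypothetical

u = mann_whitney_u(experts, novices)
u_max = len(novices) * len(experts)  # maximum possible U for these sample sizes

# Full separation (u == u_max) is the strongest possible evidence, in this toy
# sample, that the metric discriminates between skill levels.
print(u, u_max)  # 25.0 25
```

In practice one would use a library routine (e.g., a Mann-Whitney U test from a statistics package) to obtain a p-value; the hand-rolled version above only shows what the statistic measures.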

If a VR/AR simulator has confirmed predictive validity for acquiring a specific skill set, the next question is: "How much does VR/AR training contribute to the acceleration of skill acquisition?" This more complex question can be approached with learning curves. A learning curve is a graph that correlates the level of acquired skills with the amount of effort exerted (measured in time or in the number of trials or procedures performed). In general, it is desirable that the learning curve have a high plateau, meaning that the final level of skills acquired with this method is high, and also be steep, meaning that skills accelerate early with little effort. If a trainee who has completed training in the VR setting (namely, who has reached the plateau of the VR/AR learning curve) can then start training in other settings (e.g., other simulation settings or real patients) from a higher-than-basic level, then we assume that the VR/AR training offers transferable skills and has "accelerated" the learning curve. The common characteristics of the learning curves of most VR/AR simulators are a low plateau, transferability of skills, and a steep shape. A low plateau means that the total level of skills that can be acquired is lower than with other education methods; thus, VR/AR is usually appropriate for novices but not of any significant benefit to more experienced users. Fortunately, the curve usually has a steep shape and transferability properties, meaning that these skills are acquired quickly and can then be transferred to other education settings. In other words, VR/AR training compensates for some training period in the initial steps of other educational settings.
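The plateau-and-steepness language above can be made concrete with a simple saturating-exponential model. The sketch below is only an illustration of the concept: the model form and every parameter value are assumptions, not figures fitted to any simulator discussed in this chapter.

```python
import math

# Toy learning-curve model: skill(n) = plateau * (1 - exp(-n / k)),
# where n is the number of practice trials, `plateau` is the final skill
# level, and `k` controls steepness (smaller k = steeper curve).
# All parameter values below are illustrative assumptions.

def skill(n, plateau, k):
    return plateau * (1.0 - math.exp(-n / k))

def trials_to_fraction(plateau, k, fraction=0.9):
    """Smallest whole number of trials reaching `fraction` of the plateau."""
    n = 1
    while skill(n, plateau, k) < fraction * plateau:
        n += 1
    return n

# A typical VR simulator profile from the text: low plateau but a steep curve...
vr_trials = trials_to_fraction(plateau=60.0, k=3.0)     # steep: k = 3
# ...versus a slower training method with a higher ceiling.
trad_trials = trials_to_fraction(plateau=95.0, k=12.0)  # shallow: k = 12

print(vr_trials, trad_trials)  # 7 28
```

Under these assumed parameters, the steep VR curve reaches 90% of its (lower) plateau in a handful of trials, which is the sense in which VR training can "compensate for" the early part of a slower method's curve.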

After these setting-specific properties are evaluated, the next step is to define a curriculum that incorporates VR/AR methods in education. This curriculum must also take into account the cost of both VR/AR and alternative ways of training, as well as the patient risk in each setting. For example, if training on real patients carries a high risk, or animal-simulation models carry a high cost, a VR/AR simulator (with a low plateau and a steep learning curve) can compensate quickly for the initial steps, offering a safe, relatively inexpensive, and reproducible alternative that may provide transferable skills.

As there are many validation and evaluation studies in the literature, an overall profile of VR/AR applications can be formed. In general, VR offers an appropriate environment for training in hands-on procedures. As Table 1 shows, laparoscopic surgery, neurosurgery, catheterization techniques, and endoscopic procedures are the main fields of VR application. Novices benefit the most, while there is no significant learning effect in experienced users. VR offers great diversity within a single setting: the educational scenario can be reproduced with a variety of different optimization details, unrestricted by the physical world [1]. Although VR simulators are expensive to purchase, they can be used repeatedly at low cost, require less staff than other methods (animal models or real patients), and pose no risk to the patient [8, 12]. However, their use has certain disadvantages. In the common case of a system crash, the virtual experience is destroyed. From a technical point of view, simulators are still unable to present a virtual environment with absolute fidelity, as if it were real. Last but not least, wrong handling can be learnt, particularly under unsupervised learning. VR simulators can reconstruct a whole virtual surgery, specific procedural tasks, or even more abstract procedures [2].

Fields of educational training and VR/AR simulator examples:

  • Physiology and anatomy: Visible Human Project, Visible Korean Human, The Virtual Body, The Virtual Human Embryo, The Visible Human Server, AR glasses, mobile phone and tablet based AR applications
  • Open surgery: Virtual Reality Educational Surgical Tools
  • Laparoscopic surgery: MIST-VR, LaparoscopyVR™, LapMentor™, LapSim™, SINERGIA, Xitact LS500®, ProMIS®
  • Robotic surgery: RoSS™, DV-Trainer®, SEP Robot, da Vinci Skills Simulator (dVSS)™
  • Oesophagogastroduodenoscopy, colonoscopy, ERCP: GI Mentor™, EndoVR™, Olympus Endo TS-1
  • Neurosurgery: NeuroVR (NeuroTouch, NeuroTouch Cranio)™, ImmersiveTouch®, RoboSim, Vascular Intervention Simulation Trainer®, EasyGuide Neuro, ANGIO Mentor™, VIVENDI, Dextroscope®, Anatomical Simulator for Pediatric Neurosurgery
  • Interventional cardiology and cardiothoracic surgery: ANGIO Mentor™, Vascular Intervention Simulation Trainer (VIST)®, Vimedix (equipped with HoloLens)™, Nakao Cardiac Model, Minimally Invasive Cardiac Surgery Simulator, dVSS™, EchoCom
  • Urology: URO Mentor™, University of Washington TURP Trainer, UROSim™, PelvicVision TURP simulator, GreenLight laser simulator, Kansai HoLEP, ProMIS®
  • Orthopedics: ImmersiveTouch®, Phantom haptics interface®, Gaumard HAL S2001® and S3000® Mannequins, Novint Falcon®, Medtronic model, Arthro-VR®, Arthro MENTOR™, ArthroSIM, ArthroS™
  • Endovascular surgery: ANGIO Mentor™, Vascular Intervention Simulation Trainer (VIST)®, Cardio CT, SimSuite, Compass 2™
  • Obstetrics and gynecology: HystSim™, EssureSim™, AccuTouch (and newer version from CAE Healthcare), MIST-VR, LapSim™
  • ENT: OtoSim™, VOXEL-MAN (supports Phantom haptics), Ohio State University surgical simulator, Stanford Surgical Simulator, Mediseus, ImmersiveTouch, Endoscopic Sinus Surgery Simulator (supports Phantom haptics), Dextroscope®, dVSS™
  • Ophthalmology: EyeSi®, MicrovisTouch™, PhacoVision®
  • Intubation and bronchoscopy: EndoVR Simulator™, BRONCH Mentor™, ORSIM®

Table 1.

VR and AR simulator examples with respect to each educational field.

Moreover, many of the simulators incorporate tools for assessing the user's performance. They evaluate parameters such as path distance, operation time, movements, and errors and deliver quantified, timely, and objective metrics. If these simulators meet construct validity criteria, they can be used to objectively assess the trainee's performance. When this ability is not supported, human experts can take the role of the evaluator, and studies have shown that their ratings are also acceptable [8].
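As a rough illustration of how raw simulator parameters might be combined into a single objective score, the sketch below min-max normalizes each metric against best/worst reference values and takes a weighted average. Every metric name, reference range, weight, and trainee value here is a made-up assumption for illustration, not the scoring scheme of any simulator named in this chapter.

```python
# Composite performance score from raw simulator metrics (illustrative only).
# Each metric is "lower is better": it is min-max normalized against assumed
# best/worst reference values, then combined with assumed weights.

# (best, worst, weight) per metric -- all values are hypothetical.
REFERENCES = {
    "time_s":          (120.0, 600.0, 0.4),
    "path_distance_m": (1.5,   6.0,   0.3),
    "errors":          (0.0,   10.0,  0.3),
}

def composite_score(raw):
    score = 0.0
    for name, (best, worst, weight) in REFERENCES.items():
        value = min(max(raw[name], best), worst)       # clamp into range
        normalized = (worst - value) / (worst - best)  # 1.0 = best, 0.0 = worst
        score += weight * normalized
    return score

trainee = {"time_s": 240.0, "path_distance_m": 3.2, "errors": 2}
print(round(composite_score(trainee), 3))  # 0.727
```

A scheme along these lines only becomes meaningful once the resulting score demonstrates construct validity, i.e., reliably separates novices from experienced users.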

Although AR finds many clinical applications in medicine, its implementation in medical education is not as extensive (see Table 1). However, tablets and mobile phones host AR applications that enrich traditional reading with pop-up videos, links, and interactive material [14]. Moreover, in medical training there are AR applications that project traumas and other lesions onto healthy humans, aiming to train students and residents in the management of these situations. However, interventional procedures are not performed, and the AR applications are restricted to teaching the theoretical context of these subjects in an augmented-reality setting [15].


4. Applications in educational fields

4.1. Preclinical teaching

Anatomy and physiology are the classic paradigms with the majority of VR/AR applications [6]. A milestone in the development of VR/AR applications that study human structures and mechanisms was the construction of massive online databases of human images and information, mostly obtained with CT/MRI scanning. The first such project, the Visible Human Project, was created by the University of Colorado in 1991. The male and female versions of the project contain over 7000 digital anatomical images and occupy over 50 gigabytes of space. The National Library of Medicine made this platform free and accessible [6, 16]. Other similar models followed, such as the Korean model Visible Korean Human, The Virtual Body, The Virtual Human Embryo, and The Visible Human Server. Anatomy teaching was enhanced with VR applications that drew data from these databases. A similar effort was made to reconstruct intracellular organelles for physiology [2, 10]. Structures and mechanisms are digitally simulated and presented in four dimensions (space and time); this contemporary visualization can be called "in silico" biology. Examples include the dynamic simulation of neurons, membranes, and cardiovascular system parameters, including heart rate, blood pressure, contractility, and vascular resistance [16]. Anatomy also uses tablet-/smartphone-based AR applications that project extensive information, visual 3D structures, and links onto the traditional pages of anatomy textbooks. Recently, other hardware platforms, such as the HoloLens glasses by Microsoft, started to support relevant applications [19]. One application allows the interactive 3D exploration of the human brain, reconstructed from MRI data; hand gestures allow deformation, fly-by views, and other interactions with the 3D model to reveal hidden structures [10].

4.2. Surgery

VR and AR find most of their applications in surgery, particularly in laparoscopic surgery training. Endoscopic procedures and neurosurgery are also popular. Fidelity and realism are extremely important in these fields, not only for the comfort of the user but also for guaranteeing that no improper handling will be learnt and that the actual stressful conditions will be recreated. VR/AR simulators demonstrate certain advantages, such as the minimal cost per use, the absence of ethical issues, and their safety compared to training on actual patients. Moreover, greater diversity and complexity in the procedures can be achieved. However, the cost of purchase and maintenance is high, and despite the rapid advances in VR/AR technologies, the simulation still falls short of representing operating room settings with high fidelity [17].

4.2.1. Laparoscopic surgery

The current VR laparoscopic simulators are characterized as "hybrid," as they combine real instruments with a virtual operating field projected on a screen or other display device. They provide haptic feedback to users according to their handlings, and most have assessment systems that measure parameters such as the time to complete a task, the errors made during surgery, and the surgeon's economy of movement. They can have both "task trainer" modules, for training in simpler and more abstract handlings, and "complete operation" modules that simulate the whole operating procedure [18]. They seem to benefit novices, especially when intensive feedback from the instructor is provided [19]. The following simulators are among the most popular:

Minimally Invasive Surgical Trainer-Virtual Reality (MIST-VR, Wolfson Centre and VR Solutions): it provides a 3D environment for practicing abstract tasks and moves with a cylinder and a ball. It can be adjusted to six difficulty levels and can teach basic laparoscopic skills like suturing and knot tying. It is a low-fidelity simulator that does not support force feedback [18, 20]. It has demonstrated construct validity but controversial transferability of skills [21, 22].

LaparoscopyVR™ (LapVR™, Immersion): this simulator supports haptic hardware for force feedback. Studies show that inexperienced surgeons acquire skills that can be transferred to real procedures. The LapVR™ system allows both individual and team training and offers training in six basic handlings with adjustable levels of difficulty: camera navigation, using the hook electrode, cutting, clipping, suturing, and knot tying. It also has a module for laparoscopic cholecystectomy with 18 alternative cases and 3 levels of difficulty, as well as modules for ectopic pregnancy, tubal occlusion, and adnexal pathology [2, 20].

LapMentor™ (3D Systems, formerly Simbionix): it targets both novices and experienced surgeons, teaching everything from basic laparoscopic skills to complete laparoscopic operations. It also incorporates an assessment system that provides metrics to the user. It is characterized by high fidelity and the ability to provide haptic feedback. The modules include basic laparoscopic skills, suturing, laparoscopic cholecystectomy, ventral hernia, gastric bypass, and gynecology cases. Information on tissue handling, tool handling, time, and movement efficiency is obtained during the simulation and processed to produce metrics [2, 20]. It has demonstrated construct validity, predictive validity, and transferability of technical skills, especially when combined with supervision by instructors [21, 23].

LapSim™ (Surgical Science): this system includes basic skill modules, anastomosis and suture scenarios, and laparoscopic cholecystectomy scenarios, with one case dedicated to gynecology. It is a high-fidelity simulator with the ability to transfer functions between instructors and institutions. It has shown good construct and predictive validity and transferability of some skills [2, 18, 20, 21].

The sustainability of the skills obtained with VR simulators is supported by only a few studies with small samples and short follow-up periods [24]. Other VR/AR models are SINERGIA, Xitact LS500® (Xitact), and ProMIS® (Haptica) [13, 20, 21].

The main application of AR in laparoscopic surgical training is telementoring: the supervisor teaches the trainee by indicating the proper surgical moves, paths, and handlings on the AR screen. This information is displayed to the trainee and guides them [25]. However, research has shown that this might also distract the user; one study proposes displaying the information on an additional sub-monitor rather than on a single head-up display [26].

4.2.2. Robotic surgery

RoSS™ (Simulated Surgical Systems) and DV-Trainer® (Mimic Technologies) are robotic surgery VR trainers that have shown face and content validity. DV-Trainer®, along with SEP Robot (SimSurgery) and da Vinci Skills Simulator™ (dVSS™, Intuitive Surgical), provides metrics about the trainees’ performance with construct validity. RoSS™ is the only one that incorporates whole procedural tasks [27].

As the current robotic surgery systems do not rely on tactile feedback, their simulators are also freed from this need. However, for educational purposes, the color of the surgical field changes depending on the force that the user applies. Training in robotic surgery with VR simulators seems to provide better and transferable surgical skills, as shown by some studies [28, 29]. Novices in particular are found to benefit from training on robotic simulators [30].

4.2.3. Open surgery

There is limited variety in open surgery VR simulators and a lack of validation studies, probably because the complexity and hands-on nature (without the involvement of machine systems) of open surgery operations do not provide an easily simulated environment. Virtual Reality Educational Surgical Tools (VREST) has developed an open surgery simulator for training and assessment. It consists of two haptic devices with haptic feedback enabled and a system that displays 3D images. The software allows the teacher to customize the scenarios. The trainees decide which instruments to use and are assessed by the machine. The first module built on the VREST platform was an inguinal hernia repair according to Lichtenstein. A similar simulator, developed by Imperial College London, has shown construct validity [31].

4.3. Endoscopic procedures

4.3.1. Oesophagogastroduodenoscopy (OGD)

GI Mentor™ (3D Systems, formerly Simbionix) is the only validated VR simulator for upper endoscopy training. Although beneficial for novices, it is of limited benefit for experienced users. Procedure time, time to reach specific landmarks, intubation time, movement techniques, procedural success rates, and other patient-related outcomes are evaluated in its metrics. Studies show that trainees achieve a significantly lower overall procedure time and significantly improved technical accuracy compared with controls [32].

4.3.2. Colonoscopy

More VR simulators are available for colonoscopy training, and the evidence supporting the use of VR trainers in the early stages of training is well established. The number of repetitions required to reach the plateau of VR training is 7–10, depending on the study; after this checkpoint there must be a switch to traditional endoscopy training. The combination of the two methods has been found more beneficial than either alone, and the skills acquired with VR training were maintained for at least 9 months [33]. GI Mentor™, EndoVR™ (CAE Healthcare; formerly AccuTouch®, Immersion), and Olympus Endo TS-1 (Olympus) are among the simulators that have been studied and have shown construct and predictive validity [21, 33]. GI Mentor™ has been chosen by SAGES as the platform for the Fundamentals of Endoscopic Surgery™ (FES™) examination [34].

4.3.3. Endoscopic retrograde cholangiopancreatography (ERCP)

GI Mentor™ (3D Systems, formerly Simbionix) is the only validated simulator for ERCP among six devices in total. It has been found to be beneficial in acquiring transferable skills and has the potential to be included in training programs, although it lags behind bench-model trainers in realism, usefulness, and performance [32, 35]. EndoVR™ is another popular simulator with ERCP modules available.

4.4. Neurosurgery

Neurosurgery training is a field where VR and AR simulators are commonly found. Such simulators must be comfortable enough to wear for many hours; extremely realistic, so that they effectively represent the detailed neurosurgical structures; and affordable [36]. They demonstrate several advantages compared to physical models (animal or bench models): a lower cost per use, no limit on repetitions, and a great sense of variety and diversity in the simulated cases. However, the resolution and realism of the constructed world are still issues to be resolved [37].

NeuroVR™ (CAE Healthcare): a neurosurgical training platform that simulates microdissections, tumor aspiration, and hemostasis. It was first introduced as NeuroTouch, a platform designed by over 50 experts from the National Research Council Canada [18]. It provides complex computer-generated metrics in 13 categories and is thus suitable for use as an assessment tool. As there seems to be a plateau in acquired skills, the educational benefit may be limited to early learners. Other specialized variations of the simulator also exist, such as NeuroTouch Cranio, a simulator for brain tumor resection. The trainer provides haptic feedback and measures the trainee's performance [37].

ImmersiveTouch® (ImmersiveTouch) is an AR simulator used for training in many surgical disciplines, including spine surgery, neurosurgery, ENT surgery, and ophthalmology [13]. It consists of two physical instruments, which can simulate various surgical instruments, and a system that projects the virtual surgical field onto the hands and instruments of the user in an interactive way. It has been found to be a valid training method for thoracic pedicle screw placement (face and predictive validity), aneurysm clipping (face validity), and percutaneous trigeminal rhizotomy (construct validity) [13, 38]. Moreover, it has a positive learning effect and can accelerate the learning curve in many neurosurgical procedures, such as ventriculostomy, vertebroplasty, and finding the anatomical landmarks for various interventional techniques [39, 40].

Various other VR/AR simulators exist. RoboSim, Vascular Intervention Simulation Trainer® (VIST®, Mentice), EasyGuide Neuro (Philips Medical Systems), ANGIO Mentor™ (3D Systems, formerly Simbionix), VIVENDI (University of Tübingen, Germany), Dextroscope® (Bracco AMT), and the Anatomical Simulator for Pediatric Neurosurgery (ASPN, Pro Delphus) are commercially or academically developed systems that simulate neurosurgical procedures exclusively or also incorporate additional modules [38, 41, 42].

4.5. Interventional cardiology and cardiothoracic surgery

ANGIO Mentor™ (3D Systems, formerly Simbionix): it provides a virtual field created from CT/MRI images of real patients and data from cardiac intervention procedures (balloon angioplasty, coronary stenting). Standard guidewires, catheters, balloons, stents, and similar devices are used with the system. Monitoring the patient’s vital signs and electrocardiographic changes is available. ANGIO Mentor™ is a robust simulator used in many endovascular procedures, such as carotid stenting and renal, iliac, and other vascular interventions [43].

Vascular Intervention Simulation Trainer® (VIST®, Mentice): it is similar to the ANGIO Mentor™, containing a library of virtual patients, tools, and procedures. It supports tactile feedback and generates fluoroscopic images. It also supports a variety of other endovascular procedures, along with an electrophysiology module for training in pacemaker lead placement [43].

Vimedix™ (CAE Healthcare): it is an AR system for training in echocardiography. It consists of a mannequin and a transducer for transthoracic or transesophageal echocardiography. It has displayed face validity. Recently, CAE Healthcare equipped it with HoloLens to allow viewing the images through glasses, unrestricted by the dimensions of the screen [13]. EchoCom is another echocardiography AR trainer, but for neonates. It has shown face, content, and construct validity [13].

Nakao Cardiac Model is a VR simulator for training in surgical palpation of the beating heart. It supports haptic interaction but does not include metrics tools and does not offer variability. The Minimally Invasive Cardiac Surgery Simulator by Berlage and colleagues is another VR simulator, which virtually projects the beating heart, ribs, and chest surface based on the patient’s own CT images. It is not interactive but supports variability of the simulated environments. A VR lobectomy simulator was developed by Solomon and colleagues. It covers video-assisted thoracoscopic surgery resection and lobectomy; it is controlled with haptic devices, and it incorporates metrics tools that track performance, measure surgical times and errors, and ask questions [44].

dVSS™: cardiothoracic surgery also employs VR robotic training. A study found that the group previously trained in dVSS™ performed better in robotic internal thoracic artery harvest and mitral valve annuloplasty than the control group [45].

4.6. Urology

URO Mentor™ (3D Systems, formerly Simbionix): a VR simulator for training in many urologic fields. It has shown educational benefit in cystoscopy and ureteroscopy [46]. It also supports several other modules, including essential skills, stone extraction, stone lithotripsy, stricture cutting, and biopsy modules. Face and content validity have been confirmed, and it is considered a realistic and useful training model for endourological procedures.

University of Washington TURP Trainer (University of Washington): this VR trainer was designed for training in transurethral resection of the prostate (TURP). It has been extensively validated and has shown face and construct validity. It is the only model to have shown concurrent validity with other assessment tools in studies with large numbers of participants [47, 48].

UROSim™ (VirtaMed): a commercial TURP VR simulator that has recurrently shown face, content, and construct validity. It simulates endoscopic movement and allows users to perform resection, coagulation, and also complete TURP operations. It can be adjusted to different levels of difficulty and has advanced metrics tools [47, 48].

Other VR-based TURP simulators are the PelvicVision TURP simulator, the GreenLight laser simulator (GreenLight SIM, American Medical Systems Inc.), and the Kansai HoLEP simulator. All of these demonstrate face and construct validity [47, 48].

A module employing the Phantom® haptics interface (Geomagic) was developed for digital rectal examination; however, further research is required. The University of Grenoble developed a VR simulator for transrectal ultrasound-guided prostate biopsy in 2013, and several systems have been constructed since [10]. Moreover, the ProMIS® laparoscopic VR trainer has displayed construct validity in urologic procedures [49]. Studies comparing the educational effect of VR trainers with that of bench models found no difference in the early training stages; VR trainers are far more expensive than bench models but also offer greater diversity [50]. AR has limited applications in urology training, as it is only used to teach the procedural steps theoretically. A relevant application based on Google Glass trains urology residents in placing an inflatable penile prosthesis [51].

4.7. Orthopedics

The ImmersiveTouch® simulator is used in a variety of spine surgery procedures, including thoracic pedicle screw placement and percutaneous spinal needle placement. Phantom® haptics interface is used for spinal needle insertion. Gaumard HAL S2001® and HAL S3000® Mannequins and Novint Falcon® (Novint Technologies) are systems for vertebroplasty simulation training. A model from Medtronic has also been used in pedicle screw placement and placement of lateral mass screws [52].

Three VR models are validated and commercially available for training in arthroscopy. Arthro MENTOR™ (3D Systems, formerly Simbionix), the evolution of ARTHRO-VR® (GMV), and ArthroSIM have both displayed face, content, and construct validity [53, 54, 55, 56]. ArthroS™ (Virtamed) has demonstrated greater face validity than Arthro MENTOR™ and ArthroSIM in a head-to-head comparison study [54].

4.8. Endovascular surgery

Usually, endovascular VR trainers have metric tools that evaluate procedural time, the volume of contrast used, and fluoroscopy time. They provide many modules, including angioplasty and stenting of coronary, renal, iliac, and femoral vessels. They also seem to benefit inexperienced users, and the skills acquired seem to be transferable [57].
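As a rough illustration of how such metric tools might combine procedural time, fluoroscopy time, and contrast volume into a single score, consider the sketch below. The weights, reference values, and scoring rule are purely hypothetical and do not describe any commercial simulator; the point is only that each metric is penalized in proportion to how far it exceeds a reference value.

```python
from dataclasses import dataclass

@dataclass
class TrialMetrics:
    procedure_time_s: float    # total procedural time (seconds)
    fluoroscopy_time_s: float  # fluoroscopy exposure time (seconds)
    contrast_volume_ml: float  # contrast agent used (milliliters)

# Illustrative reference values (invented for this sketch).
REFERENCE = TrialMetrics(1800, 600, 150)

def composite_score(m: TrialMetrics,
                    w_time=0.4, w_fluoro=0.4, w_contrast=0.2) -> float:
    """Score in [0, 100]: 100 means at or better than the reference on
    every metric; exceeding a reference reduces the score in proportion
    to the overshoot, weighted by that metric's (invented) weight."""
    def penalty(value, reference):
        return max(0.0, value / reference - 1.0)
    total = (w_time * penalty(m.procedure_time_s, REFERENCE.procedure_time_s)
             + w_fluoro * penalty(m.fluoroscopy_time_s, REFERENCE.fluoroscopy_time_s)
             + w_contrast * penalty(m.contrast_volume_ml, REFERENCE.contrast_volume_ml))
    return max(0.0, 100.0 * (1.0 - total))

print(composite_score(TrialMetrics(1800, 600, 150)))  # at reference: 100.0
```

Real simulators combine far richer data (errors, instrument paths, tissue handling), but a weighted-penalty score of this kind shows how a trainer can reduce several measures to one comparable number.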

ANGIO Mentor™ (3D Systems, formerly Simbionix): it contains 23 different endovascular procedures and over 150 patient scenarios. It simulates the environment of endovascular procedures performed under fluoroscopy in the cath lab, interventional suite, or operating room. It supports haptic feedback and can benefit trainees of different disciplines, including interventional cardiology, interventional radiology, vascular surgery, electrophysiology, neurosurgery, interventional neuroradiology, and trauma. Training with it has improved procedure time, fluoroscopy time, and contrast volume in many procedures, such as carotid artery stenting, renal artery stenting, and cerebral artery angioplasty [42, 58].

Vascular Intervention Simulation Trainer® (VIST®, Mentice): as already mentioned, it contains many training modules. It also supports haptic feedback and offers many difficulty levels. Studies have shown improvement of users’ performance in many endovascular procedures, including carotid artery, cerebral artery, renal artery, iliofemoral artery, aortic, and superficial femoral artery procedures [42, 58].

Cardio CT (Shina System), SimSuite (Medical Simulation Corporation), and the newer Compass™ 2 (Medical Simulation Corporation) are other VR simulators that have not been so extensively studied [58].

4.9. Obstetrics and gynecology

HystSim™ and the newer EssureSim™ (both by VirtaMed) are VR simulators for training in hysteroscopic procedures. Studies have attributed face, construct, and predictive validity to them and have also shown that the skills taught are retained for at least 2 weeks [59, 60]. A model from CAE Healthcare (the old AccuTouch, Immersion) has been used in hysteroscopy training and has shown construct validity and the ability to improve performance equally with box trainers [59].

MIST-VR might be suitable for training in gynecology, as it has shown positive results in other fields [22]. LapSim™ has also been used in salpingectomy training and has shown construct validity [61]. AR has no significant applications in gynecology training.

4.10. ENT

OtoSim™ (OtoSim Inc.): it is used in otoscopy training, offering a wide range of realistic images of the tympanic membrane and of various middle ear lesions. It was found to have good face validity [62].

VOXEL-MAN surgical simulator (Hamburg, Germany): it is a computer-based interface that can be used for temporal bone drilling and mastoidectomy. It supports Phantom® haptic devices for tactile feedback and 3D visual interfaces. It has demonstrated face, content, and construct validity [62, 63, 64]. Other similar simulators that have displayed validity are the Ohio State University surgical simulator, the Stanford Surgical Simulator (Stanford BioRobotics lab), and the Mediseus surgical simulator (CSIRO/University of Melbourne Temporal Bone Simulator) [62, 63, 64].

Endoscopic Sinus Surgery Simulator (ES3, Lockheed Martin): it is used in sinus surgery, enabling haptic feedback with Phantom®. It collects and analyzes performance data, providing performance metrics. It consists of three components: a workstation simulation platform (Silicon Graphics Inc.), a haptic controller, and an electromechanical human interaction platform with a replica endoscope, surgical tools, and a mannequin. It is among the most widely validated sinus simulators, demonstrating content, concurrent, construct, and predictive validity and significant retention of the acquired skills [62, 64]. Dextroscope® is also used in sinus surgery and has shown predictive validity [64].

dVSS™ is also used in robotic ENT surgery training. Tonsillectomy, fine-needle biopsy of the thyroid, myringotomy, nasal endoscopy, and cricothyroidotomy VR simulators have also been developed [62, 65].

4.11. Ophthalmology

Cataract and vitreoretinal surgery are the main fields of VR training in ophthalmology. The EyeSi® simulator (VRmagic) is the most commonly used system. It includes two pedals (one for the microscope and one for the phacoemulsification/vitrectomy/infusion and aspiration modes), a physical head and eye, and a microscope, all connected to a computer. It does not support tactile feedback; however, it demonstrates construct and predictive validity. Its educational effect is restricted to novices [66, 67, 68]. MicrovisTouch™ (ImmersiveTouch) and PhacoVision® (Melerit Medical) are additional models, of which only the MicrovisTouch™ supports haptic feedback [66]. AR is still of limited use, although one study found an educational benefit when AR was applied in ophthalmoscopy teaching [69].

4.12. Other fields

Pediatrics and emergency medicine are promising fields for the integration of VR training. Rigorous work is being done to create realistic virtual humans and virtual pediatric models. Moreover, highly realistic virtual environments that recreate mass-casualty situations are being developed for use in emergency medicine [70, 71].

AR has been used in intubation training. More modern systems, such as BRONCH Mentor™ (3D Systems, formerly Simbionix) and EndoVR Simulator™, include intubation modules and have shown positive educational effect [72].

Ultrasonography simulators have been used for assessing trainees. One example is the ScanTrainer® (MedaPhor), which consists of a haptic force-feedback-enabled device and a computer with a touchscreen that displays ultrasonography images according to the user’s movements [73].

EndoVR Simulator™, BRONCH Mentor™, and ORSIM® (Airway Simulation Limited) are some of the systems used for training in bronchoscopy [62]. Most of them have shown positive learning effects. BRONCH Mentor™ offers a wider variety of pathologies and clinical situations and a more realistic environment. ORSIM® is a small, portable simulator with a physical bronchoscope and an interface based on regular laptops [74].

Some AR applications that indicate the landmarks for central venous catheterization have been developed: an ultrasound probe locates the vein, and a microprojector projects it onto the skin. A study found a positive learning effect when such a system was used for training in landmark finding [75].


5. Conclusion and future perspectives

Most of the educational applications of VR and AR seem to have construct and predictive validity, with the acquired skills transferable to real situations. However, the credibility of several of these studies might be questionable. First of all, many of them are not randomized studies, having cohorts with differing characteristics and inadequate numbers of participants (fewer than 30 in most studies). Moreover, there are few studies for each simulator, and their designs follow no common standards, so they cannot be summarized and directly compared. In addition, many of the studies referring to specific simulators become quickly outdated, as they do not take into account the simulators’ continuous upgrades. More randomized controlled trials are needed, comparing the effect of VR training against no training, other simulation-based training, or different VR training systems. The samples must be larger to strengthen the results, and the designs of different studies must be similar [5, 7, 8, 21]. Moreover, the extent of the decay of the skills over time must be elucidated. When these properties, along with the cost factors, are clarified, we can then examine how VR and AR can be officially incorporated into medical education curricula.
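The call for larger samples can be made concrete with a standard two-sample power calculation. The sketch below uses the usual normal-approximation formula for comparing two group means (e.g., a VR-trained group versus controls); the effect sizes plugged in are illustrative, not taken from the studies cited above.

```python
import math

def n_per_group(effect_size_d: float) -> int:
    """Participants needed per group to detect a standardized effect
    size d with two-sided alpha = 0.05 and power = 0.80, using the
    normal approximation n = 2 * ((z_{0.975} + z_{0.80}) / d)^2."""
    z_alpha = 1.959964  # normal quantile for two-sided alpha = 0.05
    z_beta = 0.841621   # normal quantile for power = 0.80
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size_d) ** 2)

# A medium effect (d = 0.5) needs about 63 participants per group;
# even a large effect (d = 0.8) needs about 25 per group.
print(n_per_group(0.5), n_per_group(0.8))
```

By this calculation, studies with fewer than 30 total participants can only reliably detect very large effects, which supports the concern about underpowered simulator studies.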

However, the published literature suggests a positive educational impact. VR/AR training displays certain advantages over other simulation techniques. Although expensive to buy, VR/AR simulators provide a relatively low-cost opportunity for reproducible training under various environments and difficulty levels. Moreover, they do not raise ethical issues, unlike animal and living-tissue simulation models. They provide immersion for the user and the ability to perform complete procedures, in contrast with partial task trainers. Multiple studies have shown a favorable impact of VR/AR trainers on inexperienced trainees, and we can reasonably assume that they will keep evolving at a high pace as the technology progresses. Future improvements could include the integration of olfactory stimuli. Odors can be used as diagnostic tools or even to recreate stressful conditions (e.g., in combat or in the operating room) with greater realism [1, 8]. Medical informatics is also an evolving field. Medical data will be visualized more clearly and impressively with VR/AR technology. Users will be able to dive into statistical plots and reports, watch them in 4D (in 3D space and time), manipulate them, and “wander” around them. Although significant progress has been made, there is still a need for more processing power, higher resolution, better design of the scenarios, and more advanced haptic devices in order to achieve highly realistic environments [31, 76].

References

  1. Beolchi L, Riva G. Virtual reality for health care. In: Akay M, Marsh A, editors. Information Technologies in Medicine. Vol. 2. New York, USA: John Wiley & Sons, Inc.; 2001. pp. 39-83. DOI: 10.1002/0471206458.ch3
  2. Graur F. Virtual reality in medicine — Going beyond the limits. In: Sik-Lanyi C, editor. Thousand Faces Virtual Real. Rijeka, Croatia: InTech; 2014. pp. 23-35. DOI: 10.5772/59277
  3. Satava RM. Virtual reality surgical simulator. Surgical Endoscopy. 1993;7:203-205
  4. Langreth R. Virtual reality: Head mounted distress. Popular Science. 1994;5:49
  5. Vozenilek J. See one, do one, teach one: Advanced technology in medical education. Academic Emergency Medicine. 2004;11:1149-1154. DOI: 10.1197/j.aem.2004.08.003
  6. Riva G. Applications of virtual environments in medicine. Methods of Information in Medicine. 2003;42:524-534
  7. Kamphuis C, Barsom E, Schijven M, Christoph N. Augmented reality in medical education? Perspectives on Medical Education. 2014;3:300-311. DOI: 10.1007/s40037-013-0107-7
  8. Sakakushev BE, Marinov BI, Stefanova PP, Kostianev SS, Georgiou EK. Striving for better medical education: The simulation approach. Folia Medica (Plovdiv). 2017;59:123-131. DOI: 10.1515/folmed-2017-0039
  9. Hodges LF, Kooper R, Meyer TC, Rothbaum BO, Opdyke D, de Graaff JJ, et al. Virtual environments for treating the fear of heights. Computer. 1995;28:27-34. DOI: 10.1109/2.391038
  10. Hamacher A, Kim SJ, Cho ST, Pardeshi S, Lee SH, Eun S-J, et al. Application of virtual, augmented, and mixed reality to urology. International Neurourology Journal. 2016;20:172-181. DOI: 10.5213/inj.1632714.357
  11. ISO 9241-210:2010. Ergonomics of human-system interaction – Part 210: Human-centred design for interactive systems. Available from: https://www.iso.org/standard/52075.html
  12. Persson J. A review of the design and development processes of simulation for training in healthcare – A technology-centered versus a human-centered perspective. Applied Ergonomics. 2017;58:314-326. DOI: 10.1016/j.apergo.2016.07.007
  13. Barsom EZ, Graafland M, Schijven MP. Systematic review on the effectiveness of augmented reality applications in medical training. Surgical Endoscopy. 2016;30:4174-4183. DOI: 10.1007/s00464-016-4800-6
  14. Wang LL, Wu H-H, Bilici N, Tenney-Soeiro R. Gunner goggles: Implementing augmented reality into medical education. Studies in Health Technology and Informatics. 2016;220:446-449
  15. Albrecht U-V, Noll C, von Jan U. Explore and experience: Mobile augmented reality for medical training. Studies in Health Technology and Informatics. 2013;192:382-386
  16. Trelease RB. Anatomical informatics: Millennial perspectives on a newer frontier. The Anatomical Record. 2002;269:224-235. DOI: 10.1002/ar.10177
  17. Evans CH, Schenarts KD. Evolving educational techniques in surgical training. The Surgical Clinics of North America. 2016;96:71-88. DOI: 10.1016/j.suc.2015.09.005
  18. Badash I, Burtt K, Solorzano CA, Carey JN. Innovations in surgery simulation: A review of past, current and future techniques. Annals of Translational Medicine. 2016;4:453-453. DOI: 10.21037/atm.2016.12.24
  19. Paschold M, Huber T, Zeißig SR, Lang H, Kneist W. Tailored instructor feedback leads to more effective virtual-reality laparoscopic training. Surgical Endoscopy. 2014;28:967-973. DOI: 10.1007/s00464-013-3258-z
  20. Larsen CR, Oestergaard J, Ottesen BS, Soerensen JL. The efficacy of virtual reality simulation training in laparoscopy: A systematic review of randomized trials: Virtual reality simulation: A review. Acta Obstetricia et Gynecologica Scandinavica. 2012;91:1015-1028. DOI: 10.1111/j.1600-0412.2012.01482.x
  21. Beyer-Berjot L, Aggarwal R. Toward technology-supported surgical training: The potential of virtual simulators in laparoscopic surgery. Scandinavian Journal of Surgery. 2013;102:221-226. DOI: 10.1177/1457496913496494
  22. Hammond I, Karthigasu K. Training, assessment and competency in gynaecologic surgery. Best Practice & Research. Clinical Obstetrics & Gynaecology. 2006;20:173-187. DOI: 10.1016/j.bpobgyn.2005.09.006
  23. Cole SJ, Mackenzie H, Ha J, Hanna GB, Miskovic D. Randomized controlled trial on the effect of coaching in simulated laparoscopic training. Surgical Endoscopy. 2014;28:979-986. DOI: 10.1007/s00464-013-3265-0
  24. Khan MW, Lin D, Marlow N, Altree M, Babidge W, Field J, et al. Laparoscopic skills maintenance: A randomized trial of virtual reality and box trainer simulators. Journal of Surgical Education. 2014;71:79-84. DOI: 10.1016/j.jsurg.2013.05.009
  25. Vera AM, Russo M, Mohsin A, Tsuda S. Augmented reality telementoring (ART) platform: A randomized controlled trial to assess the efficacy of a new surgical education technology. Surgical Endoscopy. 2014;28:3467-3472. DOI: 10.1007/s00464-014-3625-4
  26. Dixon BJ, Daly MJ, Chan HHL, Vescan A, Witterick IJ, Irish JC. Inattentional blindness increased with augmented reality surgical navigation. American Journal of Rhinology & Allergy. 2014;28:433-437. DOI: 10.2500/ajra.2014.28.4067
  27. Lallas CD, Davis JW, and Members of the Society of Urologic Robotic Surgeons. Robotic surgery training with commercially available simulation systems in 2011: A current review and practice pattern survey from the society of urologic robotic surgeons. Journal of Endourology. 2012;26:283-293. DOI: 10.1089/end.2011.0371
  28. Kiely DJ, Gotlieb WH, Lau S, Zeng X, Samouelian V, Ramanakumar AV, et al. Virtual reality robotic surgery simulation curriculum to teach robotic suturing: A randomized controlled trial. Journal of Robotic Surgery. 2015;9:179-186. DOI: 10.1007/s11701-015-0513-4
  29. Vaccaro CM, Crisp CC, Fellner AN, Jackson C, Kleeman SD, Pavelka J. Robotic virtual reality simulation plus standard robotic orientation versus standard robotic orientation alone: A randomized controlled trial. Female Pelvic Medicine & Reconstructive Surgery. 2013;19:266-270. DOI: 10.1097/SPV.0b013e3182a09101
  30. Liu M, Curet M. A review of training research and virtual reality simulators for the da Vinci surgical system. Teaching and Learning in Medicine. 2015;27:12-26. DOI: 10.1080/10401334.2014.979181
  31. Davies J, Khatib M, Bello F. Open surgical simulation—A review. Journal of Surgical Education. 2013;70:618-627. DOI: 10.1016/j.jsurg.2013.04.007
  32. van der Wiel SE, Küttner Magalhães R, Rocha Gonçalves CR, Dinis-Ribeiro M, Bruno MJ, Koch AD. Simulator training in gastrointestinal endoscopy – From basic training to advanced endoscopic procedures. Best Practice & Research. Clinical Gastroenterology. 2016;30:375-387. DOI: 10.1016/j.bpg.2016.04.004
  33. Harpham-Lockyer L. Role of virtual reality simulation in endoscopy training. World Journal of Gastrointestinal Endoscopy. 2015;7:1287. DOI: 10.4253/wjge.v7.i18.1287
  34. Hashimoto DA, Petrusa E, Phitayakorn R, Valle C, Casey B, Gee D. A proficiency-based virtual reality endoscopy curriculum improves performance on the fundamentals of endoscopic surgery examination. Surgical Endoscopy. DOI: 10.1007/s00464-017-5821-5
  35. Ekkelenkamp VE, Koch AD, de Man RA, Kuipers EJ. Training and competence assessment in GI endoscopy: A systematic review. Gut. 2016;65:607-615. DOI: 10.1136/gutjnl-2014-307173
  36. Pelargos PE, Nagasawa DT, Lagman C, Tenn S, Demos JV, Lee SJ, et al. Utilizing virtual and augmented reality for educational and clinical enhancements in neurosurgery. Journal of Clinical Neuroscience. 2017;35:1-4. DOI: 10.1016/j.jocn.2016.09.002
  37. Konakondla S, Fong R, Schirmer CM. Simulation training in neurosurgery: Advances in education and practice. Advances in Medical Education and Practice. 2017;8:465-473. DOI: 10.2147/AMEP.S113565
  38. Rehder R, Abd-El-Barr M, Hooten K, Weinstock P, Madsen JR, Cohen AR. The role of simulation in neurosurgery. Child’s Nervous System. 2016;32:43-54. DOI: 10.1007/s00381-015-2923-z
  39. Yudkowsky R, Luciano C, Banerjee P, Schwartz A, Alaraj A, Lemole GMJ, et al. Practice on an augmented reality/haptic simulator and library of virtual brains improves residents’ ability to perform a ventriculostomy. Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare. 2013;8:25-31. DOI: 10.1097/SIH.0b013e3182662c69
  40. Alaraj A, Charbel FT, Birk D, Tobin M, Luciano C, Banerjee PP, et al. Role of cranial and spinal virtual and augmented reality simulation using immersive touch modules in neurosurgical training. Neurosurgery. 2013;72(Suppl 1):115-123. DOI: 10.1227/NEU.0b013e3182753093
  41. Kirkman MA, Ahmed M, Albert AF, Wilson MH, Nandi D, Sevdalis N. The use of simulation in neurosurgical education and training: A systematic review. Journal of Neurosurgery. 2014;121:228-246. DOI: 10.3171/2014.5.JNS131766
  42. Mitha AP, Almekhlafi MA, Janjua MJJ, Albuquerque FC, McDougall CG. Simulation and augmented reality in endovascular neurosurgery: Lessons from aviation. Neurosurgery. 2013;72(Suppl 1):107-114. DOI: 10.1227/NEU.0b013e31827981fd
  43. Lake CL. Simulation in cardiology and cardiothoracic and vascular surgery. Seminars in Cardiothoracic and Vascular Anesthesia. 2005;9:325-333. DOI: 10.1177/108925320500900405
  44. Trehan K, Kemp CD, Yang SC. Simulation in cardiothoracic surgical training: Where do we stand? The Journal of Thoracic and Cardiovascular Surgery. 2014;147:18-24.e2. DOI: 10.1016/j.jtcvs.2013.09.007
  45. Valdis M, Chu MWA, Schlachta CM, Kiaii B. Validation of a novel virtual reality training curriculum for robotic cardiac surgery: A randomized trial. Innovations: Technology and Techniques in Cardiothoracic and Vascular Surgery. 2015;10:383-388. DOI: 10.1097/IMI.0000000000000222
  46. Watterson JD, Beiko DT, Kuan JK, Denstedt JD. Randomized prospective blinded study validating acquisition of ureteroscopy skills using computer based virtual reality endourological simulator. The Journal of Urology. 2002;168:1928-1932. DOI: 10.1097/01.ju.0000034357.84449.56
  47. Viswaroop SB, Gopalakrishnan G, Kandasami SV. Role of transurethral resection of the prostate simulators for training in transurethral surgery. Current Opinion in Urology. 2015;25:153-157. DOI: 10.1097/MOU.0000000000000141
  48. Khan R, Aydin A, Khan MS, Dasgupta P, Ahmed K. Simulation-based training for prostate surgery. BJU International. 2015;116:665-674. DOI: 10.1111/bju.12721
  49. Feifer A, Delisle J, Anidjar M. Hybrid augmented reality simulator: Preliminary construct validation of laparoscopic smoothness in a urology residency program. The Journal of Urology. 2008;180:1455-1459. DOI: 10.1016/j.juro.2008.06.042
  50. Chou DS, Abdelshehid C, Clayman RV, McDougall EM. Comparison of results of virtual-reality simulator and training model for basic ureteroscopy training. Journal of Endourology. 2006;20:266-271. DOI: 10.1089/end.2006.20.266
  51. Dickey RM, Srikishen N, Lipshultz LI, Spiess PE, Carrion RE, Hakky TS. Augmented reality assisted surgery: A urologic training tool. Asian Journal of Andrology. 2016;18:732-734. DOI: 10.4103/1008-682X.166436
  52. Pfandler M, Lazarovici M, Stefan P, Wucherer P, Weigl M. Virtual reality-based simulators for spine surgery: A systematic review. The Spine Journal. 2017;17:1352-1363. DOI: 10.1016/j.spinee.2017.05.016
  53. LeBel M-E, Haverstock J, Cristancho S, van Eimeren L, Buckingham G. Observational learning during simulation-based training in arthroscopy: Is it useful to novices? Journal of Surgical Education. DOI: 10.1016/j.jsurg.2017.06.005
  54. Martin KD, Akoh CC, Amendola A, Phisitkul P. Comparison of three virtual reality arthroscopic simulators as part of an orthopedic residency educational curriculum. The Iowa Orthopaedic Journal. 2016;36:20-25
  55. Bayona S, Fernández-Arroyo JM, Martín I, Bayona P. Assessment study of insight ARTHRO VR® arthroscopy virtual training simulator: Face, content, and construct validities. Journal of Robotic Surgery. 2008;2:151-158. DOI: 10.1007/s11701-008-0101-y
  56. Obdeijn MC, Bavinck N, Mathoulin C, van der Horst CMAM, Schijven MP, Tuijthof GJM. Education in wrist arthroscopy: Past, present and future. Knee Surgery, Sports Traumatology, Arthroscopy. 2015;23:1337-1345. DOI: 10.1007/s00167-013-2592-y
  57. Tsang JS, Naughton PA, Leong S, Hill ADK, Kelly CJ, Leahy AL. Virtual reality simulation in endovascular surgical training. Surgeon. 2008;6:214-220
  58. See KWM, Chui KH, Chan WH, Wong KC, Chan YC. Evidence for endovascular simulation training: A systematic review. European Journal of Vascular and Endovascular Surgery. 2016;51:441-451. DOI: 10.1016/j.ejvs.2015.10.011
  59. Savran MM, Sørensen SMD, Konge L, Tolsgaard MG, Bjerrum F. Training and assessment of hysteroscopic skills: A systematic review. Journal of Surgical Education. 2016;73:906-918. DOI: 10.1016/j.jsurg.2016.04.006
  60. Janse JA, Goedegebuure RSA, Veersema S, Broekmans FJM, Schreuder HWR. Hysteroscopic sterilization using a virtual reality simulator: Assessment of learning curve. Journal of Minimally Invasive Gynecology. 2013;20:775-782. DOI: 10.1016/j.jmig.2013.04.016
  61. Aggarwal R, Tully A, Grantcharov T, Larsen C, Miskry T, Farthing A, et al. Virtual reality simulation training can improve technical skills during laparoscopic salpingectomy for ectopic pregnancy. BJOG: An International Journal of Obstetrics & Gynaecology. 2006;113:1382-1387. DOI: 10.1111/j.1471-0528.2006.01148.x
  62. Javia L, Sardesai MG. Physical models and virtual reality simulators in otolaryngology. Otolaryngologic Clinics of North America. 2017;50:875-891. DOI: 10.1016/j.otc.2017.05.001
  63. Andersen SAW. Virtual reality simulation training of mastoidectomy – Studies on novice performance. Danish Medical Journal. 2016;63:B5277
  64. Arora A, Lau LYM, Awad Z, Darzi A, Singh A, Tolley N. Virtual reality simulation training in otolaryngology. International Journal of Surgery. 2014;12:87-94. DOI: 10.1016/j.ijsu.2013.11.007
  65. Burns JA, Adkins LK, Dailey S, Klein AM. Simulators for laryngeal and airway surgery. Otolaryngologic Clinics of North America. 2017;50:903-922. DOI: 10.1016/j.otc.2017.05.003
  66. Sikder S, Tuwairqi K, Al-Kahtani E, Myers WG, Banerjee P. Surgical simulators in cataract surgery training. The British Journal of Ophthalmology. 2014;98:154-158. DOI: 10.1136/bjophthalmol-2013-303700
  67. Thomsen ASS, Bach-Holm D, Kjærbo H, Højgaard-Olsen K, Subhi Y, Saleh GM, et al. Operating room performance improves after proficiency-based virtual reality cataract surgery training. Ophthalmology. 2017;124:524-531. DOI: 10.1016/j.ophtha.2016.11.015
  68. Gillan SN, Saleh GM. Ophthalmic surgical simulation: A new era. JAMA Ophthalmology. 2013;131:1623. DOI: 10.1001/jamaophthalmol.2013.1011
  69. Leitritz MA, Ziemssen F, Suesskind D, Partsch M, Voykov B, Bartz-Schmidt KU, et al. Critical evaluation of the usability of augmented reality ophthalmoscopy for the training of inexperienced examiners. Retina (Phila, PA). 2014;34:785-791. DOI: 10.1097/IAE.0b013e3182a2e75d
  70. Lopreiato JO, Sawyer T. Simulation-based medical education in pediatrics. Academic Pediatrics. 2015;15:134-142. DOI: 10.1016/j.acap.2014.10.010
  71. Reznek M, Harter P, Krummel T. Virtual reality and simulation: Training the future emergency physician. Academic Emergency Medicine: Official Journal of the Society for Academic Emergency Medicine. 2002;9:78-87
  72. Nilsson PM, Russell L, Ringsted C, Hertz P, Konge L. Simulation-based training in flexible fibreoptic intubation: A randomised study. European Journal of Anaesthesiology. 2015;32:609-614. DOI: 10.1097/EJA.0000000000000092
  73. Jensen JK, Dyre L, Jørgensen ME, Andreasen LA, Tolsgaard MG. Collecting validity evidence for simulation-based assessment of point-of-care ultrasound skills: Simulation-based ultrasound assessment. Journal of Ultrasound in Medicine. 2017;36:2475-2483. DOI: 10.1002/jum.14292
  74. Naur TMH, Nilsson PM, Pietersen PI, Clementsen PF, Konge L. Simulation-based training in flexible bronchoscopy and endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA): A systematic review. Respiration. 2017;93:355-362. DOI: 10.1159/000464331
  75. Jeon Y, Choi S, Kim H. Evaluation of a simplified augmented reality device for ultrasound-guided vascular access in a vascular phantom. Journal of Clinical Anesthesia. 2014;26:485-489. DOI: 10.1016/j.jclinane.2014.02.010
  76. Kapoor S. Haptics – Touchfeedback technology widening the horizon of medicine. Journal of Clinical and Diagnostic Research. 2014;8:294-299. DOI: 10.7860/JCDR/2014/7814.4191
