Open access peer-reviewed chapter

Improving Medical Simulation Using Virtual Reality Augmented by Haptic Proxy

Written By

Pierre Boulanger, Thea Wang and Mahdi Rahmani Hanzaki

Submitted: 15 August 2022 Reviewed: 27 September 2022 Published: 03 November 2022

DOI: 10.5772/intechopen.108330

From the Edited Volume

Modern Development and Challenges in Virtual Reality

Edited by Mamata Rath and Tushar Kanta Samal

Abstract

This chapter explores how the realism of haptic perception in virtual reality can be significantly enhanced with the help of the concept of haptic proxy. In haptic proxy, the position and orientation of physical objects are tracked in real-time and registered to their virtual counterparts. A compelling sense of tactile immersion can be achieved if the tracked objects have tactile properties similar to those of their virtual counterparts. A haptic proxy prototype was developed, and a pilot study was conducted to determine if the haptic proxy system is more credible than standard virtual reality. To test our prototype, we performed simple medical tasks such as moving a patient’s arm and aiming a syringe at specific locations. Our results suggest that simulation using a haptic proxy system is more believable and user-friendly and can be extended to developing new generations of open surgery simulators.

Keywords

  • proxy haptic
  • medical trainer
  • virtual reality
  • real-time tracking
  • usability study

1. Introduction

In the last few years, a new generation of low-cost virtual reality (VR) systems has emerged in which users can explore or interact with an artificial world in real time. Modern virtual reality systems can create multi-sensory outputs (vision, haptics, and sound), providing the illusion of real-world perception. Commercial VR systems use head-mounted displays (HMD) to monitor head movement and update the visual display inside the HMD in real time. By wearing an HMD, users can look and move around an environment and, in some cases, interact with objects with the help of two virtual reality wands (one for each hand). Developers can create these immersive worlds using game engines like Unity3D (https://unity.com/) and Unreal Engine (https://www.unrealengine.com/). Current VR technologies are used in many applications such as training, medicine, 3D cinema, and video games. This chapter will focus on using VR augmented by a haptic proxy to develop an open surgical training system.

1.1 VR for surgical training

Efficient and safe surgical training has always been essential to the surgeon’s qualification. To become a general surgeon in Canada, students must complete an undergraduate and graduate medical program, followed by a five-year residency program in general surgery. The practical component of the residency program is to observe, assist, and perform surgery in an operating room (OR) under the supervision of a fully trained surgeon. This process is slow and expensive, so developing virtual reality-based surgical simulators to better prepare first-time surgeons is paramount.

According to St-John et al. [1], from 2003 to 2018 the use of Minimally Invasive Surgeries (MIS) increased in many procedures, slowly replacing open surgeries; hence the proliferation of commercial VR simulators. In this approach, users grasp MIS surgical tools attached to robotic arms (typically two) that apply forces to the user’s hands depending on the simulated interaction with the virtual patient.

Although laparoscopic techniques are gaining popularity, most procedures still use open surgery techniques. Unfortunately, open surgical simulators are difficult to develop using current haptic technologies, as they do not provide sufficient perceptual feedback to be persuasive. This is troublesome since haptic feedback is one of the primary senses surgeons use to perform these tasks. Good haptic rendering for open surgery should help the user perceive the surface texture, elasticity, and temperature of objects through the mechanoreceptors in both hands. It should also help detect and manipulate objects with reduced or no visibility (e.g., in the dark).

In this chapter, we will explore how the concept of haptic proxy can be used to help develop a new generation of open surgical simulators.

1.2 Chapter organization

Section 2 will look at advanced medical simulation using VR and haptics. Section 3 describes the proposed enhanced VR system using a haptic proxy and the baseline VR system with no haptic feedback. Section 4 describes experimental protocols comparing standard VR with VR augmented by a haptic proxy for three simple medical procedures. A statistical analysis of each procedure is presented. We then conclude with Section 5, looking at the potential of the haptic proxy approach for creating a new generation of open surgical simulators.

2. Related work

Many researchers have attempted to develop new ways to make the virtual reality experience more realistic, particularly for surgical training. Most conventional VR systems concentrate on the visual and auditory senses (our dominant senses). Many commercial systems, such as the Oculus (https://www.oculus.com) and HTC Vive (https://www.vive.com), have been built around these two senses because it is relatively easy to create realistic experiences for them. Very few commercial systems use the sense of touch (haptic feedback) to complement vision and audition. Because the sense of touch is crucial to surgical training, three approaches can be found in the literature: simulators using haptic robots, haptic exoskeleton gloves, and haptic proxies.

2.1 Simulators using haptic robot

For the reasons discussed previously, research and development on surgical training systems using virtual reality simulators has focused primarily on laparoscopic surgeries. Laparoscopic surgery is well suited to the current capabilities of haptic interface technology because the tools can be easily attached to force feedback systems. Current systems are realistic enough to allow a direct transfer of competencies to the OR. Training in an immersive environment has been shown to shorten learning time and reduce adaptation time for military and ICU staff [2, 3].

2.1.1 Simulators using haptic robot and 2D display

Let us first review three of the most popular commercial virtual reality training systems for laparoscopic surgery: MIST-VR [4, 5], Haptica ProMIS [6], and LapSim [7].

MIST-VR: In the research by Seymour [5], the MIST-VR system included two laparoscopic tools mounted in a frame with six-degree-of-freedom force feedback, a foot pedal to activate simulated instruments, and an accurately scaled 2D display. The research showed that, for learning gall bladder dissection, the learning rate was 29% faster for students trained on MIST-VR than for students who received standard training procedures. In addition, students trained with MIST-VR made fewer errors, such as injuring the gallbladder or burning non-target tissues.

Haptica ProMIS: Van Sickle [6] conducted a study to evaluate training performance on a complex laparoscopic suturing task using the ProMIS augmented-reality simulator. The study carried out by his team aimed to determine whether the ProMIS simulator could distinguish the performance of experts from that of beginners. The system was composed of a torso-shaped mannequin containing a three-camera tracking system to identify the instruments inside the mannequin from different angles. The primary camera, placed at the mannequin’s pubic symphysis, generated the view on the computer screen. Experimental findings indicate that the ProMIS simulator possesses good psychometric properties: the experts did much better than the novices and performed more consistently on the ProMIS simulator.

LapSim: The LapSim system is similar to MIST-VR, focusing on laparoscopic suturing skills. With different modules, it simulates various tasks in the abdominal cavity [8]. The work by Duffy [7] was similar to Van Sickle’s research on the ProMIS simulator: the goal was to determine whether the LapSim system could distinguish a beginner from an expert. The LapSim system employed a simulated laparoscopic instrument operated by a five-degree-of-freedom force feedback system. Test tasks included laparoscopic camera navigation, tool navigation, coordination, gripping, lifting, cutting, staple application, and suturing. Compared with Van Sickle’s experimental system, this system provided a more vivid and realistic simulation despite the limited degrees of freedom of the haptic feedback. Duffy’s study showed that the LapSim system could distinguish between the laparoscopic competencies of experts and beginners. However, the capability to transfer competencies to the operating room has not yet been demonstrated.

2.1.2 Simulators using haptic robot and HMD

Combining traditional surgical simulators with an HMD makes it possible to create a highly immersive simulation environment. Huber [9] conducted the first clinical and technical feasibility study of this new configuration of laparoscopy in immersive VR. In the study, LapSim was used as the laparoscopic virtual reality simulator for “peg transfer,” “fine dissection,” and “cholecystectomy” tasks combining navigation maneuvers, fine preparation, and procedural aspects. Furthermore, the system used an OR model to situate the procedure in context.

More recently, Pulijala [10, 11] developed a system based on the Oculus Rift and Leap Motion devices (https://www.ultraleap.com/) for training in maxillofacial surgeries. The LeFort I osteotomy teaching environment was used to assess the impact of immersive virtual reality training on the confidence and knowledge of surgical residents. The system’s realistic OR environment gives the simulation a sense of presence [12] and allows trainees to interact three-dimensionally with virtual objects. Another recent work by Ganni [13] on minimally invasive surgery training using virtual reality also showed that the adaptation period was shortened by training in an immersive environment and that junior medical residents were better prepared to work in the OR.

2.2 Simulators using haptic exoskeleton gloves

A wearable haptic device can give users more precise feedback on the palms and fingers and can be used to develop open surgical simulators. In theory, haptic gloves should provide users with haptic experiences through kinesthetic and tactile feedback [14]. However, subtle tactile feedback remains challenging with current technology; no available system lets the user “feel” the subtle texture details on an object’s surface. A recent review of exoskeleton actuation technology can be found in Tiboni et al. [15].

Pacchierotti [16] conducted a survey of wearable haptic devices for the fingertip and hand. Their study provided a good definition of wearability in terms of form factor, weight, shape, area of interest, and ergonomics. The ability to feel softness and hardness is essential for open surgery simulations and can be simulated simply by providing cutaneous information [17]. Friction on the finger pads is complicated, exhibiting different properties when the pads are dry and when they are wet with sweat [18]. When holding a surgical tool during surgery, subtle pressure changes can be felt at the fingertips. A small pressure sensor installed on a haptic glove could detect those changes on the finger or hand and record the indentation by simultaneously sensing the lateral and normal forces.

Based on this specification, the glove by HaptX (https://haptx.com/) uses micro-fluidic skin to provide excellent contact haptics, a force feedback exoskeleton to control finger motion, and magnetic motion tracking to track finger orientations. The tactile actuators displace the skin by up to 2 mm to apply physical pressure that can replicate the sensation of real-world objects. Recently, Keef et al. [19] developed a haptic glove capable of producing sensations reminiscent of three types of near-surface properties: hardness, temperature, and roughness. Three haptic actuators are combined to accomplish this mixed mode of stimulation: vibrotactile motors, thermoelectric devices, and electro-tactile electrodes made from a stretchable conductive polymer synthesized in their laboratory. This polymer consists of a stretchable polyanion that serves as a polymerization scaffold. Participants trained to associate these sensations with roughness, hardness, and temperature had an overall accuracy of 98%, whereas untrained participants had an accuracy of 85%. Sensations can similarly be conveyed using a robotic hand equipped with pressure and temperature sensors.

2.3 Simulators using haptic proxy

Another way to provide haptic feedback is to use a haptic proxy. The work by Oskouie et al. [20] uses a haptic proxy for simple tasks like typing on a virtual keyboard, as illustrated in Figure 1. This study compared two different ways of typing on a virtual keyboard: one uses a VR controller to select letters on a keyboard, while the other has users touch a wall-mounted sponge that acts as a haptic proxy of the keyboard keys. Their work shows that users made fewer mistakes using the haptic proxy approach.

Figure 1.

An example of a prototype virtual keyboard system using a foam board as a haptic proxy [20].

The work of Simeone et al. [21] is another example of using a haptic proxy. In this research, they created two virtual worlds, a medieval scene and a spacecraft, and tried to match objects in the virtual world to objects in the physical world with some degree of accuracy. Their results showed that participants did not report a mismatch if the virtual object they interacted with was close to the physical object. For example, when participants mainly interacted with a handle on a box, they did not report a mismatch as long as the physical handle was close to the virtual one, even if the objects as a whole were not that similar.

Another paper, published by Henderson et al. [22], uses the concept of proxy haptics in an AR environment, exploiting objects already present in the environment to provide haptic feedback when interacting with the system. For instance, they used the collar of a coil connector to change the value of a virtual text box. Their results showed that users could carry out the experimental tasks more quickly than with the reference technique: virtual buttons and user interface elements projected onto an undifferentiated surface.

3. Comparing standard VR to VR augmented by haptic proxy

To test our assumption that VR augmented with a haptic proxy works better than standard VR, we developed two systems: one completely virtual, called Standard VR, and another using the concept of a haptic proxy, called Augmented VR with Haptic Proxy. To compare the two systems, we devised three simple experiments typical of tasks performed in the OR: (1) moving a patient’s hand to specific locations on the chest; (2) performing surgical tool manipulations with the right arm fixed; and (3) performing ambidextrous manipulations by grabbing the patient’s right arm and applying a syringe to specific locations. For each of these tasks, we tested the following hypotheses:

  1. H1: The time to execute each task is longer using the system with haptic proxy than with VR controllers because of physical constraints.

  2. H2: The haptic proxy system is more believable in terms of being closer to reality.

3.1 VR simulation system using haptic proxy

As illustrated in Figure 2, the inputs from user interaction with the HTC Vive beacons and wands are transferred to the simulation computer, where SteamVR (https://store.steampowered.com/steamvr) captures the data. The Unity3D rendering engine then uses the data to update the virtual patient’s pose and the syringe location.

Figure 2.

Proxy haptic system architecture using Unity3D.

In the proposed system shown in Figure 3, the proxy haptic props consist of an articulated rubber mannequin simulating a patient and an HTC Vive wand simulating a syringe. Both props needed to be tracked and geometrically synchronized with their virtual representations. Hand positions and finger orientations were measured using Noitom Hi5 gloves. In the following sections, we describe how the proxy objects are tracked and registered with their virtual counterparts.

Figure 3.

Proxy haptic system composed of a mannequin tracked using HTC Vive beacons and a syringe tracked by a wand.

3.1.1 Tracking and registering the proxy mannequin

The rubber mannequin representing the patient has the same weight as a middle-aged male. Although the mannequin’s articulations do not offer the same degrees of freedom as an actual human body, the approximation is accurate enough for our experimentation. We explored several approaches to registering the mannequin with its virtual counterpart. The first consisted of placing numerous retro-reflective targets on the mannequin, as illustrated in Figure 4. These targets were imaged using a 12-camera OptiTrack (https://optitrack.com/) system, from which a cloud of 3D coordinates was estimated using a proprietary photogrammetric algorithm. The resulting 3D point cloud was then registered with the avatar using a skeleton-matching algorithm developed by OptiTrack. A review of human joint estimation from 3D point clouds can be found in [23].

Figure 4.

Tracking mannequin motion using OptiTrack skeleton tracking system: (a) avatar representation, (b) mannequin with optical targets, (c) 12-cameras OptiTrack tracking configuration mounted on the ceiling, (d) user interacting with the mannequin.

One of the main drawbacks of this approach is its sensitivity to optical occlusions. As users interacted with the mannequin, their bodies occluded critical targets, generating faulty registrations. Because of occlusion, we concluded that this approach was impractical for our haptic proxy application, especially for open surgical procedures.

One could solve the occlusion problem by using a non-optical tracking system such as the Ascension 3D trackSTAR magnetic tracking system (https://tracklab.com.au/), which gives the 6D pose (position and orientation) of small magnetic sensors. Using this technology, each joint can be tracked in 6D and associated with the corresponding control points on the avatar. Our experiments showed that this solution worked well but was extremely costly and sensitive to the presence of metallic objects (such as surgical tools) in the environment.

Finally, to animate the virtual mannequin pose, a third low-cost solution was explored. It uses an inverse kinematic (IK) approach in which a limited number of HTC Vive beacons are paired with the avatar control points, from which the skeleton joint configuration can be calculated directly. Using IK, one needs to measure only a few 3D points located away from occluded areas to animate the arm joint configurations. For example, a single optical tracker on the mannequin’s right hand is sufficient to animate the right arm joints. We used the Unity3D RootMotion advanced character animation asset (http://root-motion.com/) to implement our approach. Since the VR model of the patient had numerous kinematic chains that needed to be animated by the skeleton using the IK engine, we had to find the values of the ten parameters used by the IK engine that would make the simulated motion fit the real motion of the mannequin well. These parameters are:

  1. Hand Goal Position Weight.

  2. Hand Goal Rotation Weight.

  3. Shoulder Rotation Weight.

  4. Shoulder Twist Weight.

  5. Shoulder Yaw Offset.

  6. Shoulder Pitch Offset.

  7. Arm Bending Goal Weight.

  8. Arm Swivel Offset.

  9. Arm Wrist to Palm Axis.

  10. Arm Length Stretching Value.

In addition to the hand tracker, three extra optical trackers were temporarily attached to the mannequin’s right shoulder, elbow, and wrist to perform a one-time calibration of the IK parameters. The calibration procedure consisted of determining the IK parameters that best match the four control points on the avatar model, using the hand tracker as the animating control point. To calibrate, we moved the mannequin’s arm to eight positions and paired each with the mannequin VR model. The IK parameters were then computed using gradient descent to minimize the differences between those two sets of points. The 10 IK parameters were randomly initialized; at each step, we calculated the gradient with respect to each parameter and moved the parameters in the opposite direction of the gradient, multiplied by a small gain. Using this approach, we could find a local minimum, as illustrated in Figure 5. In the end, we kept the IK parameters that gave the smallest overall positioning error for the eight calibration positions. Once calibrated, the hand tracker alone can animate the avatar’s right arm. A similar approach can be found in [24] for full-body tracking using HTC Vive beacons. A minimal numerical sketch of this calibration loop is shown below.
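
The following sketch illustrates how such a calibration can be implemented; it is an assumption rather than our exact code. The function fk is a hypothetical stand-in for the RootMotion IK solver: given the 10 IK parameters and one hand pose, it returns the predicted 3D positions of the four control points (shoulder, elbow, wrist, hand).

```python
import numpy as np

def calibration_error(params, hand_poses, measured_points, fk):
    # Sum of squared distances between predicted and measured control points
    # over the eight calibration arm positions.
    return sum(np.sum((fk(params, pose) - measured) ** 2)
               for pose, measured in zip(hand_poses, measured_points))

def calibrate_ik(hand_poses, measured_points, fk,
                 gain=1e-3, eps=1e-4, iterations=2000):
    params = np.random.uniform(-1.0, 1.0, size=10)   # random initialization
    for _ in range(iterations):
        grad = np.zeros(10)
        for i in range(10):                          # finite-difference gradient
            step = np.zeros(10)
            step[i] = eps
            grad[i] = (calibration_error(params + step, hand_poses, measured_points, fk)
                       - calibration_error(params - step, hand_poses, measured_points, fk)) / (2 * eps)
        params -= gain * grad                        # move against the gradient
    return params
```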

Figure 5.

IK parameter optimization: Variation of the positioning error between the virtual control points and those measured by the HTC Vive beacons vs. the number of iterations.

3.1.2 Hand tracking

For interaction with the hand, we used the Noitom Hi5 gloves to measure the orientation of the fingers and an HTC Vive beacon to measure the position and orientation of the wrist; Figure 3 shows the system configuration. We used the available SDKs for the HTC Vive and the Noitom Hi5 VR gloves to integrate our system with the virtual world. The HTC Vive SDK gave us the position and orientation of the trackers and controllers. The Noitom Hi5 SDK enabled us to render an avatar of the hand in Unity3D, complete with finger movements, and allowed the user to pick up items from the virtual world.

3.1.3 Tracking the syringe proxy

For the syringe proxy, we simply used an HTC Vive wand because the base of the wand has a haptic feel similar to a large syringe. The wand tracking information is registered to a syringe model in the VR world using the HTC Vive API.

3.1.4 Registration of static objects

We also had to register static proxy objects (the operating table and the mannequin body) with their VR equivalents. For the table, we attached two HTC Vive optical beacons at opposite corners and, using Unity3D tools, compared the distance between the measured markers with the corresponding distance in the virtual world. The distance between the sample markers in Unity3D was 174 cm, while the measured distance was 176 cm. Following this measurement, we re-scaled and re-centred the VR model to match its real-world counterpart, as sketched below. We also registered the location and rotation of the mannequin relative to the table by installing HTC Vive beacons on the mannequin’s head, chest, legs, and arms.
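
The following short sketch illustrates the scaling and re-centring step; it is not the actual Unity3D code, and the corner coordinates are made up for the example.

```python
import numpy as np

# Two table corners in the Unity3D model (174 cm apart) and as measured by
# the HTC Vive beacons (176 cm apart); coordinates are illustrative only.
virtual = np.array([[0.00, 0.0, 0.00], [1.74, 0.0, 0.00]])
measured = np.array([[0.01, 0.0, 0.02], [1.77, 0.0, 0.02]])

# Uniform scale factor from the ratio of the corner-to-corner distances.
scale = (np.linalg.norm(measured[1] - measured[0]) /
         np.linalg.norm(virtual[1] - virtual[0]))   # 1.76/1.74, about 1.0115

# Re-centre so the scaled model midpoint lands on the measured midpoint.
offset = measured.mean(axis=0) - scale * virtual.mean(axis=0)
register = lambda p: scale * p + offset   # maps any model point into room coordinates
```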

3.2 Standard virtual reality system

For the standard virtual reality system, we used the same environment as the haptic proxy system. The difference was that, since this system was completely virtual, users interacted with the environment using only the HTC Vive wands. Once users grabbed an object in the virtual world, its position was updated in Unity3D based on the position of the wand for as long as they held the grab button on the controller. For moving the patient’s arm, we created a virtual object in Unity3D and used it as the hand end-effector, like the one used in the haptic proxy system. The users were, in reality, moving the end-effector object, but what they saw was the patient’s arm following the wand’s movement (see Figure 6).

Figure 6.

Virtual system configuration: The user did not have any physical interactions with the mannequin in this scenario.

4. Experimental comparison between the two systems

To test our assumptions, each task described above was carried out using both the haptic proxy and the standard virtual reality systems. Since using the right or left hand could have affected the results, we performed each experiment with both hands. In the end, each task was done under four settings: p_right (haptic proxy with the right hand), p_left (haptic proxy with the left hand), c_right (controller with the right hand), and c_left (controller with the left hand). We randomized the order of these settings across participants to reduce learning effects, and each participant carried out the different parts of the experiments in their assigned order. Finally, we asked them to answer a questionnaire with general questions about the participant, such as age and hand dominance, and questions about their experience with the different tasks using the haptic proxy and the controllers.

Since this was a pilot test, we had only 12 participants (8 men and 4 women), all students in the Computing Science Department at the University of Alberta, aged between 23 and 33 (M = 26.6; SD = 2.6), and all with functional stereo vision. When asked about hand dominance, 11 indicated that they were right-dominant and one left-dominant. For the question “How often do you play video games?”, 2 indicated that they play often, 7 sometimes, and 3 never.

Given that we had 12 participants and each task had 10 trials, we had 120 timing measurements for each task in each setting. To find outliers, we used the interquartile method [25]: from the 120 data points, we computed the 25th and 75th percentiles and the distance between them (the interquartile range), then removed as outliers any points more than 1.5 times that distance below the 25th percentile or above the 75th percentile. We report the number of remaining data points (N) for each section after removing the outliers; since we started with 120 data points per task, the difference in these numbers indicates the number of outliers for that task. A minimal sketch of this filter is shown below.
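
The filter, applied to the 120 timing measurements of one task and setting, might look as follows (a sketch in Python, matching the SciPy-based analysis described next):

```python
import numpy as np

def remove_outliers(times):
    """Tukey's interquartile criterion: keep points within
    [Q1 - 1.5*IQR, Q3 + 1.5*IQR]."""
    times = np.asarray(times)
    q1, q3 = np.percentile(times, [25, 75])
    iqr = q3 - q1
    keep = (times >= q1 - 1.5 * iqr) & (times <= q3 + 1.5 * iqr)
    return times[keep]
```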

Ultimately, we analyzed the remaining data by fitting Normal and Gamma distributions with the best parameters using the Python Scipy library (https://www.scipy.org). We then performed a Kolmogorov–Smirnov (KS) test and chose the distribution with the best p-value as the distribution of our data. Finally, we ran an unpaired t-test to compare the usage time of the controller with that of the haptic proxy system. Whenever the best-fitting distribution was Gamma, we derived the mean and STD from the fitted distribution using the Python Scipy library and performed the t-test with those means and STDs. A sketch of this pipeline follows; the detailed results of each task are given in the following sections.
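
The chapter does not list the analysis code; the sketch below is our assumption of the pipeline using standard SciPy calls: fit both distributions by maximum likelihood, keep the one with the better KS p-value, and run the unpaired t-test from summary statistics when the Gamma fit is chosen.

```python
import numpy as np
from scipy import stats

def fit_best_distribution(times):
    """Fit Normal and Gamma distributions; return the one with the best KS p-value."""
    mu, sigma = stats.norm.fit(times)
    p_norm = stats.kstest(times, 'norm', args=(mu, sigma)).pvalue
    a, loc, scale = stats.gamma.fit(times)       # shape (alpha), location, scale (beta)
    p_gamma = stats.kstest(times, 'gamma', args=(a, loc, scale)).pvalue
    if p_norm >= p_gamma:
        return 'normal', p_norm, mu, sigma
    # Gamma mean/STD derived from the fit: mean = loc + alpha*beta, std = sqrt(alpha)*beta
    return 'gamma', p_gamma, loc + a * scale, np.sqrt(a) * scale

# Unpaired t-test from summary statistics; with Table 1's right-hand values it
# reproduces the reported result (t of about 2.99, p of about 0.003):
res = stats.ttest_ind_from_stats(1.3702, 0.4119, 119,   # p_right (Normal fit)
                                 1.2193, 0.3667, 120)   # c_right (Gamma-derived)
print(res.statistic, res.pvalue)
```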

4.1 Task 1: moving the Avatar’s hand to a specific location on the chest

In every trial, participants took their place at a starting position. After a signal from the experimenter, a timer started, a target appeared on the patient’s right thigh, and participants began to move toward the patient. Next, they grabbed the patient’s arm and moved it so that the right thumb touched the target. After that, the target disappeared, and we told them to return to the starting position for the next attempt. All times were measured in Unity3D in seconds. Figure 7 shows the physical and virtual environments for the VR controller and haptic proxy systems.

Figure 7.

Experiment 1 physical and virtual environments: In the upper left-hand corner, the participant performs the task using the haptic proxy. The upper right is what the participant sees while using the haptic proxy. In the bottom left corner, the participant completes the task with VR controllers. On the lower right, this is what the participant sees when using only the VR controllers.

4.1.1 Grabbing the arm

For experiment 1a, the time to grab the arm was measured. The best distribution found for c_left and p_right was Normal, with p-values of 0.1325 and 0.7789, respectively. For c_right and p_left, the best distribution was Gamma, with p-values of 0.7796 and 0.9779, respectively. The results are presented in Table 1.

Setting   Dist.    P (Normal)   Mean     STD      P (Gamma)   α          β        N
c_left    Normal   0.1325       1.2393   0.4080   0.1014      263.6384   0.0253   120
c_right   Gamma    0.5839       1.2193   0.3616   0.7796      13.0656    0.1015   120
p_left    Gamma    0.8945       1.4169   0.3651   0.9779      127.3759   0.0323   120
p_right   Normal   0.7789       1.3702   0.4119   0.3273      26.9014    0.0799   119

Table 1.

Experiment 1a: Time to grab the arm.

Since the p-value for c_left is below 0.5, we only compared the right-hand settings with an unpaired t-test. Since c_right does not follow a Normal distribution, we calculated its mean and STD from the fitted Gamma distribution using the Python Scipy library, obtaining 1.2193 and 0.3667, respectively. After running an unpaired t-test, our p-value was 0.0031 (α = 0.05, t = 2.9919, df = 237, standard error of difference = 0.050); therefore, based on our results, the haptic proxy system (right hand) is slower than the VR wand (right-hand controller).

4.1.2 Moving the arm

For experiment 1b, the time to move the arm was measured. The best distribution found for c_left, p_left, and p_right was Gamma, with p-values of 0.7704, 0.4425, and 0.9563, respectively. For c_right, the best distribution was Normal, with a p-value of 0.6550. The results are shown in Table 2. Since the p-value for p_left is below 0.5, we only compared the right-hand settings with an unpaired t-test. The resulting p-value was less than 0.0001 (α = 0.05, t = 9.9260, df = 214, standard error of difference = 0.060); therefore, based on our results, the haptic proxy system (right hand) is slower than the VR wand system (right-hand controller).

Setting   Dist.    P (Normal)   Mean     STD      P (Gamma)   α        β        N
c_left    Gamma    0.3615       0.8217   0.2630   0.7704      3.0295   0.1611   115
c_right   Normal   0.6550       0.8002   0.2148   0.5979      2.4679   0.1488   107
p_left    Gamma    0.1374       1.4245   0.6611   0.4425      1.8480   0.5086   113
p_right   Gamma    0.1853       1.4003   0.5709   0.9563      3.4553   0.3164   109

Table 2.

Experiment 1b: Time to move the arm.

4.1.3 Returning the arm

For experiment 1c, the time to return the arm was measured. The best distribution found for all settings was Gamma. The results are shown in Table 3. In this case, the best p-value for every setting is above 0.5, so we performed the t-test for both the left and right hands. For the left hand, the unpaired t-test gave a p-value of 0.0021 (α = 0.05, t = 3.1141, df = 231, standard error of difference = 0.028). Therefore, based on our results, in this part of the task, the haptic proxy system (left hand) is slower than the VR wand (left-hand controller). For the right-hand settings, the p-value was 0.1413 (α = 0.05, t = 1.4764, df = 210, standard error of difference = 0.026); therefore, our hypothesis is rejected in this scenario.

Setting   Dist.    P (Normal)   Mean     STD      P (Gamma)   α         β        N
c_left    Gamma    0.1377       0.6502   0.1968   0.8170      5.1704    0.0877   118
c_right   Gamma    0.0314       0.6638   0.2104   0.5585      4.5355    0.0992   112
p_left    Gamma    0.1733       0.7369   0.2176   0.7451      3.1650    0.1264   115
p_right   Gamma    0.4760       0.7026   0.1650   0.5187      14.6265   0.0432   100

Table 3.

Experiment 1c: Time to return the arm.

4.2 Task 2: Surgical tool manipulations with a fixed arm location

During each trial of this task, participants were asked to stand at a specific starting location. We then gave them a signal and started the timer and the experiment. After hearing the signal, they moved to the syringe on the table and picked it up. They were then asked to touch a target on the patient’s right arm with the tip of the syringe; the target position differed for each attempt. When the syringe tip came into contact with the target, the target vanished. Participants then returned the syringe to its original location and went back to the starting position to prepare for the next trial (Figure 8).

Figure 8.

Experiment physical and virtual environments: In the upper left-hand corner, the participant performs the task using the haptic proxy. The upper right is what the participant sees while using the haptic proxy. In the bottom left corner, the participant completes the task with VR controllers. On the lower right, this is what the participant sees when using only the VR controllers.

4.2.1 Grabbing the syringe

For experiment 2a, the time to grab the syringe was measured. The best distribution found for c_left was Normal, with a p-value of 0.4181. For c_right, p_left, and p_right, the best distribution was Gamma, with p-values of 0.8982, 0.6206, and 0.9595, respectively. The results are shown in Table 4. Since the p-value for c_left is below 0.5, we only compared the right-hand settings with an unpaired t-test. The resulting p-value was 0.0001 (α = 0.05, t = 3.9172, df = 237, standard error of difference = 0.050); therefore, the haptic proxy system (right hand) is slower than the right-hand controller for this part of the task.

Setting   Dist.    P (Normal)   Mean     STD      P (Gamma)   α           β        N
c_left    Normal   0.4181       1.2258   0.4001   0.3705      1871.9184   0.0093   119
c_right   Gamma    0.5145       1.2240   0.3465   0.8982      24.7096     0.0700   119
p_left    Gamma    0.0762       1.3523   0.4289   0.6206      8.0061      0.1537   120
p_right   Gamma    0.6240       1.4208   0.4237   0.9595      29.9926     0.0775   120

Table 4.

Experiment 2a: Time to grab the syringe.

4.2.2 Moving the syringe

For experiment 2b, the time to move the syringe was measured. The best distribution found for c_left was Normal, with a p-value of 0.8067. For c_right, p_left, and p_right, the best distribution was Gamma, with p-values of 0.9965, 0.9971, and 0.9114, respectively. The results are shown in Table 5.

Setting   Dist.    P (Normal)   Mean     STD      P (Gamma)   α        β        N
c_left    Normal   0.8067       0.9474   0.2802   0.4893      6.1178   0.1169   118
c_right   Gamma    0.4364       0.9610   0.2913   0.9965      7.2530   0.1090   118
p_left    Gamma    0.2494       1.0616   0.3391   0.9971      4.5444   0.1608   117
p_right   Gamma    0.0861       1.1505   0.3777   0.9114      2.0299   0.2851   115

Table 5.

Experiment 2b: Time to move the syringe.

For the left hand, the unpaired t-test gave a p-value of 0.0056 (α = 0.05, t = 2.7968, df = 233, standard error of difference = 0.041). Therefore, based on our results, in this part of the task, the haptic proxy system (left hand) is slower than the VR wand (left-hand controller). For the right-hand setting, the p-value was 0.0001 (α = 0.05, t = 4.0892, df = 231, standard error of difference = 0.046); therefore, for this part of the task, the haptic proxy system (right hand) is slower than the virtual system (right hand).

4.2.3 Returning the syringe

For experiment 2c, the time to return the syringe to its original position was measured. The best distribution found for all settings was Gamma. The results are shown in Table 6. In this case, the best p-values for c_right and p_right are lower than 0.5, so we could only perform the t-test for the left hand.

Setting   Dist.    P (Normal)   Mean     STD      P (Gamma)   α         β        N
c_left    Gamma    0.5736       0.7716   0.2658   0.8960      52.3348   0.0368   120
c_right   Gamma    0.0022       0.8241   0.2478   0.0632      5.3362    0.1042   112
p_left    Gamma    0.2952       0.8736   0.3285   0.7418      11.3005   0.0974   117
p_right   Gamma    0.0818       0.8123   0.2495   0.2683      18.1847   0.0575   106

Table 6.

Experiment 2c: Time to return the syringe.

For the left hand, the unpaired t-test gave a p-value of 0.0090 (α = 0.05, t = 2.6334, df = 235, standard error of difference = 0.039). Therefore, based on our results, in this part of the task, the system with proxy haptics (left hand) is slower than the VR wand (left-hand controller).

4.3 Task 3: Ambidextrous surgical tool manipulation

In each trial, participants stood at a particular starting position. Upon receiving a signal from the experimenter, a timer started, a target appeared on the patient’s right arm, and participants began to move toward the patient. They then grabbed the patient’s arm and raised it. Next, they were instructed to take the syringe off the table and touch the target on the patient’s right arm with its tip. In this instance, they were free to move or orient the arm to reach the target with the syringe faster.

Furthermore, some targets were on the other side of the arm, so participants had to rotate the arm to access them. Upon contact of the syringe tip with the target, the target disappeared; participants then placed the syringe back in its original position on the table and returned the patient’s arm to its original location. Finally, we asked them to return to the starting position for the next trial. Since both hands worked on this task, there was no distinction between right- and left-hand settings. In this scenario, the right-hand setting (e.g., p_right) means taking the arm with the right hand and the syringe with the left hand. Since the opposite assignment (taking the patient’s hand with the left hand and the syringe with the right hand) required participants to adopt a cross-handed position, which was difficult and not typically used in the real world, we performed this task with only one hand assignment. Figure 9 shows the physical and virtual environments for both the controller-only and haptic proxy settings.

Figure 9.

Experiment 3 physical and virtual environments: In the upper left-hand corner, the participant performs the task using the haptic proxy. The upper right is what the participant sees while using the haptic proxy. In the bottom left corner, the participant completes the task with VR controllers. On the lower right, this is what the participant sees when using only the VR controllers.

4.3.1 Grabbing the arm

For experiment 3a, the time to grab the arm was measured. The best distribution found for c_right was Normal, with a p-value of 0.4546. For p_right, the best distribution was Gamma, with a p-value of 0.5591. The results are shown in Table 7.

Setting   Dist.    P (Normal)   Mean     STD      P (Gamma)   α         β        N
c_right   Normal   0.4546       1.1989   0.4063   0.2478      4.1734    0.2104   118
p_right   Gamma    0.4801       1.3885   0.4122   0.5591      16.4605   0.1027   118

Table 7.

Experiment 3a: Time to grab the arm.

Since the p-value for c_right is less than 0.5, we cannot assign a distribution to it and therefore cannot perform the t-test in this case.

4.3.2 Grabbing the syringe

For experiment 3b, the time to grab the syringe was measured. The best distribution found for both settings was Gamma. The results are shown in Table 8.

Setting   Dist.    P (Normal)   Mean     STD      P (Gamma)   α        β        N
c_right   Gamma    0.3066       0.9155   0.3062   0.9636      6.4065   0.1225   115
p_right   Gamma    0.2467       0.9089   0.5400   0.5860      2.4630   0.3606   116

Table 8.

Experiment 3b: Time to grab the syringe.

For the right-hand setting, the unpaired t-test gave a p-value of 0.9129 (α = 0.05, t = 0.1095, df = 229, standard error of difference = 0.060). Therefore, we cannot conclude that the controller is faster than the proxy haptics in this scenario.

4.3.3 Moving the syringe

For experiment 3c, the time to move the syringe was measured. The best distribution found for both settings was Gamma. The results are shown in Table 9.

Setting   Dist.    P (Normal)   Mean     STD      P (Gamma)   α         β        N
c_right   Gamma    0.5682       0.8692   0.2202   0.8399      20.6269   0.0486   108
p_right   Gamma    0.0473       1.0468   0.3568   0.8466      3.4521    0.1960   112

Table 9.

Experiment 3c: Time to move the syringe.

For the right hand, the unpaired t-test gave a p-value of less than 0.0001 (α = 0.05, t = 4.3542, df = 218, standard error of difference = 0.041). Therefore, based on our results, in this part of the task, the haptic proxy system (right hand) is slower than the VR wand (right-hand controller).

4.3.4 Returning the syringe

For experiment 3d, the time to return the syringe was measured. The best distribution found for both settings was Gamma. The results are shown in Table 10. Since the p-value for c_right is less than 0.5, we cannot assign a distribution to it and therefore cannot perform the t-test in this case.

Setting   Dist.    P (Normal)   Mean     STD      P (Gamma)   α        β        N
c_right   Gamma    0.0120       0.7598   0.2598   0.3787      4.3089   0.1246   114
p_right   Gamma    0.0599       0.9282   0.3909   0.9879      1.9579   0.2920   117

Table 10.

Experiment 3d: Time to return the syringe.

4.3.5 Returning the arm

For experiment 3e, the time to return the arm was measured. The best distribution found for both settings was Normal. The results are shown in Table 11. Since the p-values for c_right and p_right are less than 0.5, we cannot assign a distribution to either setting and therefore cannot perform the t-test in this case.

Setting   Dist.    P (Normal)   Mean     STD      P (Gamma)   α         β        N
c_right   Gamma    0.2226       0.9712   0.3666   0.2031      29.2145   0.0682   116
p_right   Gamma    0.4484       0.9189   0.4692   0.2364      7.9018    0.1713   115

Table 11.

Experiment 3e: Time to return the arm.

4.4 Questionnaire

Our questionnaire consisted of seven sections. Section 1 contained general questions about the user’s physical characteristics, such as hand dominance and whether they wear glasses; the following six sections were about the system. With three tasks (discussed above) and two settings (controller or proxy haptics), there was one section for each combination. The questions in these six sections were essentially identical, with minor differences depending on the section. Each section consisted of three groups of questions: (1) a Perceived Competency Scale, asking how confident and comfortable users were working with the system; (2) a System Usability Scale, asking how usable the system was in the participant’s opinion; and (3) Believability, asking how close the system was to the real world.

Each group consisted of a few questions or statements about that theme. The users had to rate, on a scale from 1 to 7, how strongly they agreed or disagreed with each statement. Some statements were negative (for example, “I found the system very cumbersome”), while others were positive. In the end, we applied a function to these responses for each group to obtain a score between 1 and 100; a sketch of one plausible scoring convention is shown below. Details of the scores are presented in Table 12. However, because they differed in nature, we could not merge all the questions in the believability section. For example, one question asked how much participants felt they were facing a real problem, and another asked whether they noticed a difference between handling the right and left hands. We used “I felt like I am dealing with a real-world arm” as the question that directly asked about the believability of the system and mapped its responses to a number between 1 and 100 as the believability score. We will discuss the other questions later in the believability section. At the end of the questionnaire, participants were also asked for their views on the system.
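
The exact scoring function is not spelled out in the chapter; the sketch below shows one plausible convention, assumed here purely for illustration: reverse-code negatively worded statements, then map the 1-7 scale linearly onto a 0-100 range and average.

```python
import numpy as np

def group_score(responses, negative_mask):
    """responses: 1-7 Likert ratings for one question group;
    negative_mask: True for negatively worded statements."""
    r = np.asarray(responses, dtype=float)
    neg = np.asarray(negative_mask)
    r[neg] = 8 - r[neg]                              # reverse-code negative items
    return float(np.mean((r - 1.0) / 6.0 * 100.0))   # 1..7 mapped onto 0..100

# Example: three statements, the second negatively worded
print(group_score([6, 2, 7], [False, True, False]))  # -> 88.9
```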

Section                           System Competence     System Usability      Believability
                                  Mean      STD         Mean      STD         Mean      STD
Experiment 1 with controller      83.3333   14.7939     80.3819   12.0976     41.6667   27.9791
Experiment 1 with proxy haptics   75.9259   16.1247     80.5556   15.0581     76.3889   18.0604
Experiment 2 with controller      81.9444   13.0020     78.6458   16.1893     54.1667   31.0791
Experiment 2 with proxy haptics   86.5741   9.6103      86.8055   7.5552      76.3889   24.0562
Experiment 3 with controller      71.2963   12.7202     73.9583   13.6902     50.0000   32.5669
Experiment 3 with proxy haptics   84.2593   12.7202     81.2500   17.0644     76.3889   20.6685

Table 12.

Questionnaire scores. The responses are from the 12 participants.

We next performed an unpaired t-test with α = 0.075 for all the experiments to compare the controller-only system with the haptic proxy system. The results are shown in Table 13.

Experiment     P        t        df   SED
Experiment 1   0.0015   3.6119   22   9.613
Experiment 2   0.0629   1.9587   22   11.345
Experiment 3   0.0270   2.3700   22   11.135

Table 13.

Unpaired t-test results from the 12 participants. SED stands for standard error of differences.

The results show that our hypothesis H2 holds for all the experiments, meaning that participants viewed the haptic proxy system as more believable and closer to the real world than the pure VR version. Also, based on Figure 10, users appeared to feel less difference between handling the right and left hands when using the haptic proxy.

Figure 10.

Responses to the question “I feel no difference between the right- and left-hand manipulations”.

5. Conclusion

When analyzing the data, we found that in most cases, actions using the haptic proxy simulator are slower than with the VR-only control system. In fact, in 8 out of the 10 comparisons for which we could find a distribution, the haptic proxy was slower. This is expected since, by adding a proxy, movement is restricted by the laws of physics. With the standard VR system, on the other hand, users are free of motion restrictions and can move the patient’s hand or tools anywhere. We often observed users moving the syringe through the inside of the patient’s body to the target because it was a shorter path, which is impossible in the real world. One advantage of the haptic proxy is that collision detection is ensured by the laws of physics, as opposed to other techniques that require computationally expensive collision detection algorithms.

Based on the questionnaire, all tasks were judged more believable with the haptic proxy, and as a result, our H2 hypothesis is validated. Moreover, when participants were asked for their opinion, six of the nine who responded preferred the haptic proxy over the controller-only system, saying things like “the glove with haptic proxy feels more natural” and “with the glove with haptic proxy one feels the weight of things”. In addition, when asked how much they liked the HTC Vive controller as a proxy for the syringe, five of the nine who answered gave positive feedback. One participant said, “Great, did not even notice the first time”.

In our work, we have attempted to improve haptic perception in virtual reality for applications in open surgical training. We did so by using real-world objects as proxies for virtual objects: in our pilot experiment, a mannequin served as a proxy for a patient and an HTC Vive controller as a proxy for a syringe. The pilot project showed that virtual reality augmented by a haptic proxy can improve the sense of touch in VR-based simulations without requiring haptic robots or exoskeletons. Since the VR-based proxy haptic concept relies heavily on accurately registering virtual and real objects, occlusion problems were a limiting factor for extending this concept to open surgery simulations. The combination of IK and single-point tracking solved the problem for our simple simulation, but it cannot be generalized to the complex tasks performed during open surgeries, where occlusion will be a significant issue.

From the lessons learned, a new open appendectomy surgical training system is currently under development. In this training system, a combination of fused optical/magnetic trackers and digitizing gloves with pressure sensors will be used to track the surgeon’s body (head and hands) and the 6D pose (position and orientation) of the proxy props. The combination of optical and magnetic tracking will allow us to address the occlusion problem and track the small tools used during the intervention. Figure 11 shows the proposed block diagram of the new appendectomy simulator. Much work remains to be done to develop a fully functioning open surgery simulator. Still, these pilot studies suggest that the haptic proxy concept can help achieve this goal.

Figure 11.

System diagram of the new appendectomy training system.

References

  1. St-John A, Caturegli I, Kubicki N, Kavic S. The rise of minimally invasive surgery: 16 year analysis of the progressive replacement of open surgery with laparoscopy. Society of Laparoscopic and Robotic Surgeons. 2020;24(4):1-5
  2. Bhagat KK, Liou WK, Chang CY. A cost-effective interactive 3D virtual reality system applied to military live firing training. Virtual Reality. 2016;20(2):127-140
  3. Hamilton MF. Virtual reality: Can it improve the PICU experience? Pediatric Critical Care Medicine. 2019;20(6):587-588
  4. Ström P, Kjellin A, Hedman L, Johnson E, Wredmark T, Felländer-Tsai L. Validation and learning in the Procedicus KSA virtual reality surgical simulator. Surgical Endoscopy and Other Interventional Techniques. 2003;17(2):227-231
  5. Seymour NE, Gallagher AG, Roman SA, Obrien MK, Bansal VK, Andersen DK, et al. Virtual reality training improves operating room performance: Results of a randomized, double-blinded study. Annals of Surgery. 2002;236(4):458
  6. Van Sickle K, McClusky D III, Gallagher A, Smith C. Construct validation of the ProMIS simulator using a novel laparoscopic suturing task. Surgical Endoscopy and Other Interventional Techniques. 2005;19(9):1227-1231
  7. Duffy A, Hogle N, McCarthy H, Lew J, Egan A, Christos P, et al. Construct validity for the LAPSIM laparoscopic surgical simulator. Surgical Endoscopy and Other Interventional Techniques. 2005;19(3):401-405
  8. Roberts KE, Bell RL, Duffy AJ. Evolution of surgical skills training. World Journal of Gastroenterology: WJG. 2006;12(20):3219
  9. Huber T, Paschold M, Hansen C, Wunderling T, Lang H, Kneist W. New dimensions in surgical training: Immersive virtual reality laparoscopic simulation exhilarates surgical staff. Surgical Endoscopy. 2017;31(11):4472-4477
  10. Pulijala Y, Ma M, Ayoub A. VR surgery: Interactive virtual reality application for training oral and maxillofacial surgeons using Oculus Rift and Leap Motion. In: Serious Games and Edutainment Applications. Cham: Springer; 2017. pp. 187-202
  11. Pulijala Y, Ma M, Pears M, Peebles D, Ayoub A. Effectiveness of immersive virtual reality in surgical training—A randomized control trial. Journal of Oral and Maxillofacial Surgery. 2018;76(5):1065-1072
  12. Lombard M, Ditton T. At the heart of it all: The concept of presence. Journal of Computer-Mediated Communication. 1997;3(2):JCMC321
  13. Ganni S, Li M, Botden SM, Nayak SR, Ganni BR, Rutkowski AF, et al. Virtual operating room simulation setup (VORSS) for procedural training in minimally invasive surgery - a pilot study. Indian Journal of Surgery. 2020;2020:1-7
  14. Perret J, Vander PE. Touching virtual reality: A review of haptic gloves. In: ACTUATOR 2018: 16th International Conference on New Actuators. Bremen, Germany: VDE; 2018. pp. 1-5
  15. Tiboni M, Borboni A, Verite F, Bregoli C, Amici C. Sensors and actuation technologies in exoskeletons: A review. Sensors. 2022;22(3):1-61
  16. Pacchierotti C, Sinclair S, Solazzi M, Frisoli A, Hayward V, Prattichizzo D. Wearable haptic systems for the fingertip and the hand: Taxonomy, review, and perspectives. IEEE Transactions on Haptics. 2017;10(4):580-600
  17. Srinivasan MA, LaMotte RH. Tactual discrimination of softness. Journal of Neurophysiology. 1995;73(1):88-101
  18. Adams MJ, Johnson SA, Lefèvre P, Lévesque V, Hayward V, André T, et al. Finger pad friction and its role in grip and touch. Journal of The Royal Society Interface. 2013;10(80):20120467
  19. Keef CV, Kayser LV, Tronboll S, Carpenter CW, Root NB, Finn N III, et al. Virtual texture generated using elastomeric conductive block copolymer in a wireless multimodal haptic glove. Advanced Intelligent Systems. 2020;2(4):1-8
  20. Oskouie MA, Boulanger P. Using proxy haptic for a pointing task in the virtual world: A usability study. In: International Conference on Augmented Reality, Virtual Reality and Computer Graphics. Santa Maria al Bagno, Italy: Springer; 2019. pp. 292-299
  21. Simeone AL, Velloso E, Gellersen H. Substitutional reality: Using the physical environment to design virtual reality experiences. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. Seoul, Republic of Korea; 2015. pp. 3307-3316
  22. Henderson SJ, Feiner S. Opportunistic controls: Leveraging natural affordances as tangible user interfaces for augmented reality. In: Proceedings of the 2008 ACM Symposium on Virtual Reality Software and Technology. Bordeaux, France; 2008. pp. 211-218
  23. Xu T, An D, Jia Y, Yue Y. A review: Point cloud-based 3D human joints estimation. Sensors. 2021;21(5):1-30
  24. Caserman P, Garcia-Agundez A, Konrad R, Steinmetz R. Real-time body tracking in virtual reality using a Vive tracker. Virtual Reality. 2019;23(2):155-168
  25. Tukey JW. Exploratory Data Analysis. Reading, MA: Addison Wesley Publishing; 1977
