Open access peer-reviewed chapter

Augmented Reality in Kidney Cancer

Written By

Keshav Shree Mudgal and Neelanjan Das

Submitted: 29 June 2018 Reviewed: 05 October 2018 Published: 31 December 2018

DOI: 10.5772/intechopen.81890

From the Edited Volume

Evolving Trends in Kidney Cancer

Edited by Sashi S. Kommu and Inderbir S. Gill


Abstract

Augmented reality (AR) is the concept of a digitally created perception that enhances components of the real world to allow better engagement with it. Within healthcare, there has been a recent expansion of AR solutions, especially in the field of surgery. Traditional renal cancer surgery has largely been replaced by minimally invasive laparoscopic (or robotic) partial nephrectomies. This has meant the loss of certain intra-operative experiences such as haptic feedback, and AR can compensate for this loss with enhanced visual and patient-specific feedback. The kidney is a dynamic organ, and current AR development has revolved around specific surgical stages such as safe arterial clamping and perfecting tumour margins. This chapter discusses the current state of AR technology in these areas with key attention to the aspects of image registration, organ tracking, tissue deformation and live imaging. The chapter then discusses limitations of AR, such as inattentional blindness and depth perception, and provides potential future ideas and solutions. These include inventions such as AR headsets and 3D-printed renal models (with the possibility of remote surgical intervention). AR promises a very positive future for truly minimally invasive renal surgery. However, current AR needs validation, cost evaluation and thorough planning before it can be safely integrated into everyday surgical practice.

Keywords

  • augmented reality
  • AR
  • nephrectomies
  • partial nephrectomies
  • image registration
  • surface registration
  • organ tracking
  • tissue deformation
  • renal artery clamping
  • safe selective arterial clamping
  • precise tumour margin
  • live imaging
  • virtual reality
  • AR headset
  • 3D printing

1. What is augmented reality?

The term “augmented reality” was coined by the Boeing researcher Thomas Caudell in 1990 to describe a projection of digital graphics onto a physical working space for use by aircraft engineers [1]. Augmented reality (AR) has come a long way since; however, the fundamental idea remains the same: working in a real-world environment where the components of the environment are enhanced by a digitally created perception. This perception can include multiple sensory inputs, including visual, auditory, haptic, somatosensory and olfactory senses.

In healthcare, AR progression has involved a wide range of medical areas—from aiding clinic appointments through easy access to electronic health records and patient times, to wearable glasses that help teach life skills to children on the autism spectrum [2]. However, it seems that the biggest expansion of AR is seen in enhancing surgical procedures. Project DR is one such development—internal organs, as 3D reconstructions of the patient’s anatomy, are projected onto the patient’s skin, allowing a constant view of the person’s anatomy that moves with the patient in real time. This is achieved by the amalgamation of CT/MRI imaging, motion-sensing infra-red sensors and projectors all working as one unit [3]. Another example is the use of Microsoft’s HoloLens glasses for trauma and plastic surgeries (Imperial College London and St Mary’s Hospital). The “hologram” (made from pre-op imaging), viewed through the lens of the glasses, projects onto the patient’s skin and allows a “mixed reality” which lets the surgeon track the pathways of the various blood vessels and bones to be operated upon [4]. This technology promises to let surgeons carefully plan and execute breast reconstruction surgeries in the future. Google Glass is another extensively used example of AR. The ‘glasses’ allow an augmented field of view, and surgeons have used them for purposes ranging from navigation tools that display ultrasound imaging to remote videoconferencing for intraoperative communication [5].

2. Augmented reality vs. virtual reality?

Whilst augmented reality is technology that overlays digital content onto the reality that already exists around us, virtual reality (VR) is a complete replacement of the real world with a simulated one. This means that AR allows us to interact and work with the real world whilst getting an enhanced input from an informed digital world.

VR has been shown to be a great teaching tool. Moglia et al. [6] found that subjects trained on virtual simulators performed better than the control group (using conventional methods). An example is the UroTrainer, a validated simulator for teaching transurethral resection of bladder tumours [7]. VR has shown promise not only in surgery but also in other areas, such as the simulation of shock trauma centres (where surgeons can be trained in high-pressure environments) [8] and the Virtual Environment for Radiotherapy Training (VERT)—a system built to reduce anxiety in breast cancer patients [9].

3. VR in kidney cancer

Within renal cancer, a VR system has been developed by Rai et al. that enhances the novice’s ability to localise renal tumour margins [10]. Specific to nephrectomies, Makiyama et al. [11] have developed a VR “rehearsal” simulator for surgeons that plans for anatomical abnormalities and incorporates haptic feedback for pre-operative training. Ueno et al. have developed VR addressing another aspect of nephrectomies—reducing postoperative urine leakage by predicting open urinary tracts on preoperative 3D CT reconstructions [12]. Whereas VR might be a great teaching tool for simulated nephrectomies, AR is the platform for the practice of medicine and surgery on real people.

4. Augmented reality in renal cancer

In renal cancer surgery, open nephrectomies have, for the most part, been replaced by minimally invasive laparoscopic surgeries. This has led to many positive outcomes, including decreased intra-operative blood loss and shorter hospital stays [13]. Furthermore, partial nephrectomies have shown an overall improved survival over radical nephrectomies [14], and this has been made possible by crucial developments in laparoscopic and robotic-assisted surgery [15]. However, there have also been some drawbacks. One of these is the loss of haptic feedback that would usually allow the surgeon to manoeuvre intra-operatively and make instinctive decisions. AR can compensate for the loss of this feedback by enhancing another sense: vision.

AR systems exist that allow surgeons to see detailed anatomical structures on the surface of the organ by projecting pre-operative CT/MRI images onto a live laparoscopic video. This allows a view of the patient’s unique renal anatomy, its neighbouring structures and its relation to the rest of the intra-abdominal anatomy [16]. Having this added information can aid the surgeon in planning and executing an accurate and precise partial or total nephrectomy. Exact areas to be incised can be planned, and damage to nearby delicate structures such as the renal vasculature and ureters can be reduced. AR can also reduce excision margins—sparing as many well-functioning nephrons as possible and reducing the risk and progression of chronic renal insufficiency [17] (Figure 1).

Figure 1.

An illustration showing the basic components involved in AR. “The basic method is to superimpose a computer-generated image on real-world imagery captured by a camera and display the combination of these on a computer, tablet PC or a video projector. The main advantage of AR is that the surgeon is not forced to look away from the surgical site as opposed to common visualisation techniques.” Adapted from: ‘Recent Development of Augmented Reality in Surgery: A Review’ [20].

For an AR system to be ideal, the full length of a surgical procedure needs to be “augmented.” This requires three essential features (as outlined in a recent review by Hughes-Hallett et al. [18]): image registration, organ tracking and adaptation to intra-operative tissue deformation. The following sections describe, in detail, these aspects of AR specific to nephrectomies.

5. Image registration

This is the process whereby medical imaging is aligned with the patient’s anatomy to form a visually projected overlay. This can be done through various methods, but the fundamentals rely upon multiple data points being processed to align the medical imaging with the best corresponding surface landmarks on the organ or the patient’s anatomy.
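
At its core, this is a point-correspondence problem: given matching landmark positions in the pre-operative imaging and on the patient, find the transform that best maps one set onto the other. The sketch below is a minimal, hypothetical Python illustration of rigid point-based alignment (a Kabsch-style least-squares fit) and of computing a target registration error; it is not any commercial system’s implementation, and real pipelines add outlier rejection, surface matching and deformation handling.

```python
# Minimal sketch of rigid point-based registration (Kabsch least-squares fit).
# Illustrative only: real systems add outlier rejection and deformation models.
import numpy as np

def rigid_registration(image_pts: np.ndarray, organ_pts: np.ndarray):
    """Least-squares rotation R and translation t mapping image_pts onto organ_pts.

    Both arrays are N x 3, with row i of each referring to the same landmark.
    """
    ci, co = image_pts.mean(axis=0), organ_pts.mean(axis=0)   # centroids
    H = (image_pts - ci).T @ (organ_pts - co)                 # 3 x 3 covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))                    # avoid reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = co - R @ ci
    return R, t

def target_registration_error(R, t, image_pts, organ_pts) -> float:
    """Mean distance between mapped image landmarks and their organ landmarks."""
    mapped = image_pts @ R.T + t
    return float(np.linalg.norm(mapped - organ_pts, axis=1).mean())
```

The same fit, run repeatedly inside an iterative closest-point loop, underlies many of the surface-based methods described below.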

Image registration can be used in the pre-operative planning stage and in the intra-operative stage. The planning stage involves using the combined imaging overlay (of CT scans and MRIs) on the kidney to identify key structures of importance—the hilar vasculature, the spatial attributes of the kidney and their relationship with the renal collecting system. This helps build a roadmap of what the surgery will involve, and although it does not require accuracy to the millimetre (as the intra-operative phase does), it allows a good estimation of the planned steps in surgery.

The intra-operative stage requires higher precision image registration and more importantly, dynamic correspondence with the moving organ. This is to allow tumour resection margins that can be accurate to the millimetre.

Hughes-Hallett et al. [18] classify image registration used by AR developers into three main subtypes: manual registration, surface-based registration and 3D registration (Figure 2).

Figure 2.

Different methods of image registration [18].

Manual registration is the simplest method, where the surgeon uses their anatomical knowledge to align a projected image onto the organ. Examples of this method have involved fusing 3D reconstructions with the live operative view, projecting intra-abdominal anatomy onto the skin and colour-coded projection to highlight “safe zones” of resection margins. These have offered quite a hands-on approach with relatively little planning compared to other registration methods—allowing a low barrier to entry, good anatomical orientation and relatively good awareness of disease-free parenchyma. However, manual registration does not allow accurate estimates of tumour margins and is mostly limited to the planning phase.

Surface-based registration involves using a tracked instrument to build a topographical map of the internal organ. This has been most extensively used in robotic partial nephrectomy—an example being the da Vinci robot. The laparoscopic instrument tip of the robot can touch a point on the live kidney, and this information, along with the joint-position sensing of the robotic arms, can be used to calculate a position in space and so build up the surface anatomy of the kidney. The surface anatomy can then be correlated with the 3D reconstruction imaging of the patient and projected onto the surgeon’s operating view.
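
In outline, each surface “touch” simply converts the robot’s reported base-to-tip transform into a 3D point, and the touches accumulate into a cloud for registration. The helper names below are illustrative assumptions, not the da Vinci API.

```python
# Hedged sketch: accumulating a surface point cloud from instrument touches.
# tip_position()/collect_surface_points() are illustrative, not a vendor API.
import numpy as np

def tip_position(base_to_tip: np.ndarray) -> np.ndarray:
    """Extract the instrument tip's 3D position from a 4 x 4 homogeneous pose."""
    return base_to_tip[:3, 3]

def collect_surface_points(touch_poses) -> np.ndarray:
    """Stack the tip positions recorded at each surface touch into an N x 3 cloud."""
    return np.vstack([tip_position(T) for T in touch_poses])
```

The resulting cloud can then be aligned to the pre-operative 3D model, for instance with an iterative closest-point loop built on a rigid fit like the one sketched in Section 5.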

This method is a form of internal tracking, relying on a good tracking instrument and accurate computer algorithms to provide the correct position of the organ being operated on. Errors in the estimated spatial position of the instrument tip can create unsafe image registration, with margins too inaccurate for partial nephrectomy.

This registration method is more accurate than manual registration, as it provides automation and reduces the surgical workload. However, there are still areas for improvement, such as better tracking methods. External tracking, where the organ is tracked from the outside, is another approach. Examples include optical, magnetic and laser tracking, but there are inherent drawbacks, such as optical tracking requiring a direct line of sight and magnetic tracking requiring physical placement of a tracker in the organ.

A recent advancement in surface-based registration was developed by Edgcumbe et al., where a miniature laser projector called the ‘Pico Lantern’ can be dropped into the patient’s abdomen. It can then be picked up by laparoscopic instruments to perform surface reconstruction of the abdominal organ, onto which it can then project the pre-operative image. The projector is visually tracked in the laparoscopic field of view and has been tested on porcine kidneys and in detecting pulsatile motion in carotid arteries [19].

3D-CT stereoscopic image registration uses two cameras that focus on the same object in space; combining the perspectives from the two separate viewpoints provides a spatial reference for the object. This system has been used for kidneys by isolating a point on the surface of the organ, which can then be aligned to the corresponding point on a 3D reconstruction image. Using this as the centre of rotation, the surgeon can manipulate the operative view for a better understanding of the field of operation. This system has shown an average target registration error of less than 1 mm and a mean registration duration of 48.1 s, i.e. more accurate resection margins, faster registration and reduced theatre time.
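
The geometric core of stereoscopic registration is triangulation: a surface point seen in both calibrated views fixes a 3D position. Below is a minimal linear (direct linear transformation) sketch, assuming known camera projection matrices; production systems refine this with non-linear optimisation.

```python
# Minimal direct-linear-transformation (DLT) triangulation sketch.
# P1, P2 are the 3 x 4 projection matrices of the two calibrated cameras.
import numpy as np

def triangulate(P1: np.ndarray, P2: np.ndarray, x1, x2) -> np.ndarray:
    """Recover the 3D point seen at pixel x1=(u1, v1) in view 1 and x2 in view 2."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)          # least-squares null vector of A
    X = Vt[-1]
    return X[:3] / X[3]                  # de-homogenise to (x, y, z)
```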

There is, however, a lack of organ tracking in this system, meaning that each movement of the camera needs a new image registration. Furthermore, stereoscopic cameras are usually very expensive and require large computer systems to handle them, which is another limitation to the adoption of this method.

6. Organ tracking

An ideal AR system would allow the organ to be tracked in real time as it is affected by respiration, tissue deformation and other complications like bleeding. The renal system is particularly vulnerable to these dynamic changes compared to other organ systems, like the brain or bones, where rigid image registration systems are the norm and do not require as much tracking. The registered image projected into the operative view should be locked onto the organ and dynamically move with it and with the laparoscopic camera.

There are many types of tracking studied in AR surgery. These have mainly included optical tracking, where optical markers on the organ allow the position of the laparoscopic camera to be measured relative to the organ and the overlay to move with the organ [20]. Infra-red tracking is another method, which involves the use of infrared-emitting diode markers. The main issues with optical tracking are instruments obscuring the direct line of sight (required for tracking) and limited depth perception. Infra-red tracking has the issue of selecting the correct anatomical landmarks as markers—mismatches can occur due to deformation, compression and intraoperative haemorrhage. Many studies have failed to achieve accurate registration of dynamic intra-abdominal organs with infra-red tracking [21].
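
The per-frame update behind marker-based tracking can be sketched as follows: the tracked marker positions are re-fitted to their known positions on the pre-operative model each frame, and the resulting pose keeps the overlay locked to the moving kidney. This is illustrative only, and it reuses the rigid_registration() helper from the earlier registration sketch.

```python
# Hedged sketch of per-frame marker-based organ tracking; illustrative only.
# Reuses rigid_registration() from the image-registration sketch in Section 5.
import numpy as np

def update_overlay_pose(model_markers: np.ndarray,
                        camera_markers: np.ndarray) -> np.ndarray:
    """4 x 4 pose of the organ model in the camera frame for the current frame.

    model_markers  : N x 3 marker positions in the pre-operative model frame.
    camera_markers : N x 3 tracked positions of the same markers this frame.
    """
    R, t = rigid_registration(model_markers, camera_markers)
    pose = np.eye(4)
    pose[:3, :3] = R
    pose[:3, 3] = t
    return pose  # applied to the registered image before each render
```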

Electromagnetic tracking is another way of doing this—it has been explored using wireless trackers in an ex-vivo bovine partial nephrectomy model. This involved surrounding the tumour within the kidney with magnetic transponders, which relayed their positions back to the surgeon, and in conjunction with optical camera tracking, a partial nephrectomy was performed [22]. There are limitations to this tracking method, as magnetic fields can suffer interference from laparoscopic instruments and operating tables. The method also requires placement of the magnetic transponders into the target organ, and it is currently hard to achieve an alignment error of less than 5 mm—a tumour margin error too great for accurate partial nephrectomies [23].

An alternative has been explored by Yip et al., where 3D stereoscopic image registration has been combined with tracking algorithms—producing errors of only 1.3–3.3 mm [24]. More recently, Edgcumbe et al. have developed a tracking device called the Dynamic Augmented Reality Tracker (DART). This is a 3D-printed stainless-steel tracker that can be anchored to a fixed position on the kidney relative to the tumour. This, with the help of an ultrasound transducer, can then be used to track the location of surgical instruments relative to the tumour in real time. The system has been named ARUNS (Augmented Reality Ultrasound Navigation System) and was used in the robotic-assisted excision of a phantom kidney tumour [25].
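
A DART/ARUNS-style system ultimately reduces to composing rigid transforms: once the anchored tracker and the tumour are localised, any tracked instrument can be expressed in the tumour’s frame. The sketch below shows this chain under assumed, illustrative matrix names; it is not the published system’s code.

```python
# Simplified transform chain of a DART/ARUNS-style setup; names are assumptions.
import numpy as np

def instrument_in_tumour_frame(cam_T_tracker: np.ndarray,
                               tracker_T_tumour: np.ndarray,
                               cam_T_tool: np.ndarray) -> np.ndarray:
    """Pose of the surgical tool expressed in the tumour's own frame.

    cam_T_tracker    : 4 x 4 pose of the anchored tracker in the camera frame.
    tracker_T_tumour : 4 x 4 tumour pose relative to the tracker, from the
                       intra-operative ultrasound sweep.
    cam_T_tool       : 4 x 4 tracked pose of the instrument in the camera frame.
    """
    tumour_T_cam = np.linalg.inv(cam_T_tracker @ tracker_T_tumour)
    return tumour_T_cam @ cam_T_tool

def tip_to_tumour_distance(tumour_T_tool: np.ndarray) -> float:
    """Euclidean distance from the tool tip to the tumour centre."""
    return float(np.linalg.norm(tumour_T_tool[:3, 3]))
```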

Vávra et al., in their recent review, comment that it may be possible to track organ movement without physical markers in the future. Some of the methods explored include algorithms to predict real-time movement of organs, physics-based deformation models, natural points of reference as tracking points and the use of red-green-blue cameras to perform image registration without markers. They also comment that whilst average marker-based registration takes 8 min, a recent marker-less system took only about 5 min [20].

7. Tissue deformation

The kidney is a dynamic organ, and renal surgery involves deformation of the anatomy. Every step in the operation changes the initially projected image registration [26], and an ideal AR system needs real-time feedback to trace this. An answer to this would be the development of computer algorithms to predict changes in anatomy at crucial steps such as clamping of the renal arteries and surgical dissection at tumour margins. Algorithms can also be developed to predict the effect of ongoing influences on the organ position, such as respiratory patterns and peritoneal insufflation.

There have been some tissue deformation models considering renal clamping, incision and external pressure loads on the kidney, such as intra-operative insufflation. Some of these have shown an improvement of 29% in the registration error when compared to a non-deformation model. However, in these models the kidney is assumed to have linear elastic behaviour, and the models have been based on ex-vivo kidneys. Another method has been developed taking diaphragm motion and its influence on kidney motion into consideration—this used preoperative CT scans during inspiration and expiration, and errors of less than 2 mm have been shown in predicting kidney positions [27].
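
As a toy illustration of the respiration idea, the sketch below assumes kidney positions segmented from the inspiration and expiration CT phases and a respiratory phase signal; real respiratory motion is non-linear and hysteretic, so this is only the simplest possible version.

```python
# Toy respiration model: linear interpolation between the kidney positions
# segmented from expiration and inspiration CT. Real motion is non-linear
# and hysteretic; this is only the simplest possible version of the idea.
import numpy as np

def predicted_kidney_position(pos_expiration, pos_inspiration, phase: float):
    """Expected kidney position for a respiratory phase in [0, 1].

    phase = 0 corresponds to full expiration, 1 to full inspiration,
    e.g. estimated from a ventilator or abdominal-motion signal.
    """
    phase = float(np.clip(phase, 0.0, 1.0))
    return ((1.0 - phase) * np.asarray(pos_expiration, dtype=float)
            + phase * np.asarray(pos_inspiration, dtype=float))
```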

Although there is a scarcity of studies in this area, one study showed that mathematical models were able to predict up to 52% of the operative deformation in porcine kidneys when compared to pre- and post-op CT imaging [28].

Baumhauer et al. [29] have proposed a system to address tissue deformation by the insertion of custom-designed navigation aids into the kidney and the use of “inside-out tracking.” Here, the CT scan can provide real-time spatial awareness by identifying the navigation aids and projecting imaging onto the laparoscopic view. Tested in a virtual environment, this system showed a visualisation error of 1.36 mm (adequately accurate). The system was replicated in the clinical setting by Teber et al. [30], where 10 patients had retroperitoneal laparoscopic partial nephrectomies. The results showed zero cases with a positive surgical margin, a zero complication rate and zero conversions to open surgery. This system does, however, require placement of aids (1.5 cm long needles) into the kidney and is dependent on at least 4 aids being present. This brings risks of damage to healthy parenchyma and of aids being lost intra-operatively.

An answer to tissue deformation could lie closer to technology used within the commercial sector. The advanced facial recognition software used to simulate real-time Apple ‘Animojis’ [31] could be adapted to intra-operative kidneys: the software could delineate an optical real-time map of the kidney that would change with the active deformation.

8. Live imaging

Live imaging is an answer to capturing tissue deformation, as it allows real-time dynamic information on the kidney and removes the need to ‘estimate’ structural changes in the tissue. Ultrasound is one live imaging modality that has shown high sensitivity and specificity in identifying tumour margins [32, 33]. There have been several studies using live ultrasound (USS) to aid AR. Kang et al. [34] merged live laparoscopic ultrasound images onto stereoscopic video and showed image-to-video correlation accuracy of up to 2.76 mm. Kang et al. claim this aids depth perception and better visualisation of internal structures. Cheung et al. demonstrated that a fused video-USS model for phantom partial nephrectomy allowed a 1.1 mm tumour resection margin (with 2D fusion) for endophytic tumours [35].
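
The geometric core of such video-ultrasound fusion is a chain of calibrated transforms that carries an ultrasound pixel into the laparoscopic image. The sketch below assumes the calibration and tracking transforms are already known and a distortion-free pinhole camera; the matrix names are illustrative, not those of any published system.

```python
# Hedged sketch of the calibration chain behind video-ultrasound fusion.
# All transforms are assumed known from calibration/tracking; names illustrative.
import numpy as np

def us_pixel_to_video(uv, mm_per_px: float, probe_T_image: np.ndarray,
                      cam_T_probe: np.ndarray, K: np.ndarray):
    """Project an ultrasound pixel (u, v) into laparoscopic image coordinates.

    probe_T_image : 4 x 4 ultrasound-image-to-probe calibration transform.
    cam_T_probe   : 4 x 4 tracked pose of the probe in the camera frame.
    K             : 3 x 3 intrinsic matrix of the laparoscopic camera.
    """
    p_image = np.array([uv[0] * mm_per_px, uv[1] * mm_per_px, 0.0, 1.0])
    p_cam = cam_T_probe @ probe_T_image @ p_image   # point in the camera frame
    x = K @ p_cam[:3]                               # pinhole projection
    return x[:2] / x[2]                             # pixel in the video frame
```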

Singla et al. [36] showed in their study that the simulated healthy renal tissue excised was reduced from an average of 30.6 to 17.5 cm³ using intra-operative USS-based AR. This technique would be especially beneficial for critical cases like endophytic renal tumours, where most of the tumour lies below the organ surface (endophytic tumours currently have a complication rate of nearly 50%).

These are, however, all preliminary studies based on phantom models, which do not represent the true nature of the operation in vivo. A majority of studies have involved manual registration with labour-intensive methods that are currently unrealistic for in-vivo use. Cheung et al. found that although there was a 29% reduction in planning time with the USS-fused model, the tumour excision required longer operative times (being up to 39% slower than the conventional system) [35].

Some projects have combined all three aspects of AR named above. An example of this is PARIS (projector-based AR intracorporeal system)—a method by Edgcumbe et al. [37] combining a tracked projector, a tracked marker and a laparoscopic ultrasound transducer. This has been used in 16 simulated laparoscopic partial nephrectomies, where cancerous tumours were projected onto the kidney surface and this projection moved with the kidney. Ultrasound allowed live imaging of the intra-operative environment. The study showed better identification of the underlying anatomy and tumour boundaries, with a significant reduction in healthy tissue excised.

9. Specific aspects of renal cancer surgery focused upon in AR development

There are a few aspects of renal cancer surgery (highlighted by Detmer et al. [27]), that AR development has specifically focused on. These include precise tumour resection and safe selective arterial clamping.

We have covered the issue of precise tumour resection to preserve maximum healthy tissue throughout the chapter. However, Detmer et al. specifically mention some studies that tackle this very area. Ukimura and Gill [38] describe using different colours to signify increasing distance from the tumour, overlaid on the AR field of view. Another method uses contouring of the organ around the tumour margin to highlight the tumour itself. Uncertainty of the tumour margin has also been encoded by using different colours to signify certain and uncertain areas of the margin [27].
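
The colour-coding idea can be made concrete with a short sketch: shade each rendered surface point by its distance to the tumour boundary. This is a hypothetical illustration; the thresholds below are arbitrary, not clinically validated margins.

```python
# Hypothetical colour-coding sketch: shade surface points by their distance
# to the tumour boundary. The 5/10 mm thresholds are arbitrary illustrations.
import numpy as np

def margin_colours(surface_pts: np.ndarray, boundary_pts: np.ndarray,
                   warn_mm: float = 5.0, safe_mm: float = 10.0) -> np.ndarray:
    """RGB colour per surface point from its nearest-boundary distance (mm)."""
    diffs = surface_pts[:, None, :] - boundary_pts[None, :, :]
    dist = np.linalg.norm(diffs, axis=2).min(axis=1)   # nearest boundary point
    colours = np.empty((len(surface_pts), 3))
    colours[dist < warn_mm] = (1.0, 0.0, 0.0)                        # red
    colours[(dist >= warn_mm) & (dist < safe_mm)] = (1.0, 1.0, 0.0)  # amber
    colours[dist >= safe_mm] = (0.0, 1.0, 0.0)                       # green
    return colours
```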

Renal artery clamping is a crucial procedural step, as ischaemia needs to be limited to tumour-specific parenchyma. This is done by identifying and clamping only the tumour-specific arterial branches (usually tertiary or higher-order). This concept has been described as “zero-ischaemia” [39]. There have been some studies detecting renal vessels underneath the organ surface and several studies aiming to identify arterial branches for selective clamping, with variable success. These have been used for pre-operative planning and some intraoperative guidance. Development has mostly been based on manual registration techniques, with displays over the laparoscopic view or on a separate screen [27].

10. Current limitations

This chapter has described the different aspects of AR in renal cancer and the areas that have seen progress. However, there are current limitations holding back the use of AR technology in the clinical setting. Vávra et al. and Detmer et al. have highlighted some of these, as below:

  1. Pre-operative medical imaging: currently, all reconstructed images need preparing in advance by powerful processing systems. Not only is this expensive, it is time-consuming. With technological advancement, real-time high-resolution medical imaging and 3D reconstruction could become the norm. This would be displayed in real time intraoperatively and would markedly reduce or eliminate pre-op preparation times.

  2. Inattentional blindness: not seeing an unexpected object in the field of view. This has especially been an issue with 3D image registrations, where the surgeon does not register an object as they were not expecting it to be part of that procedural step. With the development of AR headsets, there will be more information displayed to the operating surgeon. This can be distracting and there needs to be a conscious effort towards only displaying vital information or switching between different sets of information. Further work needs to be done on reducing human factor issues and making the human-computer interface more ergonomic.

  3. Depth perception: whilst current image registration involves imaging modalities such as MRI and CT, the 2D or 3D imaging does not allow a full understanding of the intra-abdominal environment. Minimally invasive surgery has especially deprived the surgical experience of depth perception. Vávra et al. [20] suggest depth-sensing cameras could aid AR in the future.

  4. Hardware capacities: current AR is limited by the hardware capacities, and thus the processing power, of the computer systems it runs on. Vizua is a company developing “cloudification” and “application roaming,” where AR applications and data can be remotely managed to obtain the highest computing power and access to large datasets. A platform such as this could be incorporated into AR systems and answer issues of latency in image registration and access to good-quality imaging (live or pre-operative).

  5. Other issues mentioned include tissue deformation studies only focusing on one source of deformation, 3D imaging requiring better image registration and the issue of simulation sickness (whilst using heavy current AR headsets).

Regardless of which aspect of AR is being explored, there is little quantitative data on in-vivo procedures. Only 20 studies were found by a recent review [27] in which AR had been used in clinical practice, and only 9 of these included 10 or more patients. There is a crucial need for clinical validation to show improved patient outcomes and safety from using AR in renal interventions.

11. Future ideas/solutions

As Hughes-Hallett et al. mention, there is a “one-size-fits-all” approach in most AR developments. Single imaging and image registration modalities have been used in isolation in many systems. Every surgical case is different, however, and a combination of different modalities can provide a more accurate answer. Below are some examples of adjuncts that could aid current AR:

  1. NIRF—near-infrared fluorescence is a type of imaging where indocyanine green (ICG) dye is injected into the body and can be used to illuminate perfused renal parenchyma. This allows the surgeon to detect blood vessels under the organ surface and detect tissue abnormalities. Although NIRF has not shown much promise in predicting malignancy in partial nephrectomies, it has shown a reduction in global renal ischaemia. NIRF has been used in robot-assisted surgery to achieve super-selective arterial clamping—avoiding main arterial clamping in 65% of patients in a recent study. Infusion of ICG dye pre- and post-arterial clamping ensured that there was selective ischaemia only to the tumour region and that adequate renal perfusion was achieved after clamp removal. This imaging could be used in conjunction with AR to further aid live visual feedback and organ tracking and to achieve better post-op renal function [40].

  2. Imperial College London’s iKnife could be used in conjunction with AR [41]. This ‘intelligent knife’ is a surgical scalpel that chemically tests the tissue it comes into contact with. It uses Rapid Evaporative Ionisation Mass Spectrometry (REIMS) for real-time analysis of the aerosols created by diathermy of tissues. The iKnife has been used in gynaecological tissues to distinguish between normal, borderline and malignant tissue [42], and it could be used in partial nephrectomies to give real-time feedback for a precise tumour margin.

  3. AR headset—this is a device that is being engineered simultaneously in many major US hospitals. Dr. Varshney and Dr. Murthi are developing one such headset with the engineering team at the “Augmentarium” (University of Maryland). They hope to develop a system where a headset such as the Microsoft HoloLens can be worn by the surgeon, and real-time USS of the patient, or vital signs and patient data, can be overlaid on the field of view. This would drastically reduce the number of displays a surgeon usually has to track during an operation. Used in conjunction with dynamic image projection, the AR headset would be a good answer to cover the three abovementioned aspects of AR in partial nephrectomies [43].

    The AR headset hopes to eliminate any obstructions in the surgeon’s view as compared to conventional methods. Furthermore, voice recognition and gesture recognition development would enable hands-free control of the device—which would allow the surgeon to interact with the AR whilst maintaining a sterile environment [20] (Figure 3).

  4. 3D printing—model replications of the patient’s kidney can be printed using pre-operative CT/MRIs, and these can be used to perform simulated operations prior to placing a knife on the patient’s skin (a sketch of the CT-to-print pipeline follows Figure 3). SIMPeds 3D Print at the Boston Children’s Hospital offers exactly this—rapid printing and prototyping for nearly any organ in the human body [44]. Examples include replicating and operating on difficult paediatric brain tumours [45], facial reconstructions and orthopaedic surgeries, amongst many others [46]. This has allowed surgeons to simulate a realistic assessment of the individual’s organ, which can be felt, touched and cut at precise margins. 3D-printed surgical planning of partial nephrectomies has been explored by Zhang et al. [46], where face and content validity were obtained from 4 experienced laparoscopic urologists. A pilot study by Silberstein et al. [47] envisioned that 3D models could enhance the surgeon’s (and patient’s) understanding of the individual’s renal malignancy anatomy—this would be especially beneficial in difficult situations such as anatomical anomalies and precise segmental artery clamping [48].

Figure 3.

AR of the future. AR involving additional input from live USS imaging, organ trackers and other vital observations and patient data, all being fed into the AR headset—providing a hands-free platform.
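
As a concrete illustration of the 3D-printing step in point 4, the sketch below outlines one plausible route from a segmented CT volume to a printable file, using scikit-image’s marching cubes and a plain ASCII STL writer. The mask_to_stl() helper and its parameters are illustrative assumptions, not any hospital’s pipeline.

```python
# Hedged sketch: from a binary CT segmentation mask to a printable STL mesh.
# mask_to_stl() is a hypothetical helper; clinical pipelines add smoothing,
# mesh validation and regulatory checks before anything is printed.
import numpy as np
from skimage import measure  # scikit-image

def mask_to_stl(mask: np.ndarray, voxel_size_mm=(1.0, 1.0, 1.0),
                path: str = "kidney.stl") -> None:
    """Convert a 3D binary kidney mask into an ASCII STL surface mesh."""
    verts, faces, normals, _ = measure.marching_cubes(
        mask.astype(np.float32), level=0.5, spacing=voxel_size_mm)
    with open(path, "w") as f:
        f.write("solid kidney\n")
        for tri in faces:
            n = normals[tri].mean(axis=0)            # approximate facet normal
            n = n / max(np.linalg.norm(n), 1e-12)
            f.write(f"facet normal {n[0]} {n[1]} {n[2]}\n outer loop\n")
            for v in verts[tri]:
                f.write(f"  vertex {v[0]} {v[1]} {v[2]}\n")
            f.write(" endloop\nendfacet\n")
        f.write("endsolid kidney\n")
```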

Extrapolating further from this idea, 3D-printed kidneys could also allow remote surgical procedures. An idea in conjunction with this was put forward by Dr. Murthi at a recent VR and AR applications gathering at the Newseum, Washington, D.C. [8]. She foresaw VR and AR working together to support patient care. An initial AR assessment of the patient, including imaging and medical data, would allow the clinician to assess the patient’s condition (in this case, anatomy), and a remote VR system could allow a second clinician to see what the initial AR has shown and consequently advise and provide insight. An augmented nephrectomy system could benefit surgeons whereby, instead of advising, they could operate remotely on replicated 3D models (representing in-vivo kidneys). Initial AR collected locally from the patient could be remotely projected as VR onto a 3D model, and a surgeon could perform a partial nephrectomy that could be translated to the real kidney with the help of local robotic da Vinci machines. This model would allow constant feedback between the AR and VR systems, and remote operations could be an answer to the lack of surgical resources in healthcare-deprived areas around the world.

12. Conclusion

Renal cancer is the 7th most common cancer in the UK [49], and its surgery can really benefit from this emerging symbiotic relationship between surgeon and AR. Since the first uses of AR for partial nephrectomy, there have been reported improvements in familiarisation with the patient’s anatomy and in the practicality of AR [38]. However, as Bernhardt et al. discuss in a review of laparoscopic AR, there is much advancement to be made in image registration, live tracking and depth perception. Detmer et al., in their review of AR and VR technology in renal cancers, also highlight the need to resolve human factors and the need for large-scale clinical studies, which are currently sparse. In conclusion, recent AR development has worked towards systems aiming to be on par with conventional navigational techniques. Whilst some technology is achieving this in isolation, barriers of validation, cost evaluation and practical application still lie in the way. In summary, there is much to be achieved before AR systems are precise and safe enough to be integrated into regular clinical practice.

References

  1. Interaction Design Foundation. Augmented Reality—The Past, The Present and The Future [Accessed: 01 October 2018]
  2. Mesko B. The Top 9 Augmented Reality Companies in Healthcare. Available from: http://scholar.aci.info/view/156566a01ed00110002/15e38d4c7c500013f25998b
  3. Augmented reality system lets doctors see under patients' skin without the scalpel. Medical Devices & Surgical Technology Week. 2018. p. 164
  4. Augmented reality helps surgeons to 'see through' tissue and reconnect blood vessels. Medical Devices & Surgical Technology Week. 2018. p. 213
  5. Wei NJ, Dougherty B, Myers A, Badawy SM. Using Google Glass in surgical settings: Systematic review. JMIR mHealth and uHealth. 2018;6(3):e54. DOI: 10.2196/mhealth.9409. Available from: https://www.ncbi.nlm.nih.gov/pubmed/29510969
  6. Moglia A, Ferrari V, Morelli L, Ferrari M, Mosca F, Cuschieri A. A systematic review of virtual reality simulators for robot-assisted surgery. 2016. Available from: http://discovery.dundee.ac.uk/portal/en/theses/a-systematic-review-of-virtual-reality-simulators-for-robot-assisted-surgery(00243bee-f90b-486d-95cd-e0fcb51d8d15).html
  7. Schulz GB, Grimm T, Buchner A, Jokisch F, Casuscelli J, Kretschmer A, et al. Validation of a high-end virtual reality simulator for training transurethral resection of bladder tumors. Journal of Surgical Education. 2018. DOI: 10.1016/j.jsurg.2018.08.001. Available from: https://www.sciencedirect.com/science/article/pii/S1931720418304021
  8. Likowski A. Augmenting Reality in the Operating Room. Available from: https://www.umaryland.edu/news/archived-news/march-2017/newspressreleaseshottopics/augmenting-reality-in-the-operating-room.php [Accessed: 12 September 2018]
  9. Jimenez Y, Cumming S, Wang W, Stuart K, Thwaites D, Lewis S. Patient education using virtual reality increases knowledge and positive experience for breast cancer patients undergoing radiation therapy. Supportive Care in Cancer. 2018;26(8):2879-2888. DOI: 10.1007/s00520-018-4114-4. Available from: https://www.ncbi.nlm.nih.gov/pubmed/29536200
  10. Rai A, Scovell JM, Xu A, Balasubramanian A, Siller R, Kohn T, et al. Patient-specific virtual simulation—A state of the art approach to teach renal tumor localization. Urology. 2018;120:42-48. DOI: 10.1016/j.urology.2018.04.043. Available from: https://www.sciencedirect.com/science/article/pii/S0090429518305405
  11. Makiyama K, Nagasaka M, Inuiya T, Takanami K, Ogata M, Kubota Y. Development of a patient-specific simulator for laparoscopic renal surgery. International Journal of Urology. 2012;19(9):829-835. DOI: 10.1111/j.1442-2042.2012.03053.x
  12. Ueno D, Makiyama K, Yamanaka H, Ijiri T, Yokota H, Kubota Y. Prediction of open urinary tract in laparoscopic partial nephrectomy by virtual resection plane visualization. BMC Urology. 2014;14(1):47. DOI: 10.1186/1471-2490-14-47. Available from: https://www.ncbi.nlm.nih.gov/pubmed/24927795
  13. Gill IS, Kavoussi LR, Lane BR, Blute ML, Babineau D, Colombo JR, et al. Comparison of 1,800 laparoscopic and open partial nephrectomies for single renal tumors. The Journal of Urology. 2007;178(1):41-46. DOI: 10.1016/j.juro.2007.03.038. Available from: https://www.clinicalkey.es/playcontent/1-s2.0-S0022534707005502
  14. Tan H, Norton EC, Ye Z, Hafez KS, Gore JL, Miller DC. Long-term survival following partial vs radical nephrectomy among older patients with early-stage kidney cancer. JAMA. 2012;307(15):1629-1635. DOI: 10.1001/jama.2012.475. Available from: http://dx.doi.org/10.1001/jama.2012.475
  15. Ellison JS, Montgomery JS, Wolf JS, Hafez KS, Miller DC, Weizer AZ. A matched comparison of perioperative outcomes of a single laparoscopic surgeon versus a multisurgeon robot-assisted cohort for partial nephrectomy. The Journal of Urology. 2012;188(1):45-50. DOI: 10.1016/j.juro.2012.02.2570. Available from: https://www.clinicalkey.es/playcontent/1-s2.0-S0022534712029692
  16. Chauvet P, Collins T, Debize C, Novais-Gameiro L, Pereira B, Bartoli A, et al. Augmented reality in a tumor resection model. Surgical Endoscopy. 2018;32(3):1192-1201. DOI: 10.1007/s00464-017-5791-7. Available from: https://www.ncbi.nlm.nih.gov/pubmed/28812157
  17. Gill IS, Aron M, Gervais DA, Jewett MAS. Small renal mass. The New England Journal of Medicine. 2010;362(7):624-634. DOI: 10.1056/NEJMcp0910041. Available from: http://content.nejm.org/cgi/content/extract/362/7/624
  18. Hughes-Hallett A, Mayer EK, Marcus HJ, Cundy TP, Pratt PJ, Darzi AW, et al. Augmented reality partial nephrectomy: Examining the current status and future perspectives. Urology. 2014;83(2):266-273. DOI: 10.1016/j.urology.2013.08.049. Available from: https://www.clinicalkey.es/playcontent/1-s2.0-S0090429513011333
  19. Edgcumbe P, Pratt P, Yang G, Nguan C, Rohling R. Pico lantern: Surface reconstruction and augmented reality in laparoscopic surgery using a pick-up laser projector. Medical Image Analysis. 2015;25(1):95-102. DOI: 10.1016/j.media.2015.04.008. Available from: https://www.sciencedirect.com/science/article/pii/S1361841515000584
  20. Vávra P, Roman J, Zonča P, Ihnát P, Němec M, Kumar J, et al. Recent development of augmented reality in surgery: A review. Journal of Healthcare Engineering. 2017;2017:4574172-4574179. DOI: 10.1155/2017/4574172. Available from: https://www.ncbi.nlm.nih.gov/pubmed/29065604
  21. Okamoto T, Onda S, Yanaga K, Suzuki N, Hattori A. Clinical application of navigation surgery using augmented reality in the abdominal field. Surgery Today. 2015;45(4):397-406. DOI: 10.1007/s00595-014-0946-9. Available from: https://www.ncbi.nlm.nih.gov/pubmed/24898629
  22. Nakamoto M, Ukimura O, Gill I, Mahadevan A, Miki T, Hashizume M, et al. Realtime organ tracking for endoscopic augmented reality visualization using miniature wireless magnetic tracker. In: Medical Imaging and Augmented Reality. Berlin, Heidelberg: Springer; 2008. pp. 359-366
  23. Sutherland SE, Resnick MI, Maclennan GT, Goldman HB. Does the size of the surgical margin in partial nephrectomy for renal cell cancer really matter? The Journal of Urology. 2002;167(1):61-64. DOI: 10.1016/S0022-5347(05)65383-9. Available from: https://www.sciencedirect.com/science/article/pii/S0022534705653839
  24. Yip MC, Lowe DG, Salcudean SE, Rohling RN, Nguan CY. Tissue tracking and registration for image-guided surgery. IEEE Transactions on Medical Imaging. 2012;31(11):2169-2182. DOI: 10.1109/TMI.2012.2212718. Available from: https://ieeexplore.ieee.org/document/6264103
  25. Edgcumbe P, Singla R, Pratt P, Schneider C, Nguan C, Rohling R. Augmented Reality Imaging for Robot-Assisted Partial Nephrectomy Surgery. Vol. 9805. Switzerland: Springer International Publishing; 2016. pp. 139-150. Available from: https://link.springer.com/content/pdf/10.1007%2F978-3-319-43775-0_13.pdf
  26. Schneider C, Nguan C, Longpre M, Rohling R, Salcudean S. Motion of the kidney between preoperative and intraoperative positioning. IEEE Transactions on Biomedical Engineering. 2013;60(6):1619-1627. DOI: 10.1109/TBME.2013.2239644. Available from: https://ieeexplore.ieee.org/document/6410004
  27. Detmer FJ, Hettig J, Schindele D, Schostak M, Hansen C. Virtual and augmented reality systems for renal interventions: A systematic review. IEEE Reviews in Biomedical Engineering. 2017;10:78-94. DOI: 10.1109/RBME.2017.2749527. Available from: https://ieeexplore.ieee.org/document/8026164
  28. Altamar HO, Ong RE, Glisson CL, Viprakasit DP, Miga MI, Herrell SD, et al. Kidney deformation and intraprocedural registration: A study of elements of image-guided kidney surgery. Journal of Endourology. 2011;25(3):511-517. DOI: 10.1089/end.2010.0249. Available from: http://www.liebertonline.com/doi/abs/10.1089/end.2010.0249
  29. Baumhauer M, Simpfendörfer T, Müller-Stich B, Teber D, Gutt C, Rassweiler J, et al. Soft tissue navigation for laparoscopic partial nephrectomy. International Journal of Computer Assisted Radiology and Surgery. 2008;3(3):307-314. DOI: 10.1007/s11548-008-0216-7
  30. Teber D, Guven S, Simpfendörfer T, Baumhauer M, Güven EO, Yencilek F, et al. Augmented reality: A new tool to improve surgical accuracy during laparoscopic partial nephrectomy? Preliminary in vitro and in vivo results. European Urology. 2009;56(2):332-338. DOI: 10.1016/j.eururo.2009.05.017. Available from: https://www.clinicalkey.es/playcontent/1-s2.0-S0302283809005211
  31. Apple's Animoji Will Teach You To Love Face Tracking, For Better or Worse. Available from: http://scholar.aci.info/view/1476b1d538930fb01bd/15e7d4ac4f8000153c16f41
  32. Doerfler A, Cerantola Y, Meuwly J-Y, Lhermitte B, Bensadoun H, Jichlinski P. Ex vivo ultrasound control of resection margins during partial nephrectomy. The Journal of Urology. 2011;186(6):2188-2193. DOI: 10.1016/j.juro.2011.07.100. Available from: https://www.clinicalkey.es/playcontent/1-s2.0-S0022534711045307
  33. Choyke PL, Pavlovich CP, Daryanani KD, Hewitt SM, Linehan WM, Walther MM. Intraoperative ultrasound during renal parenchymal sparing surgery for hereditary renal cancers: A 10-year experience. The Journal of Urology. 2001;165(2):397-400. DOI: 10.1097/00005392-200102000-00010. Available from: https://www.sciencedirect.com/science/article/pii/S0022534705667079
  34. Kang X, Azizian M, Wilson E, Wu K, Martin A, Kane T, et al. Stereoscopic augmented reality for laparoscopic surgery. Surgical Endoscopy. 2014;28(7):2227-2235. DOI: 10.1007/s00464-014-3433-x. Available from: https://www.ncbi.nlm.nih.gov/pubmed/24488352
  35. Cheung CL, Wedlake C, Moore J, Pautler SE, Peters TM. Fused video and ultrasound images for minimally invasive partial nephrectomy: A phantom study. In: Medical Image Computing and Computer-Assisted Intervention (MICCAI). 2010;13(Pt 3):408. Available from: https://www.ncbi.nlm.nih.gov/pubmed/20879426
  36. Singla R, Edgcumbe P, Pratt P, Nguan C, Rohling R. Intra-operative ultrasound-based augmented reality guidance for laparoscopic surgery. Healthcare Technology Letters. 2017;4(5):204-209. DOI: 10.1049/htl.2017.0063. Available from: https://www.ncbi.nlm.nih.gov/pubmed/29184666
  37. Edgcumbe P, Singla R, Pratt P, Schneider C, Nguan C, Rohling R. Follow the light: Projector-based augmented reality intracorporeal system for laparoscopic surgery. Journal of Medical Imaging. 2018;5(2):021216. DOI: 10.1117/1.JMI.5.2.021216. Available from: http://www.dx.doi.org/10.1117/1.JMI.5.2.021216
  38. Ukimura O, Gill IS. Imaging-assisted endoscopic surgery: Cleveland Clinic experience. Journal of Endourology. 2008;22(4):803-810. DOI: 10.1089/end.2007.9823. Available from: https://www.ncbi.nlm.nih.gov/pubmed/18366316
  39. Ukimura O, Nakamoto M, Gill IS. Three-dimensional reconstruction of renovascular-tumor anatomy to facilitate zero-ischemia partial nephrectomy. European Urology. 2011;61(1):211-217. DOI: 10.1016/j.eururo.2011.07.068. Available from: https://www.clinicalkey.es/playcontent/1-s2.0-S030228381100978X
  40. Krane LS, Hemal AK. Is indocyanine green dye useful in robotic surgery? Nature Reviews Urology. 2014;11(1):12-14. DOI: 10.1038/nrurol.2013.303
  41. Imperial College London. Imperial's iKnife inspires future doctors. ENP Newswire. 2017
  42. Phelps DL, Balog J, Gildea LF, Bodai Z, Savage A, El-Bahrawy MA, et al. The surgical intelligent knife distinguishes normal, borderline and malignant gynaecological tissues using rapid evaporative ionisation mass spectrometry (REIMS). The British Journal of Cancer. 2018;118(10):1349-1358. DOI: 10.1038/s41416-018-0048-3. Available from: https://www.ncbi.nlm.nih.gov/pubmed/29670294
  43. Murthi S, Varshney A. How augmented reality will make surgery safer. Available from: https://hbr.org/2018/03/how-augmented-reality-will-make-surgery-safer [Accessed: 03 October 2018]
  44. Boston Children's Hospital, Simulator Program. SIMPeds 3D Print [Accessed: 12 August 2018]
  45. Verge Staff, Boston Children's Hospital, Simulator Program. A 3D Printed Brain Saved this Toddler's Life. Available from: http://simpeds.org/news/a-3d-printed-brain-saved-this-toddlers-life/ [Accessed: 12 September 2018]
  46. Zhang Y, Ge H-w, Li N-c, Yu C-f, Guo H-f, Jin S-h, et al. Evaluation of three-dimensional printing for laparoscopic partial nephrectomy of renal tumors: A preliminary report. World Journal of Urology. 2016;34(4):533-537. DOI: 10.1007/s00345-015-1530-7. Available from: https://www.ncbi.nlm.nih.gov/pubmed/25841361
  47. Silberstein JL, Maddox MM, Dorsey P, Feibus A, Thomas R, Lee BR. Physical models of renal malignancies using standard cross-sectional imaging and 3-dimensional printers: A pilot study. Urology. 2014;84(2):268-273. DOI: 10.1016/j.urology.2014.03.042. Available from: https://www.clinicalkey.es/playcontent/1-s2.0-S0090429514004555
  48. Chen Y, Li H, Wu D, Bi K, Liu C. Surgical planning and manual image fusion based on 3D model facilitate laparoscopic partial nephrectomy for intrarenal tumors. World Journal of Urology. 2014;32(6):1493-1499. DOI: 10.1007/s00345-013-1222-0. Available from: https://www.ncbi.nlm.nih.gov/pubmed/24337151
  49. Cancer Statistics Registrations, Series MB1 Number 47: Registrations of cancer diagnosed in 2016, England. Key Non-Parliamentary Papers; 2018
