Open access peer-reviewed chapter

Current Status and Future Perspectives for Augmented Reality Navigation in Neurosurgery and Orthopedic Surgery

Written By

Quentin Neuville, Thierry Scheerlinck and Johnny Duerinck

Submitted: 29 June 2023 Reviewed: 10 July 2023 Published: 19 August 2023

DOI: 10.5772/intechopen.1002344

From the Edited Volume

Applications of Augmented Reality - Current State of the Art

Pierre Boulanger


Abstract

Augmented reality (AR) for surgical navigation is a relatively new but rapidly evolving and promising field. AR can add navigation to the surgical workflow in a more intuitive way, improving ergonomics and precision. Neurosurgery has a long tradition in computer-assisted surgical navigation and was the first discipline to use this technology to navigate interventions. Orthopedic surgery is following this path with a wide range of new use cases currently in development. In this chapter, we will describe the evolution of AR as a surgical navigation tool, focusing on application developed for neurosurgery and orthopedic surgery. Based on our own experience, we will also discuss future perspectives and the hurdles to be overcome before the widespread breakthrough of this technology.

Keywords

  • augmented reality
  • AR
  • neurosurgery
  • orthopedic surgery
  • navigation

1. Introduction

Due to limited imaging possibilities in the early days of surgery, large incisions were often made to expose as much normal anatomy as possible for orientation purposes. This led to longer and more extensive interventions with steep learning curves. It also resulted in more patient discomfort, a higher likelihood of complications and longer hospital stays. The advent of intra-operative imaging modalities such as fluoroscopy, CT and MRI, but also of computer-assisted navigation (CAN) systems, has been a milestone towards more accurate and minimally invasive surgery. CAN systems were originally developed for brain surgery, which is why they are often referred to as neuronavigation systems. Nowadays, CAN systems have a wide range of applications in different surgical disciplines, such as neurosurgery, orthopedic surgery, oral and maxillofacial surgery and otolaryngology. However, despite the widespread availability of CAN systems in North America and Europe, on average only 11% of surgeons use them routinely [1]. This is attributed to some fundamental shortcomings, first of all the bulkiness of CAN devices: they take up a lot of space in the operating theater, as they require a computer system, an external tracking camera and one or more screens [2]. Secondly, the systems require a time-consuming setup, including a patient registration procedure [3]. A third drawback is that the surgeon is typically looking at the screen, which shifts attention away from the patient and presents a potential hazard when instruments - such as the navigation pointer itself - are held in the operative field [4, 5]. Fourth, outside-in tracking by an external optical tracking camera (mostly infrared) often suffers line of sight (LOS) interruptions when people or objects in the operating room enter the line of sight of the camera [6, 7].
The significant cost of these systems might also present a hurdle holding surgeons back from fully adopting computer-assisted navigation for their surgical interventions [8, 9].

Augmented reality (AR) is a technology that superimposes computer-generated information or images on a user's view of the real world. Unlike virtual reality (VR), AR does not occlude the real environment, but rather overlays virtual information on it. AR-based navigation is therefore a promising technique for open and minimally invasive surgical procedures, as it does not block the surgical field but can project additional data and/or hidden anatomy on top of it. To turn an augmented reality device into a navigation system, three additional steps are needed: calibration of the AR device, tracking, and image-to-patient registration, which matches the AR models created from pre-operative segmentations of medical imaging data to the intra-operative object [10, 11]. Calibration of the AR device is necessary to determine the AR display coordinate system in relation to the outside world. During tracking, the real-time position and orientation of the patient and surgical instruments are determined in relation to the AR device; this is necessary to maintain good alignment between the virtual and physical environment and to allow accurate navigation. AR-based navigation has the potential to solve some of the shortcomings of conventional CAN and thus to increase the use of navigation in routine interventions. In addition, it can be complementary to commercially available CAN systems in procedures where CAN is routinely used. The goal of this chapter is to discuss the AR technologies being developed for neurosurgery and orthopedic surgery, to review the current evidence regarding their benefit, to consider the challenges limiting implementation in clinical practice and to outline the future perspectives of this novel technology.
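Conceptually, the three steps above amount to chaining rigid transforms: a point from the pre-operative image is mapped into patient coordinates (registration), then into world coordinates (tracking), then into display coordinates (calibration). A minimal sketch with 4x4 homogeneous matrices; all transform values below are hypothetical placeholders, not those of any real device:

```python
import numpy as np

def rigid_transform(R, t):
    """4x4 homogeneous matrix from a 3x3 rotation and a translation (metres)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical transforms, one per step (identity rotations for clarity):
T_display_world = rigid_transform(np.eye(3), [0.0, 0.0, 0.5])    # calibration
T_world_patient = rigid_transform(np.eye(3), [0.1, 0.0, 1.0])    # tracking
T_patient_image = rigid_transform(np.eye(3), [-0.2, 0.05, 0.0])  # registration

# A point from the pre-operative image dataset, in homogeneous coordinates
p_image = np.array([0.02, 0.03, 0.04, 1.0])

# Chaining the three transforms places the virtual point in display coordinates
p_display = T_display_world @ T_world_patient @ T_patient_image @ p_image
```

Tracking continuously updates the middle transform, which is why any drift there directly misplaces the overlay.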


2. What types of AR-based navigation are currently being developed for neuro- and orthopedic surgery?

There is significant variability in the way AR-based navigation is being deployed, although all systems attempt to merge imaging data with the surgical field. However, not all of them achieve this through the calibration, tracking and registration steps described previously.

2.1 Types of visualizations

There are three main techniques for displaying AR models for surgical guidance: (i) on a head-mounted display (HMD) [8, 12, 13], (ii) via a projector [14], and (iii) on an external display [15]. In our opinion, techniques based on projectors or on external displays such as phones, tablets or screens are suboptimal: they retain many of the drawbacks of the commercially available navigation systems discussed earlier. Therefore, we prefer using an HMD, the only method capable of visualizing AR in 3D on top of the patient's anatomy. As such, surgeons can view holograms and the surgical field simultaneously, without lag or attention shifts.

2.2 Types of registration

One of the key issues of AR-based surgical navigation is image-to-patient registration accuracy. Although the requirements depend on the type of surgical intervention, it is essential to maintain an accurate image registration throughout the whole procedure. The most commonly used registration methods for AR in neurosurgery and orthopedics are manual registration, point-based registration and surface-based registration. During manual registration, the surgeon visually positions the virtual object on top of the physical object. This is the least accurate registration method and is not suitable for surgeries requiring high accuracy or for minimally invasive procedures. In point-based registration, an algorithm looks for the best fit between a set of preoperatively defined points in the image dataset and corresponding anatomic intra-operative points [16]. Surface-based registration is a more sophisticated and accurate version of point-based registration and is performed in two steps [15]. First, a rough initial alignment is obtained using point-based registration. In a second step, a denser point cloud or surface mesh is obtained and matched to the geometric shape of the preoperative model using a surface-based alignment process, such as an iterative closest point (ICP) algorithm. Most commercially available navigation systems for neurosurgery and orthopedic surgery use this technique, as it provides a practical and accurate alignment. In order to collect intra-operative points, the surgeon needs a tracked pointer, i.e., a stylus that can be tracked by the AR device and that allows defining the 3D coordinates of points in the real world. In the future, automatic point-cloud or surface-mesh collection could be combined with the extraction of anatomical landmarks based on deep learning methods. When performed in real time, this could replace manual point-based registration methods and accelerate the registration process without the need for manual interaction [15]. Manual registration can reach accuracies of 4–6 mm, while point-based and surface-based registration reach accuracies of 2–3 mm [16, 17, 18, 19, 20, 21]. For point- and surface-based registration, the accuracy depends on the number of points collected and on their distribution relative to the target [22].
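The core of point-based registration - finding the rigid transform that best maps preoperative fiducials onto their intra-operative counterparts - can be computed in closed form with a least-squares (Kabsch-style) fit. A minimal sketch on synthetic fiducials (an illustration of the general method, not the authors' implementation):

```python
import numpy as np

def point_based_registration(image_pts, patient_pts):
    """Least-squares rigid fit (Kabsch) mapping pre-operative image points
    onto corresponding intra-operative patient points (Nx3 arrays)."""
    ci = image_pts.mean(axis=0)
    cp = patient_pts.mean(axis=0)
    H = (image_pts - ci).T @ (patient_pts - cp)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cp - R @ ci
    return R, t

# Synthetic check: transform known fiducials and recover the transform
rng = np.random.default_rng(0)
img = rng.uniform(-0.1, 0.1, size=(6, 3))         # six fiducials (metres)
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.05, -0.02, 0.30])
pat = img @ R_true.T + t_true

R, t = point_based_registration(img, pat)
# Fiducial registration error: mean residual after applying the fit
fre = np.linalg.norm(img @ R.T + t - pat, axis=1).mean()
```

Surface-based registration then refines such an initial fit by iterating closest-point matching (ICP) against a dense point cloud, which is why the quality of the initial point-based alignment matters.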

2.3 Types of tracking

After registration, the virtual objects and patient anatomy are aligned. However, to maintain this alignment, movements of the patient and/or the AR device must be calculated and compensated for in real time. During AR-guided surgery, this can be based on the Simultaneous Localization and Mapping (SLAM) tracking system integrated in the AR-HMD, or on images from other onboard and/or external cameras. Most commercially available AR-HMDs, including the HoloLens II (Microsoft, Redmond, USA), come with SLAM tracking. This is a self-localization technology that captures changes in the surroundings with the integrated cameras and displays virtual 3D images in a fixed relation to the real world [15]. However, SLAM tracking can result in an important drift of the virtual objects, and its tracking accuracy is too low for surgical use. For example, when the observer moves, the HoloLens has a mean perceived drift between 4.39 and 6 mm [19, 23]. Therefore, several research groups refine SLAM tracking with information from internal or external cameras to track markers in the surgical field. In most cases, these markers are tracked with the RGB camera (e.g., Vuforia). This results in a mean drift of 1.41 mm, a 68% improvement compared to SLAM tracking [19]. However, after experimenting with that method, we felt it was impractical and lacked accuracy for medical navigation applications. Tracking based on infrared cameras and retroreflective spheres - the method used by most commercially available CAN systems - is more accurate (mean shift of 0.809 to 2.3 mm and angular errors of 1.038°) [13, 24, 25]. Most research groups use an external IR camera, which presents the same shortcomings as commercially available systems: potential LOS interruptions, the need for large and expensive external devices and a lack of usability outside the OR. In addition, when using an external camera, the AR device itself must be equipped with markers so that its position can be monitored.
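Marker-based refinement of SLAM tracking boils down to re-anchoring the patient marker in world coordinates every frame: the marker pose seen by the onboard camera is composed with the SLAM-estimated headset pose, so the hologram stays locked to the patient as the surgeon moves. A minimal sketch with hypothetical poses (the camera offset and all numbers are illustrative, not an actual HoloLens calibration):

```python
import numpy as np

def pose(R, t):
    """4x4 homogeneous matrix from a 3x3 rotation and a translation (metres)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(deg):
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0, 0.0, 1.0]])

# Hypothetical fixed offset of the tracking camera within the headset
T_hmd_cam = pose(np.eye(3), [0.0, 0.05, 0.0])

def marker_in_world(T_world_hmd, T_cam_marker):
    """Compose the SLAM headset pose with the camera-frame marker detection
    to re-anchor the patient marker in world coordinates each frame."""
    return T_world_hmd @ T_hmd_cam @ T_cam_marker

# Frame 1: headset at the world origin, marker 0.6 m in front of the camera
T1 = marker_in_world(pose(np.eye(3), [0.0, 0.0, 0.0]),
                     pose(np.eye(3), [0.0, 0.0, 0.6]))

# Frame 2: the surgeon steps 0.2 m sideways and turns 10 degrees. The camera
# now sees the marker at a different relative pose (simulated here from the
# known world pose), yet composing the transforms recovers the same
# world-frame marker pose, so the hologram stays on the patient.
T_world_hmd2 = pose(rot_z(10.0), [0.2, 0.0, 0.0])
T_cam_marker2 = np.linalg.inv(T_hmd_cam) @ np.linalg.inv(T_world_hmd2) @ T1
T2 = marker_in_world(T_world_hmd2, T_cam_marker2)
```

In practice, errors in either the SLAM pose or the marker detection propagate through this chain, which is why marker tracking accuracy (RGB vs. IR) dominates the residual drift figures quoted above.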

2.4 How we do it

We developed an IR tracking method based on the built-in IR camera of the HoloLens II. This “inside-out” tracking method allows tracking of IR reflective markers, bypassing the need for an external camera (Figure 1). The tracking and registration accuracy of our setup was first evaluated using a phantom head. In total, 20 phantom registrations were performed in an operating room setting: 10 with the AR-HMD and 10 with a conventional neuronavigation system (Brainlab Curve 2; Brainlab AG, Munich, Germany). Registration errors remained below 2.0 mm and 2.0° for both AR-HMD based navigation and conventional neuronavigation, with no significant difference between the two systems [18]. Tracking from inside the AR device resolves some important drawbacks of commercially available navigation systems. First, it avoids the need to equip the AR device with a marker frame. Second, it prevents external LOS interruption. Third, it provides a fully integrated, mobile and ergonomic navigation setup that can easily be used in- or outside the operating theater. As such, it has the potential to increase the use of navigation and thus the accuracy of many routine procedures that are currently performed without navigation.

Figure 1.

Tracking of IR reflective markers from the HoloLens II.


3. Applications in neurosurgery and orthopedic surgery

Most data regarding AR-based navigation are based on pre-clinical experiments on phantoms and cadavers or on small proof-of-concept trials. In the field of neurosurgery, experiments for AR-based navigation focused on identifying lesions, guiding tumor resection, guiding cranial biopsy, guiding external ventricular drain insertion and for preoperative planning of skin incisions and craniotomies [2, 15, 18, 26, 27]. In spine surgery, AR is mainly focusing on guiding pedicle screw placement and vertebroplasty [5, 28, 29, 30, 31, 32]. In orthopedics, AR has been described for trauma reconstruction, osteotomy, intramedullary nailing, arthroscopic surgery, oncology, K-wire implantation and arthroplasty [12, 20, 24, 33, 34, 35, 36, 37].

High-quality clinical data on the use of AR-navigation during surgery is currently lacking. Only a few clinical trials have been conducted to examine the accuracy of these systems in clinical practice and/or compare the use of an AR-based navigation system with conventional commercially available navigation systems for neurosurgical or orthopedic procedures [5, 8, 20, 29, 38, 39, 40, 41, 42]. In our view, the main advantage of AR navigation is that it allows introducing navigation in procedures (or specific phases of an intervention) where navigation is currently not used or where it is performing sub-optimally, as for example, with external ventricular drain placement or craniotomy planning.

3.1 Benefits of AR-based navigation

3.1.1 Improved accuracy of the surgical intervention

The introduction of dedicated, ergonomic, easy-to-use and portable AR-based navigation solutions could promote their use in- or outside the operating theater. As such, they could increase the accuracy of a large number of routine procedures that are performed today without the help of navigation. Nevertheless, many research groups focus on the use of AR-based navigation for interventions that are currently performed with CAN systems, which allows comparing their respective accuracies. Our consortium compared an internally developed AR-HMD based system for intracranial tumor resection planning to the Brainlab Curve 2 system (Brainlab AG, Munich, Germany). Image-to-patient registration errors remained below 2.0 mm and 2.0° for both AR-HMD based navigation and conventional neuronavigation, with no significant difference between both systems. However, for delineation of the tumor margins on the patient's skin in order to plan the incision and craniotomy, the AR-HMD based system was found to be superior in 65% of cases [18].

In a study evaluating AR for spinal pedicle screw placement, Molina et al. placed a total of 113 implants percutaneously (93 pedicle screws and 20 Jamshidi needles) in five cadavers using the XVision AR-HMD (Augmedics Ltd., Chicago, IL). The study reports an overall accuracy of 99.1% (Gertzbein-Robbins grade A or B; <2 mm pedicle breach) with only one medial pedicle breach in a thoracic vertebra (Gertzbein-Robbins grade C; >2 mm pedicle breach) [9]. They compared their results to the literature on manual and robotic CAN as well as freehand techniques. Overall, AR navigation was non-inferior to CAN and superior to freehand procedures. In a proof-of-concept trial, Molina et al. reported the first in-human use of the same system to insert six pedicle screws: a 78-year-old female underwent an L4-S1 posterior lumbar interbody fusion. Clinical accuracy was 100%, with a mean linear deviation of 2.07 mm and an angular deviation of 2.41° [8]. In a retrospective study comparing AR-guided and freehand pedicle screw placement, Elmi-Terander et al. found that the percentage of clinically accurate screws (Gertzbein-Robbins grades A and B) was significantly higher in the AR-guided group than in the freehand group (93.9% vs. 89.6%, p < 0.05) [39]. Other trials confirmed that AR-guided pedicle screw placement has a similar accuracy to CAN-guided placement and an improved accuracy compared to the freehand technique [5, 20, 29, 41].

Researchers focusing on orthopedic procedures report similar findings for total knee arthroplasty (TKA), total hip arthroplasty (THA) and periacetabular osteotomy. During TKA, Tsukada et al. compared the femoral cut accuracy in 31 patients undergoing an AR-guided procedure to a cohort of conventional TKA with an intramedullary guide; with AR navigation, the coronal alignment was more accurate than with the intramedullary guide [43]. Another prospective randomized controlled trial compared an AR-based portable hip navigation system to the portable HipAlign (OrthoAlign Inc., Aliso, USA) system to control cup version during THA. These authors report no differences between both systems in terms of cup inclination and only minor differences in cup anteversion [44]. Finally, Kiarostami et al. found that AR-guided periacetabular osteotomies were more accurate than freehand procedures, with a more prominent effect for less-experienced surgeons [45].

In general, we conclude that, compared to freehand procedures, AR-based navigation systems are more accurate during both neurosurgical and orthopedic procedures. Compared to CAN systems, however, the accuracy is similar, as demonstrated for pedicle screw placement as well as total hip and total knee arthroplasty.

3.1.2 Decrease of surgical time and radiation exposure

Augmented reality has the potential to shorten surgical time and reduce radiation exposure during procedures that are traditionally guided by fluoroscopy [32, 46, 47]. Trauma and minimally invasive surgery often require repeated fluoroscopic imaging to guide the intervention, and acquiring adequate and reproducible views comes at the cost of increased surgical time and radiation exposure. To address this, Unberath et al. proposed an AR-based solution for C-arm repositioning. In a phantom experiment mimicking pelvic trauma surgery, repositioning of the C-arm based on AR guidance led to a significant reduction in surgical time and radiation dose compared to a trial-and-error approach [47]. In another setting, AR navigation of percutaneous vertebroplasty was compared to a fluoroscopy-guided procedure in nine patients. Here, both surgeon radiation exposure and surgical time were reduced, while accuracy also improved [48]. Similar results were found in phantom trials on AR-guided insertion of distal locking screws following intramedullary nailing and AR-guided K-wire insertion [34, 36].

3.1.3 Benefits for teaching

Several researchers have also emphasized the importance of AR in teaching and in reducing the learning curve for surgical skills acquisition [41, 49]. Our consortium compared the effect of AR guidance and standardized training on the quality of external ventricular drain (EVD) placement by medical students on a phantom model. Our results showed a significantly higher number of good EVD placements (modified Kakarla scale grade 1) resulting from AR guidance, but not from training, in direct comparison with untrained freehand performance. Training improved the accuracy of EVD placement in the freehand group, but not in the AR-guided group. Another study found that, guided by AR, medical students were able to place acetabular cups without supervision as accurately as when receiving hands-on instruction from an expert [50]. In addition, the technology can be used to visualize the surgical anatomy in 3D and to practice the course of the procedure in advance.

3.2 Limitations

Several barriers have hindered the adoption of AR-HMD based navigation in daily surgical practice. First, because of a lack of high-quality clinical studies, there are concerns about the accuracy of AR-based navigation during a surgical procedure. Moreover, AR navigation validated in vitro could be difficult to implement in clinical practice, as strong scialytic light could interfere with AR tracking and visualization. Second, AR-based navigation is still a new technology in surgery and is currently not well integrated into the clinical workflow. To convince surgeons to use AR-HMDs in their daily practice, specific workflows must be developed that integrate this technology within well-established clinical pathways. Third, for each surgical procedure, careful consideration must be given to the steps in which the AR-HMD could add value. This will influence how and when it will be implemented, where and when the trackers will be placed, how the image-to-patient registration will be done and what should and should not be visualized. At some stages of the procedure, the surgeon may want to see both 3D segmentations of the patient anatomy and the pre-operative planning, while during other stages it may be useful to visualize only a planned trajectory or a target, to avoid interference of auxiliary information. Therefore, the surgeon needs the possibility to switch between various visualizations. As sterility is an issue, simple voice commands or hand gestures seem most appropriate for this, but they require specific programming and training. Fourth, concerns exist about the clarity and contrast of AR images and whether they are disruptive in the surgical setting [51]. The brightness of the AR overlay should be adjustable depending on the ambient light to prevent unintentional blindness. Moreover, overlaid images could distract the surgeon from important surgical events such as bleeding or unexpected objects entering the surgical field.
Another important limitation is the fact that most commercially available HMDs - like the HoloLens - are designed with focal lengths ranging from 2 meters to infinity, while surgeons typically work at a much closer distance. Therefore, the surgeon's eyes have to accommodate constantly when trying to view the physical and virtual world simultaneously [52, 53]. This could be challenging for surgeons wearing glasses and could be overcome by developing dedicated surgical headsets. Currently, several medical companies are working on specific AR-HMDs designed to work at close distances, such as the Magic Leap from Brainlab or the CART 3D from AR Spectra. Finally, multiple studies highlighted the learning curve associated with the introduction of AR-based navigation [50, 54]. However, this should not prevent surgeons from evaluating and adopting this technology.


4. Workflow: how we do it

4.1 External ventricular drain placement

External ventricular drain (EVD) placement is a routine but life-saving procedure in neurosurgical practice. Placement is most often performed in the emergency department or the intensive care unit using a freehand technique. Despite it being a frequent procedure, the accuracy rate is only around 80%, with complications occurring in up to 40% of cases [55]. Therefore, our consortium developed a HoloLens II application for AR-guided EVD placement, based on our internally developed inside-out IR tracking software [26]. Respecting the established workflow of EVD placement, we designed semi-automated image processing and image-to-patient registration algorithms, so that the AR application provided a smooth, completely mobile and highly accurate navigation setup to assist in EVD placement. This setup was tested in a phantom experiment to examine the impact of AR guidance on the accuracy and learning curve of EVD placement compared with the freehand technique (Figure 2). Sixteen medical students were randomly allocated to either the freehand technique group or the AR-guided group. Both groups were asked to place four EVDs (left and right on two phantom heads). The freehand group had access to pre-operative imaging but had to place the drain freehand without guidance. The AR-guided group used the application on the AR-HMD, which provided an overlay of the virtual anatomical model and surgical plan on top of the phantom head. Next, both groups received training on how to place EVDs and were asked to repeat the same process. In total, 128 EVDs were placed. Both AR guidance and training significantly improved the accuracy of the drain placements compared to untrained freehand placement (Figure 3). The quality of EVD placement, as assessed by the modified Kakarla scale (mKS), was significantly impacted by AR guidance (p = 0.005) but not by training (p = 0.07).
Both AR-guided placements, before and after training (59.4% mKS grade 1 in both cases), were significantly better than the untrained freehand performance (25.0% mKS grade 1). With AR guidance, untrained students performed as well as trained students, indicating that AR guidance not only improved performance but also positively impacted the learning curve [26].

Figure 2.

AR overlay provided for EVD placement: a bullseye marking the entry point at Kocher's point in red (target) and the tracked EVD within a 2-mm threshold in green. Figure adapted from Van Gestel et al. [26].

Figure 3.

Example of EVD placement results in the phantom. To appreciate the results, the post-placement CT scan of the phantom is fused with the original brain MRI on which the trajectories were planned. Panels A and B show freehand performances of a student before and after receiving standardized training, respectively. Panels C and D show AR-guided EVD placements before and after the standardized training, respectively. Figure adapted from Van Gestel et al. [26].

4.2 Craniotomy planning

Careful planning of the skin incision, craniotomy and surgical approach is critical to successfully resect an intracranial lesion. Nowadays, CAN systems are indispensable for intracranial tumor resections. After image-to-patient registration, the surgeon uses a tracked pointer to indicate the entry point and then draws the tumor outline on the patient's skin to serve as a guide for the skin incision and craniotomy. However, especially for deep-seated tumors, interpretation errors can lead to important deviations. Our consortium developed a HoloLens II based application to help plan tumor resection. After image-to-patient registration based on a pointer tracked with the inside-out IR tracking software, the AR-HMD displays the tumor outlines as well as critical structures on top of the patient's skin. We evaluated the accuracy and efficiency of this system in a prospective clinical trial. Surgeons and trainees with varying degrees of experience delineated the tumor outlines on the patient's skin, consecutively using conventional neuronavigation and the AR-HMD (Figure 4). In total, 20 patients were included in the study. We found that AR-guided tumor delineation was deemed superior in 65% of cases, equally good in 30% of cases, and inferior in 5% of cases when compared to our conventional navigation system (Brainlab Curve 2; Brainlab AG, Munich, Germany). Moreover, image-to-patient registration and planning of the surgical approach based on the AR-HMD reduced the required time by 39% [18].
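The orthographic projection of a deep-seated lesion outline onto the skin, as used for delineation, can be sketched as dropping the component of each contour point along the skin-surface normal. A simplified illustration (the plane, normal and coordinates below are hypothetical; a real system projects onto the segmented skin surface rather than a single plane):

```python
import numpy as np

def project_outline(points, plane_point, plane_normal):
    """Orthographically project 3D lesion contour points onto the plane
    through `plane_point` with normal `plane_normal` (e.g. the skin
    surface near the stylus tip)."""
    n = plane_normal / np.linalg.norm(plane_normal)
    d = (points - plane_point) @ n      # signed distance of each point to plane
    return points - np.outer(d, n)      # subtract the normal component

# Hypothetical lesion outline roughly 30 mm below a locally flat skin patch
outline = np.array([[0.0, 0.0, -30.0],
                    [10.0, 0.0, -32.0],
                    [10.0, 10.0, -28.0]])  # millimetres
skin = project_outline(outline,
                       plane_point=np.array([0.0, 0.0, 0.0]),
                       plane_normal=np.array([0.0, 0.0, 1.0]))
```

Projecting along the skin normal rather than along the surgeon's line of sight is one design choice; it keeps the drawn outline independent of head position, which matters most for deep-seated lesions.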

Figure 4.

Overview of the proposed AR-based system. (A) Phantom model with a deep-seated lesion (green), with the outlines projected on the skin (green outline) nearest to the tip of the handheld stylus (red dot). The white line illustrates the trajectory between the stylus' tip and the lesion and is shown solely for illustrative purposes. (B) View from the HoloLens II in a clinical case. The patient's tumor is displayed in AR at the correct position inside the head, together with the outlines orthographically projected onto the skin, allowing the surgeon to delineate the tumor margins and plan the skin incision (black marker). Figure adapted from Van Gestel et al. [18].

4.3 Hip center of rotation

In total hip arthroplasty (THA), restoring hip biomechanics is key to warrant a good functional result. Restoring the original femoral and acetabular center of rotation (COR) is a first important step to restore the original leg length, muscle tension and abductor lever arm [56, 57]. Overall, there is agreement that following THA, the hip rotation center should be positioned within 5 mm of its anatomic location [58, 59]. Achieving that goal in a systematic way remains challenging [60, 61]. As mentioned earlier, the use of navigation has not yet found its way into the daily practice of THA surgery, so in most cases, intra-operatively finding and restoring the original hip rotation center relies on surgeon experience. Our consortium developed an application on the HoloLens II to determine and render in AR the functional center of rotation (FCOR) of a phantom hip joint consisting of 20 cadaveric femurs and 3D printed acetabular cup analogues (Figure 5). Both the femurs and the acetabular cups were equipped with a tracker. This hip phantom was CT scanned and segmented in 3D Slicer to obtain the ground-truth centers of rotation of the femoral heads and the cups. Next, two observers rotated each of the 20 cadaver femurs twice in its matching 3D printed cup, producing 80 measurements. Based on the displacement of the femoral tracker relative to the acetabular tracker, the inside-out IR tracking algorithm collected a point cloud (Figure 6). Through a pivot-fitting algorithm, the FCOR was then determined from this point cloud and visualized in AR on top of the hip phantom. In our phantom trial, determination of the FCOR through the proposed AR method resulted in an absolute error of 2.9 ± 1.4 mm and 2.9 ± 1.2 mm for the acetabular cup and femoral head, respectively. This FCOR visualized in AR could be used by the surgeon as a guide for the femoral neck cut, broaching and prosthetic stem insertion. On the acetabular side, the AR visualization could help guide the reaming depth and cup insertion [62].
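Pivot fitting of this kind typically reduces to fitting a sphere to the tracker point cloud: as the femur pivots in the cup, the femoral tracker stays at a fixed distance from the center of rotation. A minimal sketch using an algebraic least-squares sphere fit on synthetic, noise-free data (the exact algorithm and numbers used in our system are not specified here; this is an illustration of the principle):

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit. From |p - c|^2 = r^2 we get the
    linear system |p|^2 = 2 p.c + (r^2 - |c|^2), solved for c and r."""
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    c = x[:3]
    r = np.sqrt(x[3] + c @ c)
    return c, r

# Synthetic point cloud: femoral tracker positions on a sphere around a
# known center of rotation (units: millimetres, values hypothetical)
rng = np.random.default_rng(1)
c_true, r_true = np.array([10.0, -5.0, 120.0]), 80.0
dirs = rng.normal(size=(200, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
cloud = c_true + r_true * dirs

c_est, r_est = fit_sphere(cloud)   # estimated FCOR and tracker radius
```

With real measurements the points are noisy and cover only part of the sphere (the hip's range of motion), so the achievable accuracy depends strongly on the angular spread of the collected point cloud.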

Figure 5.

Hip phantom consisting of dried human cadaveric femurs and 3D printed acetabular cup analogues. Both the femurs and acetabular cups are equipped with an IR reflective tracker/marker.

Figure 6.

Point clouds generated by the IR tracking software based on the displacement of the femoral tracker relative to the acetabular tracker. Two observers rotated each femur twice in its matching cup, producing 4 measurements per hip joint.


5. Future perspectives for AR in neurosurgery and orthopedic surgery

AR-based navigation is a hot topic in the scientific literature, with a rapidly growing number of new systems and use cases. However, its application in neurosurgery and orthopedic surgery is still in its infancy and will require further refinement and validation before it can be widely adopted during routine surgical procedures. For AR-based navigation to break through in clinical practice, some important issues should be addressed. First, user-friendliness plays an important role: AR-based navigation must be efficiently embedded into the existing, well-established surgical workflow of the procedure at hand. Secondly, specific attention must be paid to the desired image-to-patient registration and tracking technologies for each use case. Ideally, we will evolve from the periodic rigid registration techniques used today to multisensory, automatic, markerless, non-rigid registration techniques. In this way, the information from multiple sensors could be combined and deformation of the patient's anatomy could be taken into account without blind spots, ensuring that AR guidance remains accurate throughout the entire surgical intervention. Thirdly, we should focus on the development of dedicated surgical headsets with short working distances and optimal visualizations without delays, to reduce visual fatigue. In addition, dynamic user interfaces should be developed that enable users to activate suitable AR information on demand in different phases of a workflow and minimize the interference of AR information with the perception of the real surgical situation [63].

In our opinion, the introduction of artificial intelligence and deep learning methods could facilitate the implementation of these suggestions and accelerate the evolution from conventional navigation systems towards AR navigation. The development of smart systems that select the most suitable data from a multi-sensor tracking stream and adjust registration and navigation in real time would represent a groundbreaking improvement. In a second step, such smart systems could also follow the surgical procedure they are guiding and automatically display and adapt the desired navigation information. Finally, smart AR-based navigation could be combined with sensing technologies to provide direct, personalized feedback to surgical tools when, for example, the surgeon is about to commit a critical error. Overall, with further technological innovation and clinical validation, AR-based surgical navigation has the potential to become an indispensable, time-saving, risk-reducing and accuracy-improving technology in surgery.
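To make the idea of selecting the most suitable data from a multi-sensor tracking stream concrete, the toy sketch below fuses concurrent 3D position estimates from several trackers by their reported confidence and rejects estimates that deviate far from the consensus (e.g. when line of sight is partially lost). The sensor setup, confidence values and millimeter threshold are purely illustrative assumptions, not a description of any existing system.

```python
import numpy as np

def fuse_tracker_estimates(estimates, residual_threshold=10.0):
    """Confidence-weighted fusion of concurrent 3D position estimates.

    `estimates` is a list of (position, confidence) pairs, one per sensor
    (e.g. an external IR camera and a headset's inside-out tracking).
    Estimates lying more than `residual_threshold` (mm, illustrative)
    from the confidence-weighted mean are discarded before re-fusing.
    """
    pos = np.array([p for p, _ in estimates], dtype=float)
    w = np.array([c for _, c in estimates], dtype=float)
    fused = np.average(pos, axis=0, weights=w)
    # Reject sensors whose estimate deviates too much from the consensus
    keep = np.linalg.norm(pos - fused, axis=1) <= residual_threshold
    if keep.any() and not keep.all():
        fused = np.average(pos[keep], axis=0, weights=w[keep])
    return fused
```

A real-time system would of course operate on full 6-DOF poses and use filtering (e.g. Kalman-style estimators) rather than this single-shot average, but the principle of weighting and rejecting sensor data is the same.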


6. Conclusion

In this chapter, we highlighted different AR technologies currently in development for neurosurgery and orthopedic surgery. Further in vitro and in vivo studies, followed by prospective randomized clinical trials, are needed to refine these systems and assess their efficacy in daily practice. Nevertheless, we believe that at the current rate of technological development, AR-based navigation can become an indispensable part of surgical navigation within a few years.


Conflict of interest

The authors declare no conflict of interest.

References

1. Härtl R, Lam KS, Wang J, Korge A, Kandziora F, Audigé L. Worldwide survey on the use of navigation in spine surgery. World Neurosurgery. 2013;79(1):162-172
2. Ivan ME et al. Augmented reality head-mounted display-based incision planning in cranial neurosurgery: A prospective pilot study. Neurosurgical Focus. 2021;51(2):E3
3. Krishnan R, Hermann E, Wolff R, Zimmermann M, Seifert V, Raabe A. Automated fiducial marker detection for patient registration in image-guided neurosurgery. Computer Aided Surgery. 2003;8(1):17-23
4. Léger É, Drouin S, Collins DL, Popa T, Kersten-Oertel M. Quantifying attention shifts in augmented reality image-guided neurosurgery. Healthcare Technology Letters. 2017;4(5):188-192
5. Liu A et al. Clinical accuracy and initial experience with augmented reality-assisted pedicle screw placement: The first 205 screws. Journal of Neurosurgery: Spine. 2021:1-7
6. Sorriento A et al. Optical and electromagnetic tracking systems for biomedical applications: A critical review on potentialities and limitations. IEEE Reviews in Biomedical Engineering. 2020;13:212-232
7. Hersh A et al. Augmented reality in spine surgery: A narrative review. HSS Journal. 2021;17(3):351-358
8. Molina CA, Sciubba DM, Greenberg JK, Khan M, Witham T. Clinical accuracy, technical precision, and workflow of the first in-human use of an augmented-reality head-mounted display stereotactic navigation system for spine surgery. Operative Neurosurgery (Hagerstown). 2021;20(3):300-309
9. Molina CA et al. A cadaveric precision and accuracy analysis of augmented reality-mediated percutaneous pedicle implant insertion. Journal of Neurosurgery: Spine. 2020;34(2):316-324
10. Luebbers H-T et al. Comparison of different registration methods for surgical navigation in cranio-maxillofacial surgery. Journal of Cranio-Maxillo-Facial Surgery. 2008;36(2):109-116
11. Hong J, Hashizume M. An effective point-based registration tool for surgical navigation. Surgical Endoscopy. 2010;24(4):944-948
12. Kriechling P, Loucas R, Loucas M, Casari F, Fürnstahl P, Wieser K. Augmented reality through head-mounted display for navigation of baseplate component placement in reverse total shoulder arthroplasty: A cadaveric study. Archives of Orthopaedic and Trauma Surgery. 2023;143(1):169-175
13. Chen X et al. Development of a surgical navigation system based on augmented reality using an optical see-through head-mounted display. Journal of Biomedical Informatics. 2015;55:124-131
14. Wu J-R, Wang M-L, Liu K-C, Hu M-H, Lee P-Y. Real-time advanced spinal surgery via visible patient model and augmented reality system. Computer Methods and Programs in Biomedicine. 2014;113(3):869-881
15. Ma L, Huang T, Wang J, Liao H. Visualization, registration and tracking techniques for augmented reality guided surgery: A review. Physics in Medicine and Biology. 2023;68(4)
16. Li R, Si W, Liao X, Wang Q, Klein R, Heng P-A. Mixed reality based respiratory liver tumor puncture navigation. Computational Visual Media. 2019;5(4):363-374
17. Nguyen NQ et al. An augmented reality system characterization of placement accuracy in neurosurgery. Journal of Clinical Neuroscience. 2020;72:392-396
18. Van Gestel F et al. Neuro-oncological augmented reality planning for intracranial tumor resection. Frontiers in Neurology. 2023;14:1104571
19. Frantz T, Jansen B, Duerinck J, Vandemeulebroucke J. Augmenting Microsoft’s HoloLens with Vuforia tracking for neuronavigation. Healthcare Technology Letters. 2018;5(5):221-225
20. Gibby JT, Swenson SA, Cvetko S, Rao R, Javan R. Head-mounted display augmented reality to guide pedicle screw placement utilizing computed tomography. International Journal of Computer Assisted Radiology and Surgery. 2019;14(3):525-535
21. Alp MS, Dujovny M, Misra M, Charbel FT, Ausman JI. Head registration techniques for image-guided surgery. Neurological Research. 1998;20(1):31-37
22. Widmann G, Stoffner R, Sieb M, Bale R. Target registration and target positioning errors in computer-assisted neurosurgery: Proposal for a standardized reporting of error assessment. International Journal of Medical Robotics. 2009;5(4):355-365
23. Vassallo R, Rankin A, Chen ECS, Peters TM. Hologram stability evaluation for Microsoft HoloLens. In: Proceedings of SPIE 10136, Medical Imaging 2017: Image Perception, Observer Performance, and Technology Assessment. 2017. DOI: 10.1117/12.2255831
24. Pietruski P et al. Supporting fibula free flap harvest with augmented reality: A proof-of-concept study. Laryngoscope. 2020;130(5):1173-1179
25. Meulstee JW et al. Toward holographic-guided surgery. Surgical Innovation. 2019;26(1):86-94
26. Van Gestel F et al. The effect of augmented reality on the accuracy and learning curve of external ventricular drain placement. Neurosurgical Focus. 2021;51(2):E8
27. Skyrman S et al. Augmented reality navigation for cranial biopsy and external ventricular drain insertion. Neurosurgical Focus. 2021;51(2):E7
28. Liebmann F et al. Pedicle screw navigation using surface digitization on the Microsoft HoloLens. International Journal of Computer Assisted Radiology and Surgery. 2019;14(7):1157-1165
29. Farshad M, Fürnstahl P, Spirig JM. First in man in-situ augmented reality pedicle screw navigation. North American Spine Society Journal. 2021;6:100065
30. Farshad M et al. Operator independent reliability of direct augmented reality navigated pedicle screw placement and rod bending. North American Spine Society Journal. 2021;8:100084
31. Felix B et al. Augmented reality spine surgery navigation: Increasing pedicle screw insertion accuracy for both open and minimally invasive spine surgeries. Spine. 2022;47(12):865-872
32. Abe Y et al. A novel 3D guidance system using augmented reality for percutaneous vertebroplasty: Technical note. Journal of Neurosurgery: Spine. 2013;19(4):492-501
33. Viehöfer AF et al. Augmented reality guided osteotomy in hallux valgus correction. BMC Musculoskeletal Disorders. 2020;21(1):438
34. Londei R et al. Intra-operative augmented reality in distal locking. International Journal of Computer Assisted Radiology and Surgery. 2015;10(9):1395-1403
35. Cho HS et al. Augmented reality in bone tumour resection: An experimental study. Bone & Joint Research. 2017;6(3):137-143
36. Hiranaka T et al. Augmented reality: The use of the PicoLinker smart glasses improves wire insertion under fluoroscopy. World Journal of Orthopedics. 2017;8(12):891-894
37. Andress S et al. On-the-fly augmented reality for orthopedic surgery using a multimodal fiducial. Journal of Medical Imaging (Bellingham). 2018;5(2):021209
38. Molina CA, Dibble CF, Lo S-FL, Witham T, Sciubba DM. Augmented reality-mediated stereotactic navigation for execution of en bloc lumbar spondylectomy osteotomies. Journal of Neurosurgery: Spine. 2021:1-6
39. Elmi-Terander A et al. Augmented reality navigation with intraoperative 3D imaging vs fluoroscopy-assisted free-hand surgery for spine fixation surgery: A matched-control study comparing accuracy. Scientific Reports. 2020;10(1):707
40. Elmi-Terander A et al. Pedicle screw placement using augmented reality surgical navigation with intraoperative 3D imaging: A first in-human prospective cohort study. Spine. 2019;44(7):517-525
41. Dennler C et al. Augmented reality in the operating room: A clinical feasibility study. BMC Musculoskeletal Disorders. 2021;22(1):451
42. Tsukada S, Ogawa H, Hirasawa N, Nishino M, Aoyama H, Kurosaka K. Augmented reality- vs accelerometer-based portable navigation system to improve the accuracy of acetabular cup placement during total hip arthroplasty in the lateral decubitus position. The Journal of Arthroplasty. 2022;37(3):488-494
43. Tsukada S, Ogawa H, Nishino M, Kurosaka K, Hirasawa N. Augmented reality-assisted femoral bone resection in total knee arthroplasty. JB JS Open Access. 2021;6(3). DOI: 10.2106/JBJS.OA.21.00001
44. Kurosaka K, Ogawa H, Hirasawa N, Saito M, Nakayama T, Tsukada S. Does augmented reality-based portable navigation improve the accuracy of cup placement in THA compared with accelerometer-based portable navigation? A randomized controlled trial. Clinical Orthopaedics and Related Research. 2023;481(8):1515-1523
45. Kiarostami P et al. Augmented reality-guided periacetabular osteotomy: Proof of concept. Journal of Orthopaedic Surgery and Research. 2020;15(1):540
46. Chytas D, Malahias M-A, Nikolaou VS. Augmented reality in orthopedics: Current state and future directions. Frontiers in Surgery. 2019;6:38
47. Unberath M et al. Augmented reality-based feedback for technician-in-the-loop C-arm repositioning. Healthcare Technology Letters. 2018;5(5):143-147
48. Hu M-H, Chiang C-C, Wang M-L, Wu N-Y, Lee P-Y. Clinical feasibility of the augmented reality computer-assisted spine surgery system for percutaneous vertebroplasty. European Spine Journal. 2020;29(7):1590-1596
49. Iop A, El-Hajj VG, Gharios M, de Giorgio A, Monetti FM, Edström E, et al. Extended reality in neurosurgical education: A systematic review. Sensors. 2022;22(16):6067. DOI: 10.3390/s22166067
50. Logishetty K, Western L, Morgan R, Iranpour F, Cobb JP, Auvinet E. Can an augmented reality headset improve accuracy of acetabular cup orientation in simulated THA? A randomized trial. Clinical Orthopaedics and Related Research. 2019;477(5):1190-1199
51. Ha J et al. Opportunities and challenges of using augmented reality and heads-up display in orthopaedic surgery: A narrative review. Journal of Clinical Orthopaedics and Trauma. 2021;18:209-215
52. Condino S, Carbone M, Piazza R, Ferrari M, Ferrari V. Perceptual limits of optical see-through visors for augmented reality guidance of manual tasks. IEEE Transactions on Biomedical Engineering. 2020;67(2):411-419
53. Ferrari V, Carbone M, Condino S, Cutolo F. Are augmented reality headsets in surgery a dead end? Expert Review of Medical Devices. 2019;16(12):999-1001
54. Fischer M et al. Preclinical usability study of multiple augmented reality concepts for K-wire placement. International Journal of Computer Assisted Radiology and Surgery. 2016;11(6):1007-1014
55. Huyette DR, Turnbow BJ, Kaufman C, Vaslow DF, Whiting BB, Oh MY. Accuracy of the freehand pass technique for ventriculostomy catheter placement: Retrospective assessment using computed tomography scans. Journal of Neurosurgery. 2008;108(1):88-91
56. Scheerlinck T. Cup positioning in total hip arthroplasty. Acta Orthopaedica Belgica. 2014;80(3):336-347
57. Scheerlinck T. Primary hip arthroplasty templating on standard radiographs: A stepwise approach. Acta Orthopaedica Belgica. 2010;76(4):432-442
58. Liebs TR, Nasser L, Herzberg W, Rüther W, Hassenpflug J. The influence of femoral offset on health-related quality of life after total hip replacement. Bone & Joint Journal. 2014;96-B(1):36-42
59. Jolles BM, Zangger P, Leyvraz P-F. Factors predisposing to dislocation after primary total hip arthroplasty: A multivariate analysis. The Journal of Arthroplasty. 2002;17(3):282-288
60. Konyves A, Bannister GC. The importance of leg length discrepancy after total hip arthroplasty. Journal of Bone and Joint Surgery, British Volume. 2005;87(2):155-157
61. Renkawitz T et al. Leg length and offset differences above 5 mm after total hip arthroplasty are associated with altered gait kinematics. Gait & Posture. 2016;49:196-201
62. CARS 2023 - Computer Assisted Radiology and Surgery: Proceedings of the 37th International Congress and Exhibition, Munich, Germany, June 20-23, 2023. International Journal of Computer Assisted Radiology and Surgery. 2023;18(1):1-123
63. Katić D et al. A system for context-aware intraoperative augmented reality in dental implant surgery. International Journal of Computer Assisted Radiology and Surgery. 2015;10(1):101-108
