Open access peer-reviewed chapter

The 3D Operating Room with Unlimited Perspective Change and Remote Support

Written By

Klaudia Proniewska, Damian Dolega-Dolegowski, Radek Kolecki, Magdalena Osial and Agnieszka Pregowska

Submitted: 20 June 2023 Reviewed: 27 June 2023 Published: 21 July 2023

DOI: 10.5772/intechopen.1002252

From the Edited Volume

Applications of Augmented Reality - Current State of the Art

Pierre Boulanger

Abstract

Information and communication technologies combined with extended reality improve diagnostics, medical treatment, and the course of surgical operations. The new generation of devices capable of displaying holographic objects makes it possible to visualize a patient’s internal organs, both anatomical and pathological structures, as interactive 3D objects reconstructed from retrospective 2D images, namely computed tomography (CT) or magnetic resonance imaging (MRI) scans. Remote users can observe ongoing surgery with additional features such as standing in front of the operating table, walking around it, and changing their perspective without disrupting the operating doctors. Moreover, the operation can be recorded and then played back with additional functionalities such as remote pointing and painting, which is important in medical education. The operating surgeon may also remotely consult more experienced operators, and as a consequence, the number of complications and failed procedures can decrease. The approach can also be applied to robotic surgery as a first step toward remote surgery.

Keywords

  • 3D operating room
  • extended reality
  • computer-integrated surgery
  • image-guided surgery
  • medical education

1. Introduction

Extended reality (XR) includes virtual reality (VR), augmented reality (AR), and mixed reality (MR) [1, 2]. It encompasses a spectrum of technologies that combine and/or mirror the real (physical) world with a “digital twin” world, providing the possibility of interaction between the two. Computer-generated images and objects are presented in front of the user’s eyes with head-mounted displays (HMDs), which provide a hands-free view of virtual objects such as text and images [3]. Most commercially available HMDs use an optical see-through (OST) design that lets users perceive the world through a set of optical components. Such displays usually contain optical components like half-mirrors, birdbath optics, free-form prisms, and optical waveguides. Since XR technology enables the superimposition of two-dimensional (2D) and three-dimensional (3D) objects, it has recently been applied in various fields, in particular in medicine, with special emphasis on all types of surgery, including image-guided surgery (IGS) and computer-integrated surgery (CIS), as it has the potential to improve the safety and efficiency of medical procedures [4, 5]. More efficiently planned surgical interventions may result in shorter recovery times and better treatment outcomes. Thus, the transition to less invasive surgery requires visualization techniques that are not limited to a fixed perspective.

The XR-based solution can help surgeons integrate 2D images obtained from ultrasound, magnetic resonance imaging (MRI), or computed tomography (CT) in the DICOM format (Digital Imaging and Communications in Medicine) with the 3D operative view by overlaying virtual objects on the surgical field. It enables the enrichment of the operating field with computer-generated digital images, especially for the visualization of tumors and anatomical structures [6]. In this way, 3D anatomical information can be applied in preoperative planning, the preoperative model can be integrated with the intraoperative scenario, and the surgeon can be guided in real time [7, 8, 9]. The advantage of systems based on XR technology over traditional auxiliary displays is that surgeons do not have to shift their line of sight between the operating scene and the auxiliary display, which significantly reduces operating time. Another possible XR application in surgery is the support, or even replacement, of the surgeon apprenticeship by high-fidelity surgical simulators [10]. However, implementation in the surgeon’s workflow is hampered by a lack of clinically useful application development requirements.

Here, we propose a 3D operating room with unlimited perspective change and remote support, based on Microsoft HMDs (HoloLens 2) and Intel RealSense cameras. The basic idea of the system is shown in Figure 1. The system allows the users (surgeons and/or medical students) to interact with the headset using verbal commands or simple gestures such as hand or eye movements. The proposed approach is completely sterile and can be successfully implemented in clinical procedures. It also enables the operating surgeon to share his field of vision (headset vision) with other users (surgeons or medical students), who can be located anywhere, without disturbing the workflow. Moreover, it is possible to change the position and angle from which the medical procedure is observed, as well as to subsequently play back and analyze the recorded procedure from different perspectives, which is of great importance in the training of future medical staff. We also review available XR-based solutions in surgery and outline the directions of XR technology development in the field.

Figure 1.

The scheme of XR-based support system for surgeons.

2. Extended reality-based technologies in surgery

In recent years, surgery has undergone significant technological advancement, especially in the application of computer vision, as in image-guided surgery and computer-integrated surgery [11]. The development of extended reality allows surgeons to visualize medical data to assist in the execution of both complex and routine medical procedures. It enables the incorporation of the operative field and anatomical landmarks with navigation guidance [12]. Thus, XR-based solutions may improve surgical procedures from the surgeon’s point of view compared with the classical approach. When it comes to XR, surgery can be divided into two application areas, namely, preoperative planning and intraoperative support [4]. However, such solutions must be proven safe and reliable before they can be applied in operating rooms.

2.1 Preoperative planning

Preoperative planning is a crucial part of all types of surgery. Factors such as medical history, surgical past, possible comorbidities, medications, and physical condition must be taken into account [13]. Extended reality, which spans virtual, augmented, and mixed realities, enables 3D visualization and can therefore be used in preoperative planning to provide insight into the subject’s anatomy. The construction of the 3D preoperative model requires processing of the CT and/or MRI scan images. These scan images are segmented, mostly manually but also automatically, for example, using algorithms based on artificial intelligence (AI) [14, 15]. On their basis, a three-dimensional image of the organ and its abnormalities, arteries, veins, and other anatomical structures is developed and then rendered. The final phase is exporting the 3D model to HMDs.
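As an illustration of this pipeline (a minimal sketch, not the authors’ implementation), the Python snippet below turns a DICOM series into a surface mesh that an HMD application could load. The library choices (SimpleITK, scikit-image, trimesh), the directory name, the intensity threshold, and the output file name are all assumptions.

```python
# Illustrative sketch of the CT/MRI -> 3D model pipeline described above.
# Library choices, paths, and the threshold are assumptions.
import numpy as np
import SimpleITK as sitk
from skimage import measure
import trimesh

# Load a DICOM series as a 3D volume (directory name is hypothetical).
reader = sitk.ImageSeriesReader()
reader.SetFileNames(reader.GetGDCMSeriesFileNames("ct_series/"))
volume = sitk.GetArrayFromImage(reader.Execute())  # shape: (slices, rows, cols)

# Stand-in segmentation: a simple intensity threshold (e.g., bone-like values);
# in practice this step is done manually or with AI-based algorithms.
mask = (volume > 300).astype(np.uint8)

# Extract a triangular surface with marching cubes and export it so a
# 3D engine (Unity/Unreal) can import it for display on an HMD.
verts, faces, normals, _ = measure.marching_cubes(mask, level=0.5)
mesh = trimesh.Trimesh(vertices=verts, faces=faces, vertex_normals=normals)
mesh.export("organ_model.stl")
```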

Initial results from the use of XR have been reported for virtually every branch of surgery. In particular, preoperative 3D visualization is profitable in plastic and craniomaxillofacial surgery, where the expected result very often determines the patient’s decision about surgery [16, 17]. The VR-based solution PulmoVR [18, 19] helped correct the assessed lung-tumor placement in 52% of cases and, in 10% of cases, even contributed to preserving the patient’s lung despite prior indications for lung removal. PulmoVR creates a digital twin of the lung from CT-scan images in front of the surgeon’s eyes. Structures such as airways, arteries, veins, and lung segments are visualized, and the system can display exactly those parts of the patient’s lungs that the doctor is interested in. Another XR-based solution, EchoPixel, uses 3D displays such as 3D TVs instead of HMDs and allows visualization of arteries in patients with pulmonary atresia.

Other applications of XR in the medical sector concern patient-doctor communication. This is especially important in preoperative planning, as it enables the patient to better understand the future treatment (the essence of the operation and its course) and the possible complications that may occur after the procedure, which affects the patient’s trust in the doctor. Many procedures do not end with the resection of the diseased tissue; very often physical and psychological rehabilitation is also required. Raising awareness is also crucial for the future effectiveness of targeted therapy, and here XR-based technologies offer a huge opportunity [20].

2.2 Intraoperative support

Trauma surgeons are often faced with complex situations that must be tracked with X-ray procedures. Improved 3D imaging would both increase the efficiency of the medical procedure and reduce the medical team’s exposure time to X-rays [12, 21]. Thus, extended reality has huge potential to become an intraoperative support for surgeons through the visualization and superimposition of CT or MRI image scans, collected before surgery, on the patient, thus improving navigation during surgery [22]. Intraoperatively, 3D models/objects of the desired anatomical structures can be displayed in HMDs [23]. To visualize the surgeon’s field of view, a camera is needed to accurately monitor the distance to and angle of the object of interest. Some proposed solutions also include accelerometers, optical sensors, GPS, and gyroscopes, as well as navigation systems [24]. One can say that XR-based technologies are the interface between surgeons and computer-generated objects based on the subjects’ medical documentation.

Intraoperative navigation has become an essential part of complex medical procedures. Such navigation can use 3D objects of the anatomical region of interest, obtained from the patient’s previous medical records, during the procedure. For example, in ref. [25], an augmented reality-assisted navigation system (ARAN) was shown. It enables real object positioning according to the information contained in the CT image scans. It turned out that XR-based approaches enable better clinical outcomes in knee arthroplasty. XR systems can also enable effective navigation during more demanding operations, including heart and brain surgeries, where spatial comprehension and often the precise localization of structures (such as deep-seated tumors) are required [26, 27, 28]. The AR-supported surgical navigation platform DEX-Ray was proposed in ref. [26]. It enables the visualization of the tumor and surrounding structures and of the venous anatomy, including critical draining veins. The system also assisted in determining the optimal scalp flap and bone window for surgical access by selectively rendering the scalp and bones transparent. It was tested during the resection of meningiomas in the falcine, convexity, and parasagittal regions [29]. In ref. [30], the DEX-Ray system was combined with a mixed reality device, the Microsoft HoloLens.

Moreover, minimally invasive surgery (MIS) technology has been developed to limit access-wound injuries and reduce the incidence of postoperative complications [31]. An interesting proposition is also to combine XR technologies with robotic-assisted surgery (RAS) [32, 33]. In this case, XR is responsible for the user interfaces, which reduce the need to look away from the operative field and increase the situational awareness of the user. Another example is the application of augmented reality to enhance endoscopic video during in-vivo robot-assisted radical prostatectomy (RARP) [34]. The solution enables overlaying a virtual 3D model of the patient’s prostate on its 2D counterpart in real time.

2.3 Medical education

Extended reality is also increasingly used in education, especially medical education. One example is the Stanford Virtual Heart Project, in which doctors use VR to visualize and understand congenital heart defects [The Stanford Virtual Heart—Stanford Medicine Children’s Health (stanfordchildrens.org)]. It visualizes the normal and abnormal anatomy of the heart. It was first aimed at families of children with heart defects but later spread to students as well. It allows users to see the anatomy of the heart and the blood flow inside it, and to observe how a defect interferes with the proper functioning of the organ. The virtual heart is also used to visualize the procedures that pediatric heart surgeons conduct to repair a malfunctioning heart. Another solution, developed by Case Western Reserve University and the Cleveland Clinic, is HoloAnatomy, an application tailored to Microsoft HoloLens devices for learning human anatomy (https://engineering.case.edu/HoloAnatomy-honors). The effectiveness of HoloAnatomy was analyzed during medical students’ courses [35, 36] [HoloAnatomy® and MR in the Jagiellonian University Medical College (JUMC) https://mrame.cm-uj.krakow.pl/], see Figure 2. HoloAnatomy® Suite takes advantage of these technical benefits and provides students with an opportunity to learn anatomy in a completely new way. The heart and soul of every HoloAnatomy® lesson is a holographic slideshow. It consists of accurate, three-dimensional models with a controlled number of anatomical structures, and other educational materials can be added to the slideshow. Importantly, the teacher and students see the model in the same place, which enables the tutor to point at or magnify any structure of the model he/she wants to show. All things considered, HoloAnatomy® not only allows students to better understand anatomy, especially the topographical relations between anatomical structures, but also maintains the ability to communicate with the tutor and interact with the model. This course aimed to prepare JUMC students to use the latest diagnostic technology, the so-called extended reality, thanks to which they will achieve unique competencies valued in the labor market. In addition, the aim was to improve competencies and professional qualifications through participation in specialized training and study visits tailored to the needs of students. Training and visits are addressed to JUMC students. During the semester program, students gain new competencies thanks to the synergy of several activities:

  1. Specialized training in medical data visualization methods using XR technology;

  2. Access to audiovisual materials in the field of techniques and methods of using XR (e.g., in cardiology, neurology, etc.);

  3. Participating in study visits to employers using pictorial tools for visualizing medical data, e.g., XR.

Figure 2.

XR support in medical courses: (a) the view of a person without an HMD, (b) the view of the HMD users, and (c) conducting study visits for students of Collegium Medicum of the Jagiellonian University. Students have the opportunity to familiarize themselves with the visualization of medical data using extended reality, i.e., advanced imaging as a tool used to optimize pretreatment planning and intraprocedural monitoring in clinical environments.

Students qualified for the program gained access to the latest technology (software and hardware). The training was conducted by highly qualified staff with experience in both medical diagnostics and augmented reality technology used in medicine. The JUMC didactic staff developed syllabuses and training scenarios and will evaluate the competencies of students qualified for the project. Thanks to the availability of audiovisual materials, students could additionally improve their skills. Study visits of students to employers (hospitals, diagnostic imaging laboratories, etc.) located in Cracow, Poland, were an integral part of the program. Participation in the project will improve the students’ competencies and professional qualifications (gaining new knowledge and skills and using them in practice), group work, and problem-solving skills (group cooperation, data analysis).

Preoperative virtual exercises are important for the effective training of medical staff. They shorten the training time, improve its results, and reduce costs. XR not only visualizes anatomical structures in a more realistic and accessible way but also engages the student more than books and autopsies [37]. Thanks to XR applications, users can undergo training in a time frame adapted to their needs, without the stress related to the possibility of making a mistake resulting in permanent damage to a patient’s health or even their death. Training also proceeds without restrictions related to the subject’s time under anesthesia, the consumption of blood, plasma, and other substances necessary during an operation, the costs of the operating room, or the participation of other staff. The main XR advantage is the ability to make mistakes and correct them with helpful hints. An interesting proposal is also the implementation of XR technology for the medical exercises of deep space mission crews, who have to deal with various health disorders in space [38].

Long-term comparative observations have also been carried out between medical students using XR technology for learning and students learning traditionally [39]. It turned out that the first group of students showed greater determination in pursuing a career and achieved better grades. In ref. [40], it was shown that XR-based technologies have a positive influence on medical students by increasing topic interest, focus, and motivation.

2.4 Relation with patients: mutual understanding and anxiety management

However, not only doctors, future doctors, and other medical staff can benefit from the implementation of XR technology in practice. Sometimes it is very difficult to find common ground for communication between the patient and the doctor [41]. An attractive and, as far as possible, realistic visualization of a medical procedure can effectively move mutual communication to a higher level of understanding. Thus, XR can make it easier for the patient to assimilate and understand the information provided by the doctor regarding the disease itself and the process of its treatment. It contributes to reducing the anxiety connected with the medical procedure at all stages, namely, preoperative, intraoperative, and postoperative [42, 43]. XR has been shown to reduce stress in patients but does not affect objective measurements of patients’ physiological status [44].

3. Three-dimensional operation room

In this work, a sterile XR-based support system for surgeons in the operating room was designed and implemented. The proposed solution is presented in Figures 1 and 3. The operating surgeon may share his headset field of vision with other users, who may be located anywhere. Users can change the position and angle from which the medical procedure is observed, as well as subsequently play back and analyze the recorded procedure from different perspectives, without impact on the work of the operating surgeon, as presented in Figure 4. Intel RealSense cameras were applied to provide XR streaming from 3D cameras to the HoloLens 2 glasses.

Figure 3.

A set-up environment for the 3D operating room with unlimited perspective change and remote support. The operating room is captured by a 3D camera comprising infrared, depth, and color sensors. The captured three-dimensional images can be transferred to extended reality HMDs.

Figure 4.

Three-dimensional visualization in surgery involves the use of advanced imaging techniques and intraoperative imaging to create detailed 3D representations of the surgical field, e.g., 3D ultrasound with real-time data transfer to XR. Surgeons can then view these images on specialized displays or through extended reality headsets, enhancing their depth perception and understanding of anatomical structures.

The XR streaming, which is shown in Figures 3–5, is carried out as follows:

  1. Step 1. Gathering of 3D data by Intel RealSense cameras. The cameras require a PC connection; to reduce the size of the system and avoid mobility issues (fewer cables, with only a power source such as a power bank required), the cameras are in our case connected to an Nvidia Jetson microcomputer. When the cameras capture data for 3D processing, they pass it to the Nvidia device, which collects the data and performs basic transformation and smoothing for future processing. Data are stored locally on a connected SSD drive. When the recording and processing of data are finished, the data are sent to local Network Attached Storage (NAS). Note that one camera cannot cover the entire space/room; 2–3 camera-microcomputer pairs are needed, each recording at the same time from a different position and place (different perspectives). As a result, several recordings are obtained (a minimal capture-and-merge sketch is shown after this list).

  2. Step 2. Automatic combination of the recordings registered by the Intel RealSense cameras into one large 3D recording. The recordings registered in Step 1 are combined into one recording. For this purpose, a computationally powerful PC loads the recordings stored on the NAS and merges them into a single large 3D recording of the considered scene.

  3. Step 3. Streaming to XR-based devices. A separate application is needed; it loads the previously created recording and renders a 3D view from it. This view is then streamed to XR-based HMDs, in our case HoloLens 2. Because of the large amount of data and the limited performance of HMDs, the entire 3D rendering process needs to be done on an external PC and streamed over WiFi to the HoloLens 2 glasses. At the same time, while streaming, the HoloLens 2 glasses send their current position and movements in space back to the PC, allowing the computer to render and stream the next view for the user.
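The sketch below illustrates Steps 1 and 2 under stated assumptions: it uses the pyrealsense2 SDK to record aligned depth and color frames from a single camera, and Open3D’s ICP registration to merge two point clouds captured from different viewpoints. Stream resolutions, file names, and the ICP search radius are assumptions; the authors’ Jetson/NAS pipeline is not reproduced here.

```python
# Minimal sketch of Steps 1-2, assuming pyrealsense2 and Open3D.
import numpy as np
import pyrealsense2 as rs
import open3d as o3d

def capture_frames(n_frames=300, out_prefix="cam0"):
    """Step 1: record aligned depth + color frames from one RealSense camera."""
    pipeline = rs.pipeline()
    config = rs.config()
    config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
    config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
    pipeline.start(config)
    align = rs.align(rs.stream.color)  # map depth pixels onto the color frame
    try:
        for i in range(n_frames):
            frames = align.process(pipeline.wait_for_frames())
            depth = np.asanyarray(frames.get_depth_frame().get_data())
            color = np.asanyarray(frames.get_color_frame().get_data())
            np.savez(f"{out_prefix}_{i:04d}.npz", depth=depth, color=color)
    finally:
        pipeline.stop()

def merge_clouds(path_a, path_b):
    """Step 2: align cloud B onto cloud A with point-to-point ICP and merge."""
    a = o3d.io.read_point_cloud(path_a)  # clouds previously exported per camera
    b = o3d.io.read_point_cloud(path_b)  # (hypothetical .ply files)
    result = o3d.pipelines.registration.registration_icp(
        b, a, max_correspondence_distance=0.02,  # 2 cm search radius (assumed)
        init=np.eye(4),
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
    b.transform(result.transformation)  # move B into A's coordinate frame
    return a + b  # Open3D concatenates point clouds with "+"

if __name__ == "__main__":
    capture_frames()
    merged = merge_clouds("cam0.ply", "cam1.ply")
    o3d.io.write_point_cloud("scene.ply", merged)
```

In a production setup, pairwise ICP would likely be replaced by a calibrated multi-camera registration, since ICP alone needs a reasonable initial alignment between the viewpoints.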

Figure 5.

In a traditional operating room, surgeons typically rely on two-dimensional displays to visualize the surgical site. However, with the advent of advanced imaging technologies and XR, 3D operating rooms have emerged to enhance surgical precision and improve patient outcomes.

Thus, remote support in surgery involves the use of telemedicine technologies to enable remote collaboration and consultation. Surgeons can connect with experts or specialists who are not physically present in the operating room, allowing for real-time guidance, knowledge sharing, and collaboration, see Figures 6–8.

Figure 6.

Adjustable perspective refers to the ability to change the viewing angle or vantage point within a surgical environment. This can be achieved through assisted-surgery systems or advanced imaging technologies that give surgeons an extended view of the surgical field.

Figure 7.

Remote 3D medical consultation refers to the practice of utilizing advanced technologies to provide medical consultations and expertise remotely, with the added benefit of three-dimensional visualization.

Figure 8.

Healthcare professionals can use video conferencing tools to communicate with healthcare providers remotely. In this case extended reality overlays virtual elements onto the real-world environment, providing an enhanced view of the medical data. In the context of remote consultations, XR can be used to display three-dimensional medical images, such as CT image scans or MRI data, directly onto the patient’s body or relevant objects. This allows for a more accurate assessment of the medical condition and facilitates real-time discussions between healthcare professionals.

4. Extended reality-based application for surgeons

Designing XR applications is fundamentally different from designing applications intended for the flat screens of computers or tablets, because the user interaction is different and extended reality gives more control over processes and their design. The designed application must take into account the way the user communicates with HMDs. For example, HoloLens 2 glasses work with human senses, and human preferences vary from person to person. Since HoloLens 2 enables control by gestures, eye movements, or voice commands, this must be taken into account during the design process. Visual control is still underestimated, and the combination of voice, hand gestures, and gaze can create incredibly fluid experiences, with contextual menus appearing and disappearing as the user looks at something significant. This will become even truer when eye tracking becomes the standard in this area. A big challenge in XR-based application design is 3D user interfaces: developing a 3D graphical interface that engages the user visually and has emotional significance is an important part of the design process, especially in medical applications. The design and implementation of the XR-based application, customized for the Microsoft HoloLens 2 glasses to display the large 3D recording, proceed as follows:

  1. Step 1. Segmentation of DICOM data (Figure 9). Every CT or MRI scan contains a lot of unneeded data, which slows down device performance during the display of the model. Therefore, a segmentation process is required to leave only the important parts of these data. This also allows the creation of separate models for bones, tissue, or even entire organs. It can be achieved with, for example, the open-source 3D Slicer application (https://www.slicer.org/).

  2. Step 2. Validation and error removal (Figure 10). Segmented and saved models also require validation, which can be done using 3D graphics tools like the open-source Blender application (https://www.blender.org/). The main purpose is to eliminate duplicated layers on models or broken parts of the surface; such errors can prevent the model from displaying correctly in the application (a minimal cleanup sketch is shown after this list).

  3. Step 3. Application engine configuration (Figure 11). In the case of HMDs like the Microsoft HoloLens 2, the simplest way to create an application is to use a 3D creation tool (engine) such as Unreal Engine (https://www.unrealengine.com/en-US) or Unity (https://unity.com). Each of them requires a precise configuration allowing the application to achieve the best performance and quality. Several configuration settings can be adjusted to improve the speed of the engine and the number of frames generated while the application runs.

  4. Step 4. Importing models, assigning capabilities, and manipulation (Figure 12). One of the last steps in the creation of a simple application is to import a segmented model and assign capabilities to it. This means that the model displayed by the engine will have actions and reactions. For example, to allow the model to be moved, we need to activate a change of location on it. If the user is to move and manipulate the position of the model, reactions to hand-gesture input must also be added. For example, when a grab gesture is performed, the model should move with the user’s hand until the gesture stops.

  5. Step 5. Packaging the application for the device. The final part is to assign the development certificate and application details and pack the application into a file that any Microsoft HoloLens 2 glasses can simply and quickly install. The packaging process is largely automated but includes the generation of all shadows and graphics of the model, so it can take a long time.
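As a minimal sketch of the Step 2 cleanup, run in Blender’s Python console and assuming the segmented model has already been imported and is the active object (the merge threshold is an assumption):

```python
# Hypothetical Blender cleanup pass for a segmented anatomical mesh.
import bpy

bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
# Merge duplicated vertices ("doubles") left over from segmentation export.
bpy.ops.mesh.remove_doubles(threshold=0.0001)
# Recalculate normals so all faces point outward and display correctly.
bpy.ops.mesh.normals_make_consistent(inside=False)
bpy.ops.object.mode_set(mode='OBJECT')
```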

Figure 9.

The segmentation process performed in 3D Slicer.

Figure 10.

Model validation using Blender.

Figure 11.

Project configuration in Unreal Engine for the Microsoft HoloLens 2 device.

Figure 12.

Importing a model and configuring its data.

After performing the above steps, the installation file is ready and can be distributed and installed on any Microsoft HoloLens 2 device.

5. Case study description

The purpose of the presented case report is to establish a novel approach to the preoperative patient experience and to pre- and intraoperative planning using extended reality 3D visualization of MRI and/or CT DICOM file segmentations. Regarding patient experience, radiology can be complicated for patients to understand and fully appreciate, so we propose that seeing their own 3D hologram, with segmented structures and the lesion highlighted, would reduce stress levels and increase relatability and personal engagement with treatment. Concerning pre- and intraoperative utility, these segmentations visualize and differentiate between portal, venous, and arterial vasculature in one model, highlighting structures and the lesion, all to simplify planning. Because the XR model is projected using see-through Microsoft HoloLens 2 goggles, the surgical team can wear them while seeing both the sterile field and the hologram. Additionally, they can interact with the hologram without exiting or compromising the sterile field.

This XR-based hologram was created from a preoperative MRI image scan in 3D Slicer, an open-source platform for the analysis and visualization of medical images. After the segmentation process presented in Figure 13, which was validated by a staff radiologist for accurate labeling of the lesion and visible vessels, the hologram was presented to the patient and surgical staff via the Microsoft HoloLens 2 the day before surgery. The objectives of this study were two-fold. First, we sought to prove that showing patients a 3D visualization of their radiological scans provides unique benefits that help calm, inform, and engage them with their care. Second, we sought to explore both the utility of this extended reality visualization for preoperative planning and its practicality as an intraoperative reference.

Figure 13.

The segmentation procedure: (a) segmentation and volume rendering, including the medical MRI image scan; the type of view is denoted by color: horizontal (red bar), coronal (green bar), and sagittal (yellow bar); (b) magnified segmented object.

Medical staff reported that the proposed method of visualization reduced the mental effort required to keep track of structures and vasculature between slices, demonstrating utility in the preoperative planning stage. They also commented that because the Microsoft HoloLens 2 goggles do not obstruct the field of view like other XR devices, this technique aids intraoperatively by serving as a reference point that can be readily accessed at any time without exiting or compromising the sterile field. An example of such a medical procedure is shown in Figure 14. The main challenge, however, is the time required to complete a segmentation for visualization. The algorithms applied in 3D Slicer have been fully validated and automate much of the procedure; nevertheless, manual edits are required to complete a segmentation. The algorithms function by tracking voxel intensity between slices and creating a continuous extrapolation in 3D. There will be an associated mastery curve that reduces this time as the individual performing segmentations gains familiarity with 3D Slicer. On the other hand, the XR-based visualization of radiological scan segmentations using the Microsoft HoloLens 2 shows excellent promise in the field of surgery for both the patient and operator experience. Patients can be more engaged with their care, have a greater understanding of their condition, and experience decreased preoperative anxiety. A surgeon can quickly and easily review a segmentation for surgical planning and later actively use it as a reference intraoperatively without leaving the sterile field.
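As a hedged illustration of the voxel-intensity idea described above, the sketch below uses SimpleITK’s region growing rather than 3D Slicer’s own tooling; the seed point, intensity window, and file names are assumptions.

```python
# Region growing by voxel intensity, a simplified stand-in for the
# slice-to-slice extrapolation described above. Values are assumptions.
import SimpleITK as sitk

image = sitk.ReadImage("mri_volume.nrrd")  # hypothetical input volume
# Grow a 3D region from a seed voxel, following similar intensities
# across neighboring slices to form a continuous 3D segment.
segmentation = sitk.ConnectedThreshold(
    image, seedList=[(120, 140, 60)], lower=80, upper=200)
sitk.WriteImage(segmentation, "lesion_mask.nrrd")
```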

Figure 14.

System validation in the 3D operating room with an unobstructed field of view while wearing the Microsoft HoloLens 2: (a) view of the operator viewing the model; (b) comparison of a real organ in the operating room and a digital organ displayed using the XR-based device.

6. Discussion and conclusion

The proposed XR-based solution enables the preview and registration of performed medical procedures without restrictions resulting from camera settings and the recorded perspective, i.e., with an unlimited perspective. It allows surgeons performing a complex medical procedure to be supported by other specialists in the field who are located elsewhere and have much more experience and knowledge of the procedure being performed. From the perspective of the surgeons’ development, this is a huge advantage, because they gain more experience and skills in procedures they do not usually perform. Moreover, they gain access to consultations and support from specialists to whom they previously had no access due to the distance and location of a given medical facility (e.g., smaller cities up to hundreds of kilometers away from large specialist hospitals). Surgeons who are not proficient in more complex procedures can count on the support of more experienced specialists who can help with an unexpected course of the procedure, e.g., when the operated lesion covers a larger area than originally assessed and the question arises of the best way to cut or remove the lesion. Many areas of medicine, including the diagnostics of neurological and psychiatric diseases, rehabilitation after strokes, and advanced research on the human brain, will undoubtedly benefit from the introduction of XR-based solutions. Patients also benefit in the form of opportunities to take up new treatment options.

An important limitation of XR-based system applications is the changes that occur in the human eye as a result of aging, including age-related presbyopia (an inability to focus on close objects that results in blurred vision) and the natural decline of accommodation with age, while the majority of surgeons are middle-aged or older [45]. The latter, related to the eye lens adjusting to distance, is especially important for HMDs, which are near-eye displays. Other limitations of the wide application of XR in medical practice are those of the HMDs themselves, whose availability constraints limit scalability, along with the financial outlays related to their purchase [46]. Further sustainable development of XR technology is needed, balancing its technical parameters with costs, with a special emphasis on experimental validation [47].

Moreover, to provide a realistic user experience, occlusion handling (especially in AR-based solutions) and the proper rendering of objects are key issues [48]. A depth-based approach could extend perception by capturing depth data in real time, for example, representing an industrial scene as a sparse point cloud and converting it to a depth image [49]. Also, the development of 5G/6G edge computing and cloud servers contributes to the improvement of XR technologies [50].

An important research element is connected with human cognition, namely, explaining the relationship between user perception and the use of XR technology [51] by adjusting and analyzing users’ personal experiences, which are achieved through an awareness of the presented context. The challenge is being able to control the user’s perception of XR relative to their perception of the real world [52].

7. Future plans

In the proposed solution, some technical issues remain to be improved, connected with the quality of visualization, the reduction of latencies, lighting defects, and orientation. In the future, we plan to evaluate the usefulness of the proposed system in surgical practice, taking into account depth perception, quality of rendering, and the visibility of anatomical landmarks, text, and objects, using the System Usability Scale (SUS) combined with the nontechnical skills (NOTECHS) rating scale for surgical teams [53, 54] and the Mayo High-Performance Teamwork Scale (MHPTS) [55], which have been proposed for the evaluation of virtual surgical simulations on individual patient samples. The application of the Surg-TLX and other instruments to the evaluation of human cognition is also considered [56, 57].
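For reference, SUS produces a 0-100 score from ten 5-point Likert items [53]; a minimal sketch of the standard scoring rule is below (the example responses are made up).

```python
# Standard SUS scoring (Brooke, 1996 [53]).
def sus_score(responses):
    """responses: ten Likert values in 1..5, in questionnaire order."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses):
        # Odd-numbered items (positive statements) contribute (r - 1);
        # even-numbered items (negative statements) contribute (5 - r).
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5  # scale the 0-40 sum to 0-100

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # e.g., 85.0
```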

Thus, the development of XR technology in medicine should be coordinated with the needs of healthcare in terms of security, precision, and reliability, taking into account the protection of personal data (sensitive information) as well as the development of XR-based devices. Improved XR-based technologies are also needed to enhance 3D visualization. The quality and resolution of 3D imaging technologies will likely improve, providing even more detailed and accurate representations of the surgical field. This could include advancements in imaging modalities, such as higher-resolution CT image scans or real-time 3D ultrasound, enabling surgeons to visualize anatomical structures with exceptional clarity.

Another important aspect is improving the connectivity and communication infrastructure and telecommunication technologies, which contribute to seamless, real-time communication between surgeons, remote experts, and other healthcare professionals. High-speed connections and low-latency transmission are crucial for transmitting large amounts of data, including high-resolution 3D images and video feeds, enabling efficient remote support.

The future also holds the development and implementation of XR-based technologies and artificial intelligence-based algorithms as parts of an integrated system. They can add value as aids in surgical planning, real-time image analysis, and decision support. AI-based tools could help surgeons navigate complex anatomical structures, identify anomalies or critical regions, and provide insights based on vast amounts of medical data. Integrating AI into the 3D operating room with remote support could enhance surgical precision and outcomes.

Acknowledgments

This study was supported by the National Centre for Research and Development (Grant Lider No. LIDER/17/0064/L-11/19/NCBR/2020).

Conflict of interest

The authors declare no conflict of interest.

Notes/thanks/other declarations

This study was approved by the Medical Ethical Committee of the Jagiellonian University Medical College in Cracow, Poland No: 1072.6120.27.2020 and 1072.6120.92.2022.

References

  1. Kwok AOJ, Koh SGM. COVID-19 and extended reality (XR). Current Issues in Tourism. 2021;24:1935-1940. DOI: 10.1080/13683500.2020.1798896
  2. Xi N, Chen J, Gama F, Riar M, Hamari J. The challenges of entering the metaverse: An experiment on the effect of extended reality on workload. Information Systems Frontiers. 2023;25:659-680. DOI: 10.1007/s10796-022-10244-x
  3. Fang W, Chen L, Zhang T, Chen C, Teng Z, Wang L. Head-mounted display augmented reality in manufacturing: A systematic review. Robotics and Computer-Integrated Manufacturing. 2023;83:102567. DOI: 10.1016/j.rcim.2023.102567
  4. Zhang J, Lu V, Khanduja V. The impact of extended reality on surgery: A scoping review. International Orthopaedics. 2023;47:611-621. DOI: 10.1007/s00264-022-05663-z
  5. Dey A, Billinghurst M, Lindeman RW, Swan JE 2nd. A systematic review of 10 years of augmented reality usability studies: 2005 to 2014. Frontiers in Robotics and AI. 2018;5:37. DOI: 10.3389/frobt.2018.00037
  6. Acidi B, Ghallab M, Cotin S, Vibert E, Golse N. Augmented reality in liver surgery. Journal of Visceral Surgery. 2023;160:118-126. DOI: 10.1016/j.jviscsurg.2023.01.008
  7. Wang J, Suenaga H, Liao H, Hoshi K, Yang L, Kobayashi E, et al. Real-time computer-generated integral imaging and 3D image calibration for augmented reality surgical navigation. Computerized Medical Imaging and Graphics. 2015;40:147-159. DOI: 10.1016/j.compmedimag.2014.11.003
  8. Malhotra S, Halabi O, Dakua SP, Padhan J, Paul S, Palliyali W. Augmented reality in surgical navigation: A review of evaluation and validation metrics. Applied Sciences. 2023;13:1629
  9. Gregory TM, Gregory J, Sledge J, Allard R, Mir O. Surgery guided by mixed reality: Presentation of a proof of concept. Acta Orthopaedica. 2018;89:480-483. DOI: 10.1080/17453674.2018.1506974
  10. Lungu AJ, Swinkels W, Claesen L, Tu P, Egger J, Chen X. A review on the applications of virtual reality, augmented reality and mixed reality in surgical simulation: An extension to different kinds of surgery. Expert Review of Medical Devices. 2021;18:47-62. DOI: 10.1080/17434440.2021.1860750
  11. Alotaibi YK, Federico F. The impact of health information technology on patient safety. Saudi Medical Journal. 2017;38:1173-1180. DOI: 10.15537/smj.2017.12.20631
  12. Lex JR, Koucheki R, Toor J, Backstein DJ. Clinical applications of augmented reality in orthopaedic surgery: A comprehensive narrative review. International Orthopaedics. 2023;47:375-391. DOI: 10.1007/s00264-022-05507-w
  13. Flood LM. Complications in otolaryngology–head and neck surgery. In: Bernal-Sprekelsen M, Carrau RL, Dazert S, Dornhoffer JL, Peretti G, Tewfik MA, editors. Thieme. Vol. 127. 2013. pp. 1043-1043. DOI: 10.1017/S0022215113002211
  14. Janssen BV, Theijse R, van Roessel S, de Ruiter R, Berkel A, Huiskens J, et al. Artificial intelligence-based segmentation of residual tumor in histopathology of pancreatic cancer after neoadjuvant treatment. Cancers. 2021;13:5089
  15. Li X, Guo Y, Jiang F, Xu L, Shen F, Jin Z, et al. Multi-task refined boundary-supervision U-Net (MRBSU-Net) for gastrointestinal stromal tumor segmentation in endoscopic ultrasound (EUS) images. IEEE Access. 2020;8:5805-5816. DOI: 10.1109/ACCESS.2019.2963472
  16. Farronato G, Galbiati G, Esposito L, Mortellaro C, Zanoni F, Maspero C. Three-dimensional virtual treatment planning: Presurgical evaluation. The Journal of Craniofacial Surgery. 2018;29:e433-e437. DOI: 10.1097/scs.0000000000004455
  17. Mespreuve M, Waked K, Collard B, De Ranter J, Vanneste F, Hendrickx B. The usefulness of magnetic resonance angiography to analyze the variable arterial facial anatomy in an effort to reduce filler-associated blindness: Anatomical study and visualization through an augmented reality application. Aesthetic Surgery Journal Open Forum. 2021;3:ojab018. DOI: 10.1093/asjof/ojab018
  18. Bakhuis W, Sadeghi AH, Moes I, Maat APWM, Siregar S, Bogers AJJC, et al. Essential surgical plan modifications after virtual reality planning in 50 consecutive segmentectomies. The Annals of Thoracic Surgery. 2023;115:1247-1255. DOI: 10.1016/j.athoracsur.2022.08.037
  19. Pratt P, Ives M, Lawton G, Simmons J, Radev N, Spyropoulou L, et al. Through the HoloLens™ looking glass: Augmented reality for extremity reconstruction surgery using 3D vascular models with perforating vessels. European Radiological Experiment. 2018;2:2. DOI: 10.1186/s41747-017-0033-2
  20. Zhang Y, Lu Y, Li J, Huang B, He X, Xiao R. Exploring the clinical benefits of mixed-reality technology for breast lumpectomy. Mathematical Problems in Engineering. 2023;2023:2919259. DOI: 10.1155/2023/2919259
  21. Ochs BG, Gonser C, Shiozawa T, Badke A, Weise K, Rolauffs B, et al. Computer-assisted periacetabular screw placement: Comparison of different fluoroscopy-based navigation procedures with conventional technique. Injury. 2010;41:1297-1305. DOI: 10.1016/j.injury.2010.07.502
  22. Klopfer T, Notheisen T, Baumgartner H, Schneidmueller D, Giordmaina R, Histing T, et al. Next step trauma and orthopaedic surgery: Integration of augmented reality for reduction and nail implantation of tibial fractures. International Orthopaedics. 2023;47:495-501. DOI: 10.1007/s00264-022-05619-3
  23. Proniewska K, Khokhar AA, Dudek D. Advanced imaging in interventional cardiology: Mixed reality to optimize preprocedural planning and intraprocedural monitoring. Kardiologia Polska. 2021;79:331-335. DOI: 10.33963/kp.15814
  24. Tagaytayan R, Kelemen A, Sik-Lanyi C. Augmented reality in neurosurgery. Archives of Medical Science. 2018;14:572-578. DOI: 10.5114/aoms.2016.58690
  25. Bennett KM, Griffith A, Sasanelli F, Park I, Talbot S. Augmented reality navigation can achieve accurate coronal component alignment during total knee arthroplasty. Cureus. 2023;15:e34607. DOI: 10.7759/cureus.34607
  26. Mongen MA, Willems PWA. Current accuracy of surface matching compared to adhesive markers in patient-to-image registration. Acta Neurochirurgica. 2019;161:865-870. DOI: 10.1007/s00701-019-03867-8
  27. Nguyen NQ, Cardinell J, Ramjist JM, Lai P, Dobashi Y, Guha D, et al. An augmented reality system characterization of placement accuracy in neurosurgery. Journal of Clinical Neuroscience: Official Journal of the Neurosurgical Society of Australasia. 2020;72:392-396. DOI: 10.1016/j.jocn.2019.12.014
  28. Qi Z, Li Y, Xu X, Zhang J, Li F, Gan Z, et al. Holographic mixed-reality neuronavigation with a head-mounted device: Technical feasibility and clinical application. Neurosurgical Focus. 2021;51:E22. DOI: 10.3171/2021.5.Focus21175
  29. Low D, Lee CK, Dip LLT, Ng WH, Ang BT, Ng I. Augmented reality neurosurgical planning and navigation for surgical excision of parasagittal, falcine and convexity meningiomas. British Journal of Neurosurgery. 2010;24:69-74. DOI: 10.3109/02688690903506093
  30. Jain S, Gao Y, Yeo TT, Ngiam KY. Use of mixed reality in neuro-oncology: A single centre experience. Life. 2023;13:398
  31. Ashrafian H, Clancy O, Grover V, Darzi A. The evolution of robotic surgery: Surgical and anaesthetic aspects. British Journal of Anaesthesia. 2017;119:i72-i84. DOI: 10.1093/bja/aex383
  32. Qian L, Wu JY, DiMaio SP, Navab N, Kazanzides P. A review of augmented reality in robotic-assisted surgery. IEEE Transactions on Medical Robotics and Bionics. 2020;2:1-16. DOI: 10.1109/TMRB.2019.2957061
  33. Chen Z, Marzullo A, Alberti D, Lievore E, Fontana M, De Cobelli O, et al. FRSR: Framework for real-time scene reconstruction in robot-assisted minimally invasive surgery. Computers in Biology and Medicine. 2023;163:107121. DOI: 10.1016/j.compbiomed.2023.107121
  34. Tanzi L, Piazzolla P, Porpiglia F, Vezzetti E. Real-time deep learning semantic segmentation during intra-operative surgery for 3D augmented reality assistance. International Journal of Computer Assisted Radiology and Surgery. 2021;16:1435-1445. DOI: 10.1007/s11548-021-02432-y
  35. Pregowska A, Osial M, Dolega-Dolegowski D, Kolecki R, Proniewska K. Information and communication technologies combined with mixed reality as supporting tools in medical education. Electronics. 2022;11:3778
  36. Kolecki R, Pręgowska A, Dąbrowa J, Skuciński J, Pulanecki T, Walecki P, et al. Assessment of the utility of mixed reality in medical education. Translational Research in Anatomy. 2022
  37. Peterson DC, Mlynarczyk GS. Analysis of traditional versus three-dimensional augmented curriculum on anatomical learning outcome measures. Anatomical Sciences Education. 2016;9:529-536. DOI: 10.1002/ase.1612
  38. Burian BK, Ebnali M, Robertson JM, Musson D, Pozner CN, Doyle T, et al. Using extended reality (XR) for medical training and real-time clinical support during deep space missions. Applied Ergonomics. 2023;106:103902. DOI: 10.1016/j.apergo.2022.103902
  39. Gan W, Mok TN, Chen J, She G, Zha Z, Wang H, et al. Researching the application of virtual reality in medical education: One-year follow-up of a randomized trial. BMC Medical Education. 2023;23:3. DOI: 10.1186/s12909-022-03992-6
  40. Kumar A, Srinivasan B, Saudagar AKJ, AlTameem A, Alkhathami M, Alsamani B, et al. Next-gen Mulsemedia: Virtual reality haptic simulator’s impact on medical practitioner for higher education institutions. Electronics. 2023;12:356
  41. Leclercq WK, Keulers BJ, Scheltinga MR, Spauwen PH, van der Wilt GJ. A review of surgical informed consent: Past, present, and future. A quest to help patients make better decisions. World Journal of Surgery. 2010;34:1406-1415. DOI: 10.1007/s00268-010-0542-0
  42. Kyaw BM, Saxena N, Posadzki P, Vseteckova J, Nikolaou CK, George PP, et al. Virtual reality for health professions education: Systematic review and meta-analysis by the digital health education collaboration. Journal of Medical Internet Research. 2019;21:e12959. DOI: 10.2196/12959
  43. Wilkat M, Karnatz N, Schrader F, Schorn L, Lommen J, Parviz A, et al. Usage of object matching algorithms combined with mixed reality for enhanced decision making in orbital reconstruction—a technical note. Journal of Personalized Medicine. 2023;13:922
  44. Eijlers R, Dierckx B, Staals LM, Berghmans JM, van der Schroeff MP, Strabbing EM, et al. Virtual reality exposure before elective day care surgery to reduce anxiety and pain in children: A randomised controlled trial. European Journal of Anaesthesiology. 2019;36:728-737. DOI: 10.1097/eja.0000000000001059
  45. Lee HJ, Drag LL, Bieliauskas LA, Langenecker SA, Graver CJ, O'Neill J, et al. Results from the cognitive changes and retirement among senior surgeons self-report survey. Journal of the American College of Surgeons. 2009;209(5):668-671.e662
  46. Labovitz J, Hubbard C. The use of virtual reality in podiatric medical education. Clinics in Podiatric Medicine and Surgery. 2020;37:409-420. DOI: 10.1016/j.cpm.2019.12.008
  47. Daher M, Ghanimeh J, Otayek J, Ghoul A, Bizdikian A-J, El Abiad R. Augmented reality and shoulder replacement: A state of the art review article. JSES Reviews, Reports, and Techniques. 2023. DOI: 10.1016/j.xrrt.2023.01.008
  48. Fang W, Hong J. Bare-hand gesture occlusion-aware interactive augmented reality assembly. Journal of Manufacturing Systems. 2022;65:169-179. DOI: 10.1016/j.jmsy.2022.09.009
  49. Li W, Wang J, Liu M, Zhao S. Real-time occlusion handling for augmented reality assistance assembly systems with monocular images. Journal of Manufacturing Systems. 2022;62:561-574. DOI: 10.1016/j.jmsy.2022.01.012
  50. Maddikunta PKR, Pham Q-V, Prabadevi B, Deepa N, Dev K, Gadekallu TR, et al. Industry 5.0: A survey on enabling technologies and potential applications. Journal of Industrial Information Integration. 2021;26:100257
  51. Wang Z, Bai X, Zhang S, Billinghurst M, He W, Wang Y, et al. The role of user-centered AR instruction in improving novice spatial cognition in a high-precision procedural task. Advanced Engineering Informatics. 2021;47:101250. DOI: 10.1016/j.aei.2021.101250
  52. Bernard F, Bijlenga PP. Defining anatomic roadmaps for neurosurgery with mixed and augmented reality. World Neurosurgery. 2022;157:233-234. DOI: 10.1016/j.wneu.2021.09.125
  53. Brooke JB. SUS: A ‘quick and dirty’ usability scale. In: Usability Evaluation in Industry. London: Taylor & Francis; 1996
  54. Sevdalis N, Davis R, Koutantji M, Undre S, Darzi A, Vincent CA. Reliability of a revised NOTECHS scale for use in surgical teams. American Journal of Surgery. 2008;196:184-190. DOI: 10.1016/j.amjsurg.2007.08.070
  55. Malec JF, Brown AW, Leibson CL, Flaada JT, Mandrekar JN, Diehl NN, et al. The Mayo classification system for traumatic brain injury severity. Journal of Neurotrauma. 2007;24:1417-1424. DOI: 10.1089/neu.2006.0245
  56. Fischer M, Fuerst B, Lee SC, Fotouhi J, Habert S, Weidert S, et al. Preclinical usability study of multiple augmented reality concepts for K-wire placement. International Journal of Computer Assisted Radiology and Surgery. 2016;11:1007-1014. DOI: 10.1007/s11548-016-1363-x
  57. Brown EJ, Fujimoto K, Blumenkopf B, Kim AS, Kontson KL, Benz HL. Usability assessments for augmented reality head-mounted displays in open surgery and interventional procedures: A systematic review. Multimodal Technologies and Interaction. 2023;7:49
