Open access peer-reviewed article

Metaverse in Ophthalmology: The Convergence of Virtual and Physical Space in Eye Care

David Benet

Oscar J. Pellicer-Valero

This article is part of the Healthcare Information Management section


Article Type: Scoping Review

Date of acceptance: October 2022

Date of publication: November 2022

DOI: 10.5772/dmht.10

Copyright: ©2022 The Author(s), Licensee IntechOpen, License: CC BY 4.0

Table of contents


Introduction
6G and the five must-have categories for the metaverse
Metaverse: challenges and opportunities
Conclusion
Acknowledgments
Declarations of interest

Abstract

We all live in a hybrid world of both online and offline experiences. Especially since the start of the COVID-19 pandemic in 2020, we are now more connected than ever. The aim of the metaverse, a term formed from “meta”, meaning “beyond”, and “verse”, from the word “universe”, is to simplify these means of communication by minimising inconveniences and improving experiences in the physical world. Simultaneously, the multitude of data that is part of our lives is moving us towards an irreversibly digital future.

Data are the raw material that feeds machine learning and artificial intelligence algorithms, which allow us to make decisions based on the analysis of historical events and to predict future behaviour. In addition, 6G, the sixth generation of hyper-speed mobile connectivity, together with new models of cloud computing, will allow for disruptive developments in the economy, machine learning, social analytics, blockchain and health, among many other fields. Digital transformation is already part of our lives, and the health sector and the therapeutic field of ophthalmology are no exceptions.

New technologies based on the metaverse are emerging to improve medical education and training, as well as processes and procedures at all stages of the patient journey, from diagnosis and monitoring to surgical procedures and adherence to medical treatment. We are facing a “virtual life” that is evolving amidst social and ethical challenges. Will the metaverse really allow the virtual and physical space to come together? Will it improve patient healthcare in the field of ophthalmology?

Keywords

  • metaverse

  • artificial intelligence

  • augmented reality

  • mixed telepresence

  • ophthalmology

  • 6G

Introduction

The term “metaverse” was coined by Neal Stephenson in his science fiction book, Snow Crash [1]. It refers to the next generation or “embedded” internet based on a network of interconnected virtual experiences that combine the virtual and physical worlds, providing new ways of working, learning, playing, socialising and creating.

The metaverse has experienced unprecedented growth in recent months, especially since Facebook CEO Mark Zuckerberg revealed his proposal to create a metaverse in July 2021 and then announced, in November 2021, that Facebook would become Meta [2].

The terms virtual reality (VR) and augmented reality (AR) are commonly equated with the metaverse, but the concept is not so simple. The metaverse can be understood more broadly as a three-dimensional (3D) virtual environment that works in the same way as the real world in terms of social, relational and economic activity.

To date, these visual environments based on immersive projection have been growing in popularity. However, they have had relatively little impact on human-computer interaction or on user-to-user communication and collaboration models, as they have been largely confined to healthcare and videogames.

Implementation on a global scale is beginning to see the light of day. The evolution of 6G wireless transmission technology will enable an unprecedented rollout: it will address the system scalability problem of current communication models, which do not efficiently allow collaboration between metaverse portals, as well as the doubts about whether current distributed computing architectures can really support a virtual world. Most of the use cases that 6G addresses correspond to the future scenarios envisioned for the metaverse. These scenarios impose communication performance requirements that exceed 5G across the board, and 6G will have to set the standard for this upgrade.

The COVID-19 pandemic, which began in 2020, forced companies from various industries and sectors around the world to adopt digital technologies and online-enabled forms of communication. Working from home became a necessary solution to keep up with working dynamics in companies, and it has been the main reason why video conferencing platforms, such as Zoom Video Communications, Google Meet and Microsoft Teams, have become popular. However, these platforms do not provide an experience as efficient or as engaging as face-to-face interaction: interaction is limited to real-time audio and video on a screen. A metaverse-based platform would allow a more appealing form of interaction between participants thanks to its immersive virtual environment.

The objective of this article is to point out the importance of the emerging concept of Total Experience (TE) as a future business strategy in all sectors, including healthcare and especially ophthalmology. TE integrates employee experience, customer experience, user experience and, as a novelty, multi-experience across many touchpoints, with the objective of driving greater customer trust, loyalty and advocacy through total experience management. The combination of all these “experiences” has the potential to improve business or health results, thereby accelerating growth.

A study has recently been launched to further explore how to implement the metaverse in ophthalmology by applying holographic construction and emulation, and virtuality-reality integration and interconnection. To put this into practice, the study suggests broadening the concept by adding comprehensive perception to the holographic construction, intelligent processing to the holographic emulation, and quality control to the virtuality-reality integration, as well as human–computer integration, thus realising the maxim “simplification of complex problems, digitalization of simple problems, programming of digital problems, and systematization of programming problems” [3].

This approach can overcome the constraint that internet-based healthcare and telemedicine platforms hardly play an active role in smaller hospitals, especially those in rural villages and towns. Moreover, it will facilitate graded diagnosis and treatment, and contribute to transforming the current handicraft-workshop model, which varies widely between doctors and hospitals of uneven levels, into a modern assembly-line model that meets national and even international standards.

Metaverse applications in the medical field based on VR and AR technologies will enable everyone to use digital avatars for face-to-face communication in the virtual world, providing a 3D interface for display and interaction and enabling us to become immersed in a virtual world of information. In this 3D space, interaction can take place through body movements, language, gestures and gaze. The 3D interface for display and interaction is the fundamental setting in such applications, and the superstructure of the metaverse will undoubtedly undergo revolutionary changes, including a drastic expansion of its application scenarios [3, 4].

From the patient’s perspective, and taking into account the patient journey across all ophthalmological diseases, and retinal diseases in particular, the metaverse will help at every stage, from awareness of eye disease, early detection, diagnosis and prognosis, to monitoring and adherence to treatment in the medium and long term.

6G and the five must-have categories for the metaverse

6G is essential to build the metaverse. 6G is the next-generation advanced mobile communications system that will function as a distributed neural network and provide the communication links to merge the physical world with the virtual world. 6G will mark the start of a new era in all forms of intelligence, everything will be hyper-connected, and it will lay the foundations for the concept of “Intelligence of Everything” [5, 6].

Five main categories of usage are currently defined (figure 1). Among them, eMBB+ (enhanced Mobile BroadBand), URLLC+ (ultra-reliable low-latency communications) and mMTC+ (massive machine-type communications) are extensions and combinations of the usage scenarios defined in 5G, while sensing and AI are two new usage scenarios that will be born in the 6G era. The objective envisioned for URLLC+ is to efficiently support mission-critical scenarios, especially for the Internet of Things (IoT). Given the high-throughput and massive-connectivity requirements of mMTC, URLLC traffic usually coexists with eMBB services in data-intensive cases. In the health sector it is therefore necessary to strike a balance between URLLC and eMBB, maximizing the eMBB data rate while preserving URLLC reliability [7, 8].

Figure 1.

The five must-have categories for the metaverse.

If we look at each category in detail:

eMBB+ [6] is the continued evolution of eMBB. It will enable high data rates and user throughput, as well as a high system capacity that will allow massive connections, enabling extremely immersive experiences and multi-sensory interactions in applications such as AR, VR, mixed reality (MR), extended reality (XR) and telepresence [7].

AR is a technology that combines the real and digital worlds. AR uses computer vision to recognise real-world surfaces and objects through techniques such as object recognition, plane detection, facial recognition and motion tracking, among others. The computer then overlays its own computer-generated data, such as messages, images, sounds and graphics, on these previously recognised planes. Doctors are focusing on new techniques to perform complex surgeries with accuracy, and AR is starting to be used for these procedures, much as robot-aided surgery has been used for some time to execute complex procedures with efficiency and adaptability [9].

In ophthalmology, there are many emerging applications of AR, especially in the area of low vision and for people suffering from retinitis pigmentosa (RP). RP is a disease that causes individuals to lose their peripheral vision, leading to problems in low-light situations, and it cannot be corrected with glasses or lenses. In this case, the patient can use AR glasses that scan the surrounding space to detect objects in front of the user. The glasses then instantly mark these objects in bright colours, which are better perceived by individuals with RP, thus helping them understand the space they are in.
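
To make this idea concrete, the following minimal Python/OpenCV sketch mimics the highlighting step described above. It is purely illustrative and not taken from any commercial device: the function name, thresholds and colours are assumptions, and real AR glasses rely on depth sensing and trained object detectors rather than simple contour detection.

```python
# Illustrative sketch only: a simplified stand-in for the object-highlighting step
# an AR low-vision aid might perform. It finds coarse object boundaries in a camera
# frame and redraws them in a saturated colour that is easier to perceive with
# reduced peripheral vision.
import cv2
import numpy as np

def highlight_objects(frame_bgr: np.ndarray) -> np.ndarray:
    """Return a copy of the camera frame with large contours drawn in bright yellow."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 80, 160)                          # coarse object boundaries
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    big = [c for c in contours if cv2.contourArea(c) > 500]   # ignore texture noise
    overlay = frame_bgr.copy()
    cv2.drawContours(overlay, big, -1, color=(0, 255, 255), thickness=4)
    # blend the bright outlines back onto the original frame
    return cv2.addWeighted(overlay, 0.8, frame_bgr, 0.2, 0)

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)      # a webcam standing in for the glasses' camera
    ok, frame = cap.read()
    if ok:
        cv2.imwrite("highlighted.png", highlight_objects(frame))
    cap.release()
```

In a real head-worn device this processing would run continuously on the live video feed and render the result into the see-through display rather than saving an image.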

VR was first described in science fiction and emerged in real life via an immersive film-viewing cabinet created in the 1950s. VR is a technology that replaces the view of the physical world with a representation of scenes and objects produced by a computer system, making them feel as if they were real. VR provides sensory immersion in a new, virtual world based on both real and artificially generated environments, which can be perceived thanks to VR glasses and accessories (mainly headphones or audio headsets, among others). While technological barriers and a lack of content have so far prevented mass adoption, VR may emerge as the next-generation platform for communication, displacing our need for physical travel and easing the related energy consumption [9].

In the field of health, VR can take the user inside the human body, allowing them to observe and interact with areas that are often inaccessible. Patients can also virtually walk through a VR simulation of their anatomy and pathology and be guided through their surgical plan, giving them a better grasp of their treatments. In ophthalmology, VR is already in use for visual field testing through an oculo-kinetic VR test, in which patients look directly at any new visual stimulus they can see.

MR is a hybrid experience between AR and VR, as it incorporates aspects of the real and digital worlds. Although MR is mainly used to combine both worlds, its most attractive feature is that it enables realistic interaction between users and digital elements. The idea is to generate a 3D model of reality and superimpose virtual information onto it, so that both realities are combined to add content of value for the MR user [9].

In the field of ophthalmology, new 3D models and MR holograms are starting to be used in place of physical eyeball models, as previous models offered rather poor quality and detail, both anatomically and physiologically. Using MR, students, research fellows and ophthalmologists can study the eye more efficiently by immersing themselves, with the use of holograms, in a combined physical and digital world that provides increasingly realistic pedagogical experiences. Soon, training based on real patient cases will be part of this new MR, creating a highly immersive experience, with the obvious advantages that it would bring, as well as greater engagement than face-to-face models or current 2D application-based models.

XR is a new collective term that encompasses all immersive technology and is understood as the complete merging of the virtual and real worlds, creating a fully immersive experience and thus extending our perception of reality. XR is expected to become widespread within the next few years, as it will be a significant component of the metaverse environment. Together, these are the main technologies that will drive the metaverse [10–12].

Finally, telepresence is the technology that allows a person to be “transported” from one physical space to another through a telecommunications network, letting individuals visit and experience places as if they were actually there and giving them a virtual presence.

URLLC+ is the continued evolution of URLLC for critical machine-type communications in Industry 4.0 (robots, drones or unmanned aerial vehicles, and new human-machine interfaces), which will enable real-time interaction between collaborative robots, and massive machine learning and knowledge sharing between robots. It will have unprecedented future applications in healthcare and ophthalmology [7, 8].

mMTC+ represents the continued evolution of mMTC and is characterised by a large number of connected devices with sporadic traffic in smart cities, buildings, transport, manufacturing, agriculture and healthcare. The required data rate is in the low to medium range [7, 8, 11].

Sensing and communication will be integrated into a single system in 6G. This will allow the physical world to be observed and digital avatars/twins to be created in the digital world. This category will enable new usage scenarios including device-based or device-free target localisation, imaging, reconstruction, environmental monitoring, and gesture and activity recognition.

In the area of healthcare, and in particular in specialties where the use of imaging is central (such as oncology and ophthalmology), ultra-high resolution will be essential in imaging applications where the probability of detection and/or diagnosis is the top priority.

The AI scenario will be fundamental in 6G, which will use an architecture that must enable massive machine learning in a distributed and collaborative manner. The “Network AI” concept therefore aims to intelligently connect distributed intelligent agents to enable the large-scale rollout of AI across all industries, where highly efficient transmission, high capacity, native security and low latency should enable real-time use of AI [12].
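
To illustrate what “distributed and collaborative” learning can mean in practice, the toy sketch below applies a generic federated-averaging scheme: several sites each fit a model on their own data and share only the fitted weights, which a coordinator averages. This is a minimal illustration under invented data, not a 6G specification or a particular Network AI design.

```python
# Minimal federated-averaging sketch (illustrative only): each "site" fits a tiny
# linear model on its private data and shares only the fitted weights; a coordinator
# averages them into a global model. No raw (e.g. patient) data ever leaves a site.
import numpy as np

rng = np.random.default_rng(0)

def local_fit(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Ordinary least-squares weights computed from one site's local data."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Three hypothetical sites whose private data follow the same underlying model.
true_w = np.array([0.7, -1.2, 2.0])
sites = []
for _ in range(3):
    X = rng.normal(size=(200, 3))
    y = X @ true_w + rng.normal(scale=0.1, size=200)
    sites.append((X, y))

# Each site trains locally; only the weight vectors are communicated.
local_weights = [local_fit(X, y) for X, y in sites]

# The coordinator averages the shared weights into the global model.
global_w = np.mean(local_weights, axis=0)
print("federated estimate:", np.round(global_w, 3))  # close to [0.7, -1.2, 2.0]
```

In a 6G “Network AI” setting, the interesting questions are how such weight exchanges are scheduled, secured and kept within latency budgets across the network, rather than the averaging arithmetic itself.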

In 6G intelligent context-aware networks, deployment, operation and energy usage will be optimised automatically, based on the information gathered from sensing and localisation, without human intervention. Terahertz (THz) frequency imaging and spectroscopy, for example, will provide real-time, continuous data on the human body through non-intrusive, contact-free and dynamic measurements in digital health applications [12].

In ophthalmology, ophthalmologists make decisions regarding the diagnosis, monitoring and treatment of several ocular diseases based on their clinical characteristics, on tests performed with the multiple devices used in the specialty, and on the noticeable differences between a healthy eye and the different stages of a diseased eye. AI is a growing discipline that could potentially free ophthalmologists from these monotonous tasks [12, 13]. Furthermore, the metaverse would enable new forms of doctor-patient interaction, complementing and facilitating decision making and the patient journey for each ocular disease, as well as new and more efficient ways of training and life-long learning, which would transform the way we live, interact, learn, evolve and work [14–16].

Metaverse: challenges and opportunities

Native AI in 6G will feature a network design and air interface, as well as a machine learning system to implement personalised optimisation and automated operations and maintenance [4]. Each element of the 6G network will natively integrate communication, computation and sensing capabilities, facilitating the evolution of cloud-centric intelligence.

These elements will also provide high-resolution sensing, localisation, imaging and environment reconstruction capabilities to improve communication performance and support a wider range of network service scenarios, laying the foundations for building a smart digital world. Extreme 6G connectivity will provide a data density nearly ten times that of 5G connections, centimetre-level localisation, millimetre-level imaging and high system reliability, thus enabling an exponential acceleration of digital transformation globally [17].

The 6G network will integrate various capabilities, such as communication, sensing, computing and intelligence, and therefore the current network architecture needs to be redefined, as the new network will need to flexibly adapt to tasks such as collaborative sensing and distributed learning for the development of AI applications on a global scale.

The data, as well as the knowledge and intelligence derived from these data, will be the driving force behind the redesign of the 6G network architecture. Another key point is sustainability, which is why the concepts of ecological design and native AI capacity will be introduced, with the aim that 6G improves the overall energy efficiency of the network. The ambition behind 6G is for all areas of life on Earth to be connected in an intelligent, sustainable and efficient manner.

In healthcare, the world’s aging population increasingly presents with vision disturbances, and vision is considered the most important sense for humans. It is estimated that diabetic retinopathy (DR) will affect 300 million individuals by 2025 and 500 million by 2050 [13], with up to 30% of patients with diabetes presenting with some grade of DR, and 10% suffering from DR that threatens their vision. Similarly, age-related macular degeneration (AMD) is the main cause of vision loss in older patients in developed countries; as life expectancy increases, so does the incidence of advanced AMD [13]. Although AI-related research on the eyes is broad, it currently focuses on the detection and stratification of diseases from images of the fundus of the eye or from optical coherence tomography (OCT) using convolutional neural networks (CNN), achieving expert-level performance in many cases [13]. In the future, it will be possible to automatically classify all ocular diseases using an image-based AI system. Furthermore, in a second phase, we will probably be able to complement these systems with additional patient information, including, among other data, demographics, medical records, comorbidities and genetic markers.
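
As a concrete, purely illustrative example of the kind of CNN-based classifier this line of research builds on, the sketch below defines a small convolutional network in PyTorch that maps a fundus image to one of five hypothetical DR severity grades. The architecture, input size and class count are assumptions chosen for brevity; published systems typically use much larger pretrained networks and carefully curated datasets.

```python
# Illustrative sketch only: a tiny CNN for grading retinal fundus images into
# five hypothetical DR severity classes. Not a model from the cited studies.
import torch
import torch.nn as nn

class FundusCNN(nn.Module):
    def __init__(self, num_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),               # global average pooling
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)                       # (batch, 64, 1, 1)
        return self.classifier(x.flatten(1))       # (batch, num_classes) logits

model = FundusCNN()
dummy_batch = torch.randn(4, 3, 224, 224)          # four fake RGB fundus images
logits = model(dummy_batch)
print(logits.shape)                                # torch.Size([4, 5]): one score per grade
```

Training such a model would additionally require a labelled image dataset, a cross-entropy loss and an optimiser; the point here is only the overall shape of an image-to-grade classifier.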

Breaking barriers, getting in early, learning and supporting the growth of the metaverse will improve brand credibility and will build allegiance among the communities growing within the metaverse.

In the metaverse, ophthalmologists’ 3D avatars will have a space in which to collaborate with tools such as digital whiteboards or 3D models of the patient’s eye, and to provide efficient and advanced training in the virtual world. It will also allow face-to-face meetings without any complex conferencing equipment. It will be possible to assess the patient and access the machines used for diagnosis and treatment in the virtual world. Likewise, procedures will be safely tested on digital avatars, allowing errors and vulnerabilities to be identified before they are carried out in a physical environment or in the real world.

The vision described by Samia Rizk is more ambitious and futuristic, although not improbable [11]. Rizk describes the example of a hospital application that creates a digital copy of a specific hospital process, such as inpatient flow; root causes can then be identified and different interventions tested before being used, by carrying out advanced analyses and running millions of potential scenarios, as sketched below.
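
To ground this idea, a minimal sketch of such scenario testing might look like the following toy Monte Carlo model of an inpatient ward, in which two hypothetical interventions are compared against a baseline. The ward size, arrival rate and length-of-stay figures are invented assumptions, not data from the cited work.

```python
# Illustrative sketch only: a toy Monte Carlo "digital copy" of an inpatient ward,
# used to compare hypothetical interventions (extra beds vs. shorter stays) before
# changing anything in the real hospital. All rates are made-up assumptions.
import numpy as np

rng = np.random.default_rng(42)

def overflow_risk(beds: int, mean_stay_days: float, arrivals_per_day: float,
                  n_days: int = 100_000) -> float:
    """Fraction of simulated days on which arrivals exceed the free beds."""
    # The longer the average stay, the more likely each bed is still occupied
    # at the start of a day (crudely capped at 95%).
    p_occupied = min(0.95, mean_stay_days / 7.0)
    occupied = rng.binomial(beds, p_occupied, size=n_days)
    arrivals = rng.poisson(arrivals_per_day, size=n_days)
    return float(np.mean(arrivals > beds - occupied))

print("baseline        :", overflow_risk(beds=20, mean_stay_days=5.0, arrivals_per_day=6))
print("two extra beds  :", overflow_risk(beds=22, mean_stay_days=5.0, arrivals_per_day=6))
print("faster discharge:", overflow_risk(beds=20, mean_stay_days=4.0, arrivals_per_day=6))
```

A production digital twin would model the process in far more detail (admission pathways, staffing, stochastic lengths of stay), but the principle is the same: run many simulated scenarios and compare interventions before touching the real system.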

In the field of ophthalmology, the metaverse makes it possible to envisage virtual units for the detection, diagnosis, referral, treatment and management of disease in the short, medium and long term, covering the possible scenarios of ocular diseases, and of each retinal disease in particular, with the possibility of running potential scenarios to identify the different causes, solutions and the best alternatives available now and in the future.

It is worth highlighting some examples of the advantages that the new solutions provided by the metaverse bring to health, and specifically to ophthalmology, compared with classic approaches, throughout the patient’s journey for each ophthalmological pathology.

Education

In the field of general population education, studies to date indicate that both VR and AR can be effective for eye health education. In glaucoma, for example, patients significantly improved their understanding of their disease and of the importance of eye examinations after using immersive tools such as EyeCU, and these tools also enable scalable remote health education through virtual consultations or remote assessment [18, 19].

For medical professionals, applications of VR and AR in education include training simulators for surgery, ophthalmoscopy, and optometry. The most common simulated surgical procedure is cataract surgery, followed by vitreoretinal procedures [19, 20], along with other corneal and retinal surgeries as well as treatment for glaucoma [21].

Diagnostics

In terms of diagnosis, both AR and VR can help patients understand how vision loss can affect function and quality of life [22], help identify vision-related disability [23], and assess the effect of vision loss and the visual field in functional vision [24].

In addition, VR has been used to help detect visual field deficiencies in glaucoma patients, with results correlating with the Humphrey perimeter [25], and to assess visual function in patients with strabismus and amblyopia [26, 27]. In the field of retina, the feasibility of displaying OCT images in a VR environment with a head-mounted display has also been examined [28]; such systems may provide the next generation of OCT image display, offering a platform for patient engagement, medical education, professional training and telecommunication with AI systems to improve processes and decision making.

Therapy

The application of AR and VR in the therapeutic domain includes surgical planning and low vision therapy, amblyopia therapy, and possibly childhood myopia control.

Using VR simulators, surgeons can perform remote telesurgery, minimally invasive surgery, surgical preplanning, image-guided surgery, and simulation using a surgery console. With the recent launch of 5G networks, network speeds have improved dramatically, supporting the advancement of telesurgery, which in turn could lower healthcare costs, while increasing accessibility and the potential for personalized medicine [29].

Looking one step beyond the forthcoming 6G revolution mentioned above, initial studies are already underway on the seventh generation of cellular wireless communications (7G) networks.

7G will integrate a set of previously disparate technologies, including AI, deep learning, mind reading, big data analytics, the Internet of Everything (IoE) and telepresence, among others. In addition, the core communications network fabric will also transform as many new technologies converge with 7G.

Several application areas are envisaged for 7G, chief among them machine-to-machine communications.

Conclusion

The metaverse could be the response to the exponential development that comes with the digital era. It could provide solutions for a mature healthcare system that cannot address the challenges of the future in a world with an aging global population. The metaverse can change, improve and transform training and medical care, and improve the patient journey for ocular diseases in diagnosis, early detection and the management of chronic and/or degenerative diseases in the medium and long term, by providing innovative and efficient systems for training, medical education, clinical care and personal and professional well-being. It can do so in a collaborative environment that should allow healthcare professionals to better manage their most valuable resource, “time”, in order to focus on their true mission: improving the health of patients.

Acknowledgments

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Declarations of interest

David Benet works at BAYER. This research is a personal project.

References

  1. Stephenson N. Snow crash: A novel. New York: Spectra; 2003.
  2. About Facebook [Internet]. Social Metaverse Company; 2021 [cited 2022 Jan 1]. Available from: https://about.facebook.com.
  3. Yang D, Zhou J, Chen R, Song Y, Song Z, Zhang X, Wang Q, Wang K, Zhou C, Sun J, Zhang L. Expert consensus on the metaverse in medicine. Clinical eHealth. 2022;5:1–9.
  4. Yang DW, Bai CX. The current state and prospects of applying the internet of things in medicine. China Medical News. 2021;36(19):1.
  5. Kaur M, Gupta B. Metaverse: the future of internet and cyber-world. Karaikudi: Insights2techinfo; 2021. 1 p.
  6. Tong W, Zhu P, editors. 6G: the next horizon: from connected people and things to connected intelligence. Cambridge: Cambridge University Press; 2021. doi:10.1017/9781108989817.
  7. Zhu Z, Li X, Chu Z. Three major operating scenarios of 5G: eMBB, mMTC, URLLC. In: Intelligent sensing and communications for internet of everything. Amsterdam: Elsevier; 2022. Chapter 2, p. 15–76. doi:10.1016/B978-0-32-385655-3.00006-0.
  8. Efficient traffic scheduling for coexistence of eMBB and uRLLC in industrial IoT networks. In: IEEE Wireless Communications and Networking Conference (WCNC 2022), Austin, TX, USA; April 2021.
  9. Ai D, Yang J, Fan J, Zhao Y, Song X, Shen J, et al. Augmented reality based real-time subcutaneous vein imaging system. Biomed Opt Express. 2016;7(7):2565–2585.
  10. Khan A, Chui KT, Peraković D. Future scope of AI and machine learning in 2022. Insights2Techinfo; 2021. 1 p.
  11. Rizk SH. Ethical and regulatory challenges of emerging health technologies. In: Vasiliu-Feltes I, Thomason J, editors. Applied ethics in a digital world. Hershey, PA: IGI Global; 2021. p. 84–100.
  12. Bernardos CJ, Uusitalo MA. European vision for the 6G network ecosystem. Zenodo, Honolulu, HI, USA, Tech. Rep.; 2021.
  13. Benet D, Pellicer-Valero OJ. Artificial intelligence: the unstoppable revolution in ophthalmology. Surv Ophthalmol. 2022;67(1):252–270.
  14. Jaynes C, Seales WB, Calvert K, et al. The Metaverse: a networked collection of inexpensive, self-configuring, immersive environments. In: Deisinger J, Kunz A, editors. Proceedings of the Workshop on Virtual Environments. 2003. doi:10.1145/769953.769967.
  15. Zheng JM, Chan KW, Gibson I. Virtual reality. IEEE Potentials. 1998;17(2):20–23.
  16. Billinghurst M. Augmented reality in education. New Horizons for Learning. 2002;12(5):1–5.
  17. Kanhere O, Rappaport TS. Position location for futuristic cellular communications: 5G and beyond. IEEE Commun Mag. 2021;59(1):70–75.
  18. Gunasekeran DV, Low R, Gunasekeran R, Chan B, Ong HY, Raje D, et al. Population eye health education using augmented reality and virtual reality: scalable tools during and beyond COVID-19. BMJ Innov. 2021;7:278–283.
  19. Ferris JD, Donachie PH, Johnston RL, Barnes B, Olaitan M, Sparrow JM. Royal College of Ophthalmologists’ National Ophthalmology Database study of cataract surgery: report 6. The impact of EyeSi virtual reality training on complications rates of cataract surgery performed by first and second year trainees. Br J Ophthalmol. 2020;104:324–329.
  20. Cisse C, Angioi K, Luc A, Berrod J-P, Conart J-B. EYESI surgical simulator: validity evidence of the vitreoretinal modules. Acta Ophthalmol. 2019;97:e277–e282.
  21. Alwadani F, Morsi MS. PixEye virtual reality training has the potential of enhancing proficiency of laser trabeculoplasty performed by medical students: a pilot study. Middle East Afr J Ophthalmol. 2012;19:120–122.
  22. Jones PR, Somoskeöy T, Chow WBH, Crabb DP. Seeing other perspectives: evaluating the use of virtual and augmented reality to simulate visual impairments (OpenVisSim). NPJ Digit Med. 2020;3:19.
  23. Lam AK, To E, Weinreb RN, Yu M, Mak H, Lai G, et al. Use of virtual reality simulation to identify vision-related disability in patients with glaucoma. JAMA Ophthalmol. 2020;138:490–498.
  24. Gopalakrishnan S, Jacob CES, Kumar M, Karunakaran V, Raman R. Comparison of visual parameters between normal individuals and people with low vision in a virtual environment. Cyberpsychol Behav Soc Netw. 2020;23:171–178.
  25. Alawa KA, Nolan RP, Han E, Arboleda A, Durkee H, Sayed MS, et al. Low-cost, smartphone-based frequency doubling technology visual field testing using a head-mounted display. Br J Ophthalmol. 2021;105:440–444.
  26. Miao Y, Jeon JY, Park G, Park SW, Heo H. Virtual reality-based measurement of ocular deviation in strabismus. Comput Methods Programs Biomed. 2020;185:105132.
  27. Panachakel JT, Ramakrishnan AG, Manjunath KP. VR glasses based measurement of responses to dichoptic stimuli: a potential tool for quantifying amblyopia? In: 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC); 20–24 July 2020. p. 5106–5110.
  28. Maloca PM, de Carvalho JER, Heeren T, Hasler PW, Mushtaq F, Mon-Williams M, et al. High-performance virtual reality volume rendering of original optical coherence tomography point-cloud data enhanced with real-time ray casting. Transl Vis Sci Technol. 2018;7:2.
  29. Abujassar RS, Yaseen H, Al-Adwan AS. A highly effective route for real-time traffic using an IoT smart algorithm for tele-surgery using 5G networks. J Sens Actuators Netw. 2021;10:30.

© The Author(s) 2022. Licensee IntechOpen. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited.

