Open access peer-reviewed chapter

Waveguide-Type Head-Mounted Display System for AR Application

Written By

Munkh-Uchral Erdenebat, Young-Tae Lim, Ki-Chul Kwon, Nyamsuren Darkhanbaatar and Nam Kim

Submitted: 26 October 2017 Reviewed: 10 February 2018 Published: 20 March 2018

DOI: 10.5772/intechopen.75172

From the Edited Volume

State of the Art Virtual Reality and Augmented Reality Knowhow

Edited by Nawaz Mohamudally


Abstract

Currently, many institutes and companies are developing virtual reality (VR) and augmented reality (AR) techniques, which are widely expected to set the direction of three-dimensional display development in the near future. In this chapter, we mainly discuss the design and application of several wearable head-mounted display (HMD) systems with a waveguide structure, in which the in- and out-couplers are fabricated from diffractive optical elements or holographic volume gratings. Although the structure is simple, waveguide-type HMDs are very efficient in practical use, especially in augmented reality applications, because the waveguide makes the device lightweight. In addition, we review the existing major head-mounted display and augmented reality systems.

Keywords

  • head-mounted display
  • AR application
  • see-through display
  • optical waveguide
  • holographic optical element
  • wedge-shaped holographic waveguide

1. Introduction

Everything in the world has three spatial dimensions: width, height, and depth. When people look at real objects directly, they see them as three-dimensional (3D). However, most images shown on modern display devices such as TVs, monitors, and mobile displays are two-dimensional (2D); therefore, people have long been interested in how to acquire and display 3D images properly, especially after the photographic method was put to practical use. 3D imaging technology has been under development since the 1840s, but it became a hot topic worldwide after the stereoscopic 3D movie "Avatar" was released in 2009. Nowadays, 3D imaging/display systems are widely used in various fields such as entertainment, education, training, and biomedical science.

A head-mounted display (HMD), which has gained much popularity in recent years with the boom in virtual reality (VR)/augmented reality (AR) applications, is a display device that the user wears on the head like a helmet [1]. Note that in combat-aircraft applications it is called a helmet-mounted display; it shows situational awareness information of that moment, and the pilot can control the weapons according to the head-pointing direction [2]. Helmet-mounted displays are also widely used by ground troops and special forces of various countries. Figure 1 shows the actual appearance of a typical HMD and a helmet-mounted display.

Figure 1.

The actual representation of (a) the typical HMD (source: http://www.5dt.com/head-mounted-displays) and (b) helmet-mounted display (source: https://www.flickr.com/photos/48970996@N04/12682497224).

Most HMDs are based on the stereoscopic 3D display technique. The stereoscopic technique is a simple way to display 3D images through the binocular depth cues of human visual perception, considering the interocular distance of the human eyes [3, 4]. It provides slightly different 2D images to the left and right eyes, respectively, and ensures, by means of special spectacles, that each eye receives only the corresponding image. The two offset 2D images are then combined in the brain to give the perception of 3D depth. In the general structure of a typical HMD, two microdisplays, lenses, and semitransparent mirrors or prisms are installed, one set for each eye, as shown in Figure 2(a). Some HMDs have a single microdisplay and lens for one eye; this structure is called a monocular HMD, as shown in Figure 2(b), while the typical HMD is defined as binocular. Nowadays, binocular HMDs are usually used in VR-based applications, and monocular HMDs are mainly used in AR applications.

Figure 2.

General structure of the HMD systems: (a) binocular and (b) monocular.
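The binocular geometry above can be made concrete with a small, hedged sketch: under an idealized pinhole model with similar triangles, a screen disparity d, interocular distance e, and viewing distance D (all names and numbers below are illustrative assumptions, not values from this chapter) place the fused point at depth Z = D·e/(e − d):

```python
def perceived_depth(viewing_distance_mm, ipd_mm, disparity_mm):
    """Depth Z at which the two eyes' lines of sight intersect, from
    similar triangles: Z = D * e / (e - d). Positive (uncrossed) disparity
    places the point behind the screen; d = 0 puts it on the screen plane."""
    if disparity_mm >= ipd_mm:
        raise ValueError("disparity must be smaller than the interocular distance")
    return viewing_distance_mm * ipd_mm / (ipd_mm - disparity_mm)

# Illustrative numbers: 600 mm viewing distance, 65 mm interocular distance
print(perceived_depth(600, 65, 0))    # zero disparity: point on the screen plane
print(perceived_depth(600, 65, 10))   # uncrossed disparity: point behind the screen
```

As the disparity approaches the interocular distance, the rays become parallel and the perceived depth diverges, which is why comfortable stereo content keeps disparities small.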

AR can be seen as the next generation of VR. Unlike VR, AR is a half reality in which virtual computer-generated items and features can be viewed and controlled in the real-world environment, instead of the user being completely sequestered from the real world. In other words, AR overlays elements of the digital world onto the real world; the digital elements can add information to the real world or mask it. The user can control the digital elements without any dedicated input device such as a keyboard or mouse, using only facial actions such as eye blinks, and can view text and video on any background he/she wants. Most AR systems are implemented as glass-type monocular HMDs or head-up displays (HUDs). A HUD is a transparent display usually located in front of the user, who can observe important data without looking up or down. It was initially developed for military aviation, but it is now widely used in automobiles, aircraft, and so on.

The HMD-based VR and AR markets have become very large and continue to grow rapidly. In recent years, VR has been applied extensively for education and training, not only in entertainment fields such as 3D games and videos. Especially for drivers, pilots, and special combatants, it is an efficient and safe way to train, since the trainee can perform various activities in a virtual space similar to the actual environment. VR systems are even used in dental clinics for children, to reduce their fear during treatment.

With the development of the technology, interest in see-through AR systems is increasing sharply. Modern handheld smart mobile devices include many functions besides communication, such as Internet access, navigation, image and video processing, a high-resolution camera, and games, which was a huge technological revolution. However, the size of a smart mobile device is limited by the average size of the human hand, so no matter how good the display resolution is, the small screen inconveniences users. See-through AR systems could replace smart mobile devices, because AR systems provide all of the basic functions of mobile devices with sufficient computing power; additionally, they offer a hands-free experience that does not require hand interaction, as well as a large-screen experience, while being worn simply like eyeglasses. Therefore, experts and developers believe that AR technology will soon open a new generation of mobile devices.

1.1. Earlier AR systems

The term "augmented reality" was popularized by T. Caudell and D. Mizell, researchers at Boeing, from 1990. However, a transparent electronic display that shows virtual objects in the real-world environment was first mentioned in the children's novel The Master Key, written by L. F. Baum in 1901; the actual development of AR systems began much later, in the 1980s.

Actually, the HUD has been under development since the 1940s, but at that time it was not commercialized and was used only in military applications such as night-fighting combat aircraft [5]. It was advantageous during air combat at night or in bad visibility, because the radar display was projected onto the window in front of the pilot; later, the gunsight was displayed additionally.

A HUD system for pilot training in commercial aviation was introduced by G. Lintern of the University of Illinois in 1980. Also in 1980, S. Mann invented a wearable computer, a glass-type computer-vision system called "EyeTap." It was a monocular HMD and the first example of an AR display system; it included a camera, computer, projector, and microdisplay in front of the human eye. It allowed the user to observe computer-generated data and the real-world environment at the same time. When the camera captures images, the microcomputer processes them into computer-generated data, and the processed data are overlaid on the real-world environment through the projector and microdisplay. At that time, however, it was quite difficult to demonstrate full AR features with the system.

AR technologies have also appeared in TV applications, such as weather broadcasting, where virtual symbols and information are mixed with an actual map of the earth. An AR system was also depicted in detail in the movie "The Terminator" in 1984.

The HUD has also been applied in astronomical telescope systems: images of stars and celestial bodies and other specific information are displayed while the user observes the actual sky through the eyepiece of the telescope.

In the 1990s, several popular AR systems were introduced. The "virtual fixtures" system, a first fully functional AR system for aircraft applications, was proposed by L. Rosenberg in 1992. The system improved human performance in both direct and remotely manipulated tasks, and in some respects it was also a VR technology.

Thereafter, various institutes and researchers developed AR technologies through the 1990s and 2000s. For example, the ARToolKit was created by HITLab in 1999 (it became available for web browsers in 2009), the AR game "ARQuake" was demonstrated in 2000, the Wikitude AR travel guide for Android mobile devices was released in 2008, and so on.

In 2013, Google demonstrated the beta version of a complete wearable glass-type see-through AR system, "Google Glass," which included a microcomputer, camera, projector, micro liquid crystal display (LCD), microphone, and speaker. It could also connect to the Internet and other devices through wireless networks and Bluetooth. It had many functions: the user could send and receive e-mails, view maps, watch images and videos, make voice calls, and use the Internet.

In 2014, Sony released its 85% see-through wearable glass-type AR device, "SmartEyeglass." It includes the basic devices required in modern AR systems, such as a microcomputer, microdisplay, and microphone, and it can be connected to smartphones. Its most important feature is a holographic waveguide structure, which makes the device lightweight and small.

Another AR-technology-based see-through HMD using a diffractive waveguide structure, "HoloLens," which contains multiple cameras and sensors, was released by Microsoft in 2015. Microsoft introduced another term, "mixed reality" (MR), which is now quite popular. The meaning of MR is almost the same as AR: virtual components are mixed into the real-world environment, half real and half virtual.

In 2016, Niantic released the AR-based mobile game "Pokémon GO," which became the most popular smartphone game in a short time. Accordingly, people became more interested in AR technology.

In 2017, Apple Inc. announced that the new operating system for its iPhone and iPad products supports AR content. Also in 2017, Microsoft Research presented a prototype holographic near-eye display, a see-through display system using a holographic waveguide for AR/VR applications.


2. Waveguide-type HMD system

As mentioned earlier, AR and/or MR technology is the next generation of VR technology, and it is expected to set the direction of 3D display development in the near future. An AR system has many more functions than a VR system, so that it has been described as a wearable computer, and its hardware structure is much more complicated. Because of these multiple functions, AR technology requires several electronic and optical devices, such as a microcomputer, projector, microphone, speaker, and in- and out-couplers; if the typical HMD structure were used, the system would become very bulky and inconvenient to wear. Therefore, a miniaturized and lightweight hardware structure is necessary.

Recently released complete AR devices such as Google Glass, Epson Moverio, Sony SmartEyeglass, and Microsoft HoloLens use an optical waveguide structure. The waveguide structure makes AR systems small and lightweight, which is the biggest advantage for users wearing them on an ad hoc or day-to-day basis. This section mainly discusses the principle of the optical waveguide and the various waveguides used in HMD devices for AR applications.

2.1. Overview of the conventional optical waveguide

The waveguide structure is widely applied in various fields such as electromagnetics, optics, and sound synthesis [6]. It guides electromagnetic or sound waves through fibers and pipes. In optics, it is used as a transmitter that conveys the input signal, the light wave, between two different fields by guiding the waves, without loss of the input information. Figure 3 shows the basic illustration of the optical waveguide, where n1 and n2 are the refractive indices of the two media, the core (transmitter) and the cladding (reflector); in the usual case, n1 is higher than n2, so that total internal reflection can occur at the core-cladding boundary. Optical waveguides are categorized into several types according to geometrical structure, the number of guided modes, refractive index distribution, material, and application.

Figure 3.

The basic structure of the optical waveguide.

The optical fiber is a typical example of an optical waveguide, in which light waves are guided according to the principle of total internal reflection. An optical fiber consists of the basic layers of an optical waveguide, the core and the cladding, with different refractive indices. The light waves are transmitted through the higher-index core by repeatedly reflecting at the boundary with the cladding that covers the core. The light waves pass through the optical fiber with very little loss of light and are emitted to the target or receiver located at the output field.
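As a quick numerical check of the total-internal-reflection condition, the critical angle follows from Snell's law as θc = arcsin(n2/n1). The sketch below uses assumed index values (a 1.58 core, like the photopolymer mentioned later, against air, and a silica-fiber-like core/cladding pair), not values specified in this chapter:

```python
import math

def critical_angle_deg(n_core, n_cladding):
    """Critical angle (degrees from the boundary normal) above which light
    inside the core undergoes total internal reflection: sin(theta_c) = n2/n1."""
    if n_cladding >= n_core:
        raise ValueError("TIR requires n_core > n_cladding")
    return math.degrees(math.asin(n_cladding / n_core))

# Assumed indices, for illustration only:
print(round(critical_angle_deg(1.58, 1.00), 1))  # high-index guide against air
print(round(critical_angle_deg(1.48, 1.46), 1))  # typical silica-fiber-like pair
```

Note how a small index contrast (the fiber-like pair) pushes the critical angle close to 90°, which is why fibers only guide rays travelling nearly parallel to the axis, whereas a guide-against-air boundary retains much steeper rays.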

A prism can be another example of an optical waveguide. Unlike the optical fiber, it transmits the incident light through its body, and the different wavelengths of the output light leave the prism at different angles because of dispersion: the wavelength dependence of the refractive index of the prism material breaks the input light into a complete spectrum of colors. So, the output light appears as a rainbow.

2.2. Waveguide in the AR-based HMD

In order to present virtual components mixed with the real-world environment, the optical system must be designed considering the pupil size, the virtual image, the optical distance to the eye (eye relief), the image magnification, and the field of view. When conventional optical elements such as mirrors and beam splitters are used, the overall device becomes bulky and heavy, and it is difficult to mount the various elements in a narrow space.

Most AR-based HMDs use a waveguide structure to reduce the overall size and weight [7]. Some AR devices instead use a curved-mirror structure, with a semi-reflective curved mirror as an image reflector in front of the eyes onto which a projector shoots the image. However, the displayed image has low resolution and is highly distorted.

Several types of waveguide are used in modern HMD systems for AR applications: reflective, polarized, diffractive, and holographic waveguides. The reflective waveguide structure uses a molded plastic substrate as the light guide and a semi-reflective mirror in front of the eye. The images shown on the microdisplay are collimated by a lens, and the collimated light waves are transmitted to the semi-reflective mirror through the waveguide. Finally, the eye sees the images reflected by the semi-reflective mirror and the real-world environment at the same time. The light waves are guided through the pipe with little loss or degradation, and the images are reflected by the semi-reflective mirror without color nonuniformity. The structure also provides high optical efficiency at low cost. Its main disadvantage is that the field of view is proportional to the size of the reflector; to increase the field of view, the reflector must be larger and the waveguide thicker, so the device becomes bulky. A thick waveguide also degrades image quality, and the image can appear highly distorted. Google Glass and Epson's commercial AR device "Moverio" use the reflective waveguide structure. Note that Google Glass does not rely on total internal reflection, while Moverio does. The schemes of Google Glass and Moverio are illustrated in Figure 4. In the case of Google Glass, some light waves are transmitted through the reflector, come back from the opposite side, and are reflected again toward the exit pupil, which causes a loss of light intensity.

Figure 4.

The structure of the AR systems (a) Google Glass [source: https://www.pinterest.com/pin/180355160053768227/] and (b) Moverio [source: http://global.epson.com/innovation/engineer/moverio.html].

The polarized waveguide requires multiple coating layers and polarized reflectors. The polarized reflectors must be parallel and polished in order to guide the light waves; some of them are cemented together, and each reflector needs a different amount of coating. Although this structure has a large field of view and a large eye motion box, it has many drawbacks: high cost due to the coating and parallelism requirements, low optical efficiency (only up to 30%), color nonuniformity, and heavy loss of the incident light. Lumus uses this type of waveguide in its see-through AR products.

The diffractive waveguide is one of the most widely used structures in see-through AR displays. Incident light waves are coupled into the waveguide at a certain angle by the first set of slanted gratings (the in-coupler), travel through the waveguide, and are extracted toward the exit pupil by the second set of slanted gratings (the out-coupler). The key parts of this structure, the in- and out-couplers, are diffractive optical elements (DOEs) with deep slanted nanometric gratings. Currently, such DOEs can be fabricated cheaply through common optical-component manufacturing; accordingly, the price of AR products using them is also low. However, as the incident light travels between the in-coupler and out-coupler, the output image exhibits a rainbow effect, because the reflected wavelengths have different amplitudes when they meet the diffraction pattern at the incidence angle. Therefore, it is quite difficult to apply this structure to full-color applications, and monochromatic operation is recommended. Also, the display screen cannot be very large because of the variation of spectral reflectivity; accordingly, the field of view is quite narrow, about 20° on average. To increase the field of view, the incidence angle must be higher, but then the color nonuniformity (rainbow effect) becomes more severe. Currently, the diffractive waveguide structure is applied in Vuzix see-through AR systems (based on waveguide optics developed by Nokia) and in the Microsoft HoloLens. Figure 5 shows the schematic illustration of the diffractive waveguide structure and the actual appearance of the Vuzix and HoloLens AR systems.

Figure 5.

(a) The illustration of the diffractive waveguide [source: https://uploadvr.com/waveguides-smartglasses] and the actual representation of (b) Vuzix [source: https://www.phonearena.com/news/Vuzix-Blade-and-M300-hands-on_id101589] and (c) Microsoft HoloLens [source: http://www.briteliteimmersive.com/blog/hololens-leading-the-vr-to-ar-revolution].


3. Holographic waveguide for AR-based HMD

3.1. Principle of HOE

The holographic waveguide structure is almost the same as the diffractive waveguide, except that holographic optical elements (HOEs) are used as the in- and out-couplers instead of DOEs. The HOEs reflect monochromatic (single-wavelength) or polychromatic (three-wavelength) light waves, depending on the fabrication method. The HOE is fabricated by an analog hologram-recording process at a designed incident angle using laser illumination [8]. In a holographic waveguide-based AR display, the light waves from the microdisplay are reflected by the in-coupler HOE at the designed incident angle, travel within the waveguide, and the out-coupler HOE turns the light back toward the eye of the user. Although most holographic waveguides intrinsically have a few drawbacks, such as a limited field of view and a rainbow effect, due to the limited incident angle of the HOE and the three holographic volume gratings (HVGs) with different wavelengths, it is a very advantageous structure, especially for AR displays.

When the HOE is applied to an HMD system for AR applications, the number of optical elements is remarkably reduced, and the overall device becomes considerably lighter than conventional HMD systems. The biggest advantage of the holographic waveguide is that it does not require a glass substrate for the couplers, whereas other types of waveguide-based AR displays use glass substrates as the in- and out-couplers, as shown in Figure 6. Here, α is the diffraction angle of the light waves guided through the waveguide between the in- and out-couplers; it should be larger than the critical angle, because the light waves propagate by total internal reflection.

Figure 6.

The schematic diagram of the holographic-type see-through HMD for AR application.

The HOE is an optical element (film) that produces a desired wavefront, obtained by recording a hologram through the analog holographic recording process. An HOE can act as a lens or lens array, mirror, prism, and so on; hence, many conventional optical elements and devices are now being replaced by HOEs [9]. The development of hologram-recording materials is of utmost importance for HOE-based applications such as see-through AR displays.

The principle of the HOE rests on the hologram-recording principle. Under coherent illumination such as a laser, the light waves reflected from the object (the object beam) interfere with light waves coming from another direction (the reference beam) on the holographic material. Where the object and reference beams meet on the holographic material, an interference pattern consisting of many bright and dark fringes is formed according to the phase differences of the object waves reflected from each part of the object. The interference pattern contains both the amplitude and phase information of the object. When the reference laser beam illuminates the recorded hologram, a virtual 3D image that closely resembles the original object is reconstructed.
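The interference recording can be written out explicitly. With the object wave $O = |O|e^{i\varphi_O}$ and the reference wave $R = |R|e^{i\varphi_R}$, the material records the intensity

$$I = |O + R|^2 = |O|^2 + |R|^2 + 2|O||R|\cos(\varphi_O - \varphi_R),$$

so the fringe contrast encodes the amplitude of the object wave and the fringe position encodes its phase. Re-illuminating the recorded pattern with $R$ yields a diffracted term proportional to $O$, which is what reconstructs the virtual image.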

HOEs are classified into reflection and transmission types, depending on the reconstruction method. A transmission-type HOE transmits the beam through the hologram during reconstruction. In a reflection-type HOE, the reference beam is reflected by the HOE; during recording, the object and reference beams enter the recording medium from opposite sides, crossing each other, as shown in Figure 7(a). The reflection-type HOE relays the imaging light to the other field by reflecting a band of specific wavelengths; therefore, reflection-type HOEs are advantageous for see-through AR display systems.

Figure 7.

The basic scheme of (a) the hologram recording and (b) optical reconstruction processes.

Usually, the holograms are recorded on reflection-type HOEs with an asymmetrical grating structure, as illustrated in Figure 7(b). Here, the incident beam is perpendicular to the HOE in order to reduce the loss of light caused by surface reflection. The object beam is refracted at an angle β as it passes through the material, due to the difference in refractive index between air and the HOE material.

The most commonly used HOE materials are dichromated gelatin, silver halide, photoresist, photopolymer, etc. Among them, the photopolymer has a high diffraction efficiency in hologram recording, owing to the change of its refractive index according to the intensity of illumination. It also has the merit that the hologram can be fixed by a simple drying treatment, without chemical processing, and it offers high reliability and high resolution. Currently, photopolymers are widely applied in HOE-based applications, optical filters, holographic memories, and many more.

3.2. Optical characteristics of the HOE

In see-through AR display systems, the optical characteristics of the HOEs, the in- and out-couplers, play a significant role. Here, we mainly discuss the optical characteristics of the photopolymer, since it has been recognized as the most effective HOE material, especially for holographic waveguides in see-through AR displays.

The diffraction efficiency describes the power throughput of a DOE or HOE and is the main characteristic of reflection-type HOE materials. To apply HOEs in see-through AR displays, the diffraction efficiency should be analyzed: how much optical power is diffracted into the waveguide compared with the power incident on the DOE or HOE. In the holographic waveguide structure, the diffraction efficiency of the HOE strongly influences the final displayed image quality, so it must be carefully calculated and measured.

Recently, Piao et al. thoroughly analyzed and measured the diffraction efficiency of a reflection-type photopolymer for the monochromatic and polychromatic cases [9]. To measure the diffraction efficiency of the HOE, i.e., the photopolymer, two photodetectors connected to a power meter detect the diffracted and transmitted beams, respectively; the incident angle is set to 45° to prevent multiple spots caused by total internal reflection. Following the holographic recording principle, the HVGs are recorded onto the HOE by laser illumination. Note that the lasers have fixed wavelengths: 633 nm for red, 532 nm for green, and 473 nm for blue. Figure 8(a) shows the diffraction efficiency according to the exposure energy, analyzed up to 700 mJ/cm² at intervals of 100 mJ/cm². The maximum diffraction efficiency is approximately 97%, reached at 150 mJ/cm² for the red and green lasers and at 200 mJ/cm² for the blue laser. Figure 8(b) shows the diffraction efficiency for the monochromatic cases, under red, green, and blue laser illumination, respectively, as a function of the incident angle, varied between 10 and 70° at 5° intervals. With exposure energies of 150 mJ/cm² for the red and green lasers and 200 mJ/cm² for the blue laser, the diffraction efficiency is measured at more than about 95% for incident angles between 30 and 55°. As for exposure time, the diffraction efficiency saturates at more than 95% after about 10 seconds, as illustrated in Figure 8(c).
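With the two-detector arrangement described above, the efficiency can be estimated directly from the detector readings. A minimal sketch follows; the function name and the sample power values are assumptions for illustration, not measurements from [9]:

```python
def diffraction_efficiency(p_diffracted_mw, p_transmitted_mw):
    """Fraction of the light reaching the two detectors that ends up in the
    diffracted order. Absorption and scattering inside the HOE are neglected,
    so this slightly overestimates the efficiency relative to incident power."""
    total = p_diffracted_mw + p_transmitted_mw
    if total <= 0:
        raise ValueError("detector powers must be positive")
    return p_diffracted_mw / total

# Illustrative readings (mW) from the diffraction and transmission detectors:
print(round(diffraction_efficiency(9.7, 0.3), 2))
```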

Figure 8.

The diffraction efficiency according to (a) the exposure energy, (b) incident angle, and (c) exposure time.

Basically, the diffraction efficiency of the HOE according to the angular selectivity is calculated by

$$\eta = \frac{\sinh^2\!\left(\sqrt{\nu^2 - \xi^2}\right)}{\sinh^2\!\left(\sqrt{\nu^2 - \xi^2}\right) + \left(1 - \dfrac{\xi^2}{\nu^2}\right)},\qquad \nu = \frac{\pi d\,\Delta n}{\lambda \sqrt{\cos\theta_R \cos\theta_O}},\qquad \xi = \frac{K d\,\Delta\theta \cos(\phi - \theta_B)}{2\cos\theta_O} \tag{E1}$$

where Δn and d are the refractive index modulation and the thickness of the HOE, λ is the wavelength of the laser illumination, θO and θR are the incident angles of the object and reference beams, K is the magnitude of the grating vector, Δθ is the angular deviation, ϕ is the slant angle of the internal grating, and θB is the Bragg angle. Figure 9 shows the measured and calculated diffraction efficiencies according to the angular selectivity. It can be seen that the photopolymer provides wide angular selectivity.
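Eq. (E1) can be evaluated numerically to reproduce curves like those in Figure 9. Below is a minimal sketch of the formula under the symbol definitions above; the numerical inputs in the example (Δn, the grating vector, the zero angles) are assumptions for illustration, not parameters taken from the chapter:

```python
import math
import cmath

def kogelnik_reflection_efficiency(delta_n, d, wavelength, theta_r, theta_o,
                                   K, delta_theta, phi, theta_b):
    """Diffraction efficiency of a reflection volume grating, Eq. (E1).
    Angles in radians; d and wavelength in the same length unit; K in rad/unit."""
    nu = math.pi * d * delta_n / (wavelength * math.sqrt(math.cos(theta_r) * math.cos(theta_o)))
    xi = K * d * delta_theta * math.cos(phi - theta_b) / (2.0 * math.cos(theta_o))
    # Off-Bragg detuning can make nu^2 - xi^2 negative; the complex sinh^2 is
    # then real and negative, and the expression below stays on the right branch.
    s2 = (cmath.sinh(cmath.sqrt(complex(nu**2 - xi**2, 0.0))) ** 2).real
    return s2 / (s2 + (1.0 - (xi / nu) ** 2))

# On-Bragg check (delta_theta = 0): Eq. (E1) reduces to tanh(nu)^2.
d_um, dn, lam_um = 16.8, 0.03, 0.532   # thickness/wavelength in micrometres; dn assumed
eta0 = kogelnik_reflection_efficiency(dn, d_um, lam_um, 0.0, 0.0,
                                      2 * math.pi * 5.9, 0.0, 0.0, 0.0)
print(round(eta0, 3))
```

The on-Bragg reduction to tanh²(ν) is a convenient sanity check: increasing the index modulation Δn or the thickness d pushes the peak efficiency asymptotically toward 100%, which is consistent with the ~97% peak values reported above.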

Figure 9.

The normalized diffraction efficiency according to the Bragg angle for (a) red, (b) green, and (c) blue laser illuminations.

However, in the actual measurement of the photopolymer, the diffraction efficiencies of the monochromatic cases were measured as 55% for the red, 54% for the green, and 50% for the blue laser, owing to light loss during passage through the prism. When the absorption factor of each laser is considered (15, 25, and 28% for the red, green, and blue lasers, respectively), the photopolymer provides high diffraction efficiency, similar to the analyzed result. Figure 10(a) shows the actual appearance of the in- and out-coupler HOEs (photopolymers) and the image shown on the microdisplay, and Figure 10(b) shows the final guided images for the monochromatic red, green, and blue cases.

Figure 10.

(a) The actual appearance of the fabricated in- and out-couplers of holographic waveguide and the originally displayed image and (b) the guided images in the monochromatic case: red, green, and blue.

To display a full-color image in holographic waveguide-type AR displays, three HVGs are recorded on the HOE film by the red, green, and blue lasers simultaneously. However, when the three HVGs are recorded simply in this way, very low diffraction efficiencies of only 16%, 20%, and 10% are measured for the respective wavelengths, so such an HOE is useless for a see-through holographic waveguide-type AR display. To obtain higher diffraction efficiency for full-color operation in a holographic waveguide, a composite HVG recording structure should be configured: two photopolymer layers are laminated, the first recorded for the red laser and the second for the green and blue lasers, as shown in Figure 11. When the fabricated HOE film is applied as the in- and out-couplers, the diffraction efficiencies are approximately 40, 44, and 42%, respectively, which is acceptable for see-through AR displays. An example of a full-color image guided within the holographic waveguide is also presented in Figure 11.

Figure 11.

The schematic configuration of a composite structure and the example of the guided full-color image.

Additionally, if a three-layer photopolymer is used, the outermost layer loses too much light. When three photopolymers are stacked in order for the red, green, and blue lasers, the diffraction efficiencies are only 10, 23, and 30%, which is too low for practical application.

Owing to the high diffraction efficiency and light weight of the holographic waveguide structure, it is currently applied in several see-through AR display systems, such as the Sony SmartEyeglass, BAE Systems AR glasses, and DigiLens HUD systems. Microsoft Research also uses a holographic waveguide in its recent research on near-eye see-through holographic displays.

3.3. Wedge-shaped holographic waveguide

In practice, the composite structure for the full-color HOE cannot fully solve the remaining issues of the holographic waveguide, such as its thickness, weight, color nonuniformity, and limited field of view. The wedge-shaped holographic waveguide structure has the great advantage of improving the field of view and color uniformity through an HOE with high diffraction efficiency [10]. Figure 12(a) shows the schematic configuration of the wedge-shaped holographic waveguide. Here, the in- and out-coupler HOE films are placed at certain angles; therefore, the thickness of the waveguide can be reduced by a large angle of total internal reflection, and the field of view can be widened according to the wide angular selectivity of the HVGs. For the light path inside the wedge-shaped holographic waveguide, as shown in Figure 12(b), the sum of the incident angles on the outside of the HOE films, θ1 and θ2, is equal to the angle of total internal reflection, θt, where the slope angle of the wedge, θw, is equal to θ2. Here, the spatial frequency of the in- and out-coupler HVGs should be considered, and it can be calculated as

f = 2n cos(φ − θ0) / λ (2)

where n is the average refractive index of the HOE (1.58 for the photopolymer), φ is the grating slant angle, and θ0 is the incident angle inside the HOE.
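As a quick numerical check of Eq. (2), the sketch below evaluates the grating spatial frequency for a few assumed angle pairs. The refractive index n = 1.58 comes from the text, while the 532 nm recording wavelength and the specific slant/incidence angles are illustrative assumptions.

```python
import math

def spatial_frequency(n, phi_deg, theta0_deg, wavelength_mm):
    """Grating spatial frequency from Eq. (2): f = 2*n*cos(phi - theta0) / lambda."""
    return 2.0 * n * math.cos(math.radians(phi_deg - theta0_deg)) / wavelength_mm

n = 1.58                # average refractive index of the photopolymer HOE
wavelength_mm = 532e-6  # assumed 532 nm green recording laser, expressed in mm

# Illustrative slant angle / internal incidence angle pairs (degrees)
for phi, theta0 in [(30.0, 25.0), (40.0, 25.0), (50.0, 25.0)]:
    f = spatial_frequency(n, phi, theta0, wavelength_mm)
    print(f"phi = {phi:.0f} deg, theta0 = {theta0:.0f} deg -> f = {f:.0f} lines/mm")
```

Note that the upper bound 2n/λ ≈ 5940 lines/mm at 532 nm is of the same order as the 5600–5930 lines/mm values quoted below.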

Figure 12.

(a) The schematic configuration of the wedge-shaped holographic waveguide structure and (b) the angle of the light path inside the waveguide.

According to the above calculation, when θw is set to 30° (the angle of total internal reflection should be larger than the critical angle), θ1 can be 20, 30, 40, or 50°; total internal reflection occurs at all of these incidence angles θ1, and the spatial frequencies corresponding to each incident angle are 5930, 5617, 5931, and 5904 lines/mm, respectively. Considering a photopolymer HOE with a thickness of 16.8 µm, the angular selectivity of the recorded HVGs becomes larger at higher spatial frequencies, and the wavelength selectivity exhibits a similar tendency, as shown in Figure 13(a) and (b). It can be seen that recording at incident angles of 30/20° and 30/40° on the outside of the HOE achieves very wide angular selectivity and narrow spectral selectivity. Therefore, the incident angles of the HVGs are set as θ1 = 40° and θ2 = 30°, because a larger angle of total internal reflection leads to a thinner holographic waveguide.

Figure 13.

Analysis for (a) the angular selectivity and (b) the spectral selectivity, for the corresponding spatial frequencies of each incident angle.

For the field of view, it can be calculated as

θFOV = arcsin{n sin[φ − arccos((λ1/λ0) cos(φ − θ0))]} (3)

where λ0 and λ1 are the wavelengths of the recording and readout beams, respectively. According to Eq. (3), the field of view can be analyzed for different illumination wavelengths: it is about ±17° for the 430–500 nm and 580–670 nm ranges and about ±15° for the 500–560 nm range.
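Eq. (3) can be evaluated numerically to explore how the field of view varies with the readout wavelength. The sketch below implements the formula as written, with guards for the arccos/arcsin domains; n = 1.58 is from the text, while the slant angle, internal incidence angle, and 532 nm recording wavelength are assumed for illustration.

```python
import math

def fov_deg(n, phi_deg, theta0_deg, lam_readout, lam_record):
    """Evaluate Eq. (3):
    theta_FOV = arcsin{ n*sin[phi - arccos((lam1/lam0)*cos(phi - theta0))] }.
    Returns None when an arccos/arcsin argument leaves its valid domain."""
    phi = math.radians(phi_deg)
    theta0 = math.radians(theta0_deg)
    c = (lam_readout / lam_record) * math.cos(phi - theta0)
    if abs(c) > 1.0:
        return None  # Bragg condition cannot be satisfied at this wavelength
    s = n * math.sin(phi - math.acos(c))
    if abs(s) > 1.0:
        return None  # no propagating output direction
    return math.degrees(math.asin(s))

# Illustrative sweep over readout wavelengths (assumed recording at 532 nm)
for lam1 in (450e-9, 532e-9, 633e-9):
    print(f"{lam1 * 1e9:.0f} nm -> {fov_deg(1.58, 40.0, 25.0, lam1, 532e-9)}")
```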

The wedge-shaped holographic waveguide structure requires thin films for the in- and out-couplers, so it is necessary to record the full-color HVGs onto a single HOE film. As mentioned earlier, the simple recording method using the three lasers simultaneously is difficult to apply because of its very low diffraction efficiency. Therefore, a time-scheduling method is utilized in which shutters control the exposure duration of each laser illumination: while the HVG for one laser is being recorded onto the HOE film, the other two lasers are blocked by the corresponding shutters. Each HVG should reach grating saturation within 5 seconds, because it becomes difficult to form the grating on the HOE film after that. The biggest advantage of this recording method is that the full-color HVG can be recorded onto a single HOE film with good color uniformity in a short recording time, and the maximum uniformity is achieved with only a 10% loss in diffraction efficiency.
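The time-scheduling idea can be sketched as a simple controller loop. The `Shutter` class below is a hypothetical stand-in for real shutter hardware; only the one-laser-at-a-time logic and the sub-5-second exposure budget come from the text.

```python
# Hypothetical shutter interface: in a real setup these objects would drive
# mechanical shutters placed in front of the red, green, and blue lasers.
class Shutter:
    def __init__(self, name):
        self.name = name
        self.open = False

    def set(self, state):
        self.open = state

def sequential_exposure(shutters, order, exposure_s):
    """Time-scheduled recording: open one laser's shutter at a time while
    the other two stay blocked; each exposure should be under 5 seconds."""
    assert exposure_s < 5.0, "grating saturation must occur within 5 s"
    log = []
    for color in order:
        for name, sh in shutters.items():
            sh.set(name == color)   # only the active laser passes
        log.append((color, exposure_s))
        # time.sleep(exposure_s)    # real hardware would wait here
    for sh in shutters.values():
        sh.set(False)               # block all beams when done
    return log

shutters = {c: Shutter(c) for c in ("red", "green", "blue")}
print(sequential_exposure(shutters, ("green", "blue", "red"), 4.0))
```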

In the actual case, however, the photopolymer has a different absorption and transmittance for each laser color. Piao et al. measured the diffraction efficiencies of a full-color single HOE for the six possible sequences of the three laser exposures: red-green-blue, red-blue-green, green-red-blue, green-blue-red, blue-red-green, and blue-green-red. The optimum case is the green-blue-red sequence, because it exhibits much higher optical efficiency and a more uniform distribution than the other sequential recording processes and has a high potential for high image quality; the diffraction efficiencies are measured as 49, 47, and 44% for the respective wavelengths. Figure 14 shows the color analysis for the different sequential exposures, where it can be seen that the green-blue-red sequence achieves higher color uniformity than the other cases.
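One simple way to express the uniformity of a recording sequence is the ratio of the minimum to the maximum per-color diffraction efficiency (the exact metric used in the study is not stated here, so this choice is an assumption); only the green-blue-red efficiencies are taken from the text.

```python
def uniformity(etas):
    """Min/max ratio of the per-color diffraction efficiencies (1.0 = uniform)."""
    return min(etas) / max(etas)

# Measured efficiencies (R, G, B) for the green-blue-red sequence, from the text
gbr = (0.49, 0.47, 0.44)
print(f"green-blue-red uniformity: {uniformity(gbr):.2f}")
```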

Figure 14.

(a) For the given input image: The detected color uniformity for (b) red-green-blue, (c) red-blue-green, (d) green-red-blue, (e) green-blue-red, (f) blue-red-green, and (g) blue-green-red sequential beam exposures.

When the HVGs are recorded onto the photopolymer film by the green-blue-red sequential beam exposure and applied in the see-through AR display system, the image guided through the wedge-shaped waveguide is observed very accurately, as shown in Figure 15. Here, the diffraction efficiencies were measured as 30, 34, and 28% for the monochromatic red, green, and blue cases and 31% for the full-color visualization using the green-blue-red sequence. This diffraction efficiency is acceptable, even though the full-color HVGs are recorded onto a single HOE. Figure 15 shows the full-color visualizations of the given test image when the HOEs fabricated by the green-blue-red sequential exposure are applied in the AR display system using the wedge-shaped holographic waveguide. Note that, in the AR display system, a microdisplay using a light-emitting diode source was placed at the focal plane of the collimating lens so that plane waves were formed.

Figure 15.

(a) The original test image, (b) a full-color visualization, and (c) the output image illuminated by the light emitting diode source.


4. Conclusions

Nowadays, general consumers and developers are widely interested in AR/VR technology. In particular, AR has been recognized as a technology that can bring a major change to the mobile field. Modern smart mobile devices have a limited size and must be operated by the user's hands, whereas see-through AR systems are hands-free and controlled only by the user's eyes. They provide almost every function of mobile devices, such as voice calls, Internet connection, video playback, and virtual games, and they mix virtual objects with the real-world environment. The technology has been developed since the 1940s, and there is still a long way to go. In order to make AR displays light-weight, an optical waveguide structure is usually utilized. Several types exist, such as reflective, polarized, diffractive, and holographic waveguides; among them, the holographic waveguide structure is an efficient way to display clear images on a large screen with a wide field of view for comfortable viewing while keeping the overall AR device small and light-weight. The holographic waveguide structure requires HOEs as the in- and out-couplers with the appropriate diffraction angles. Among the HOE materials, the photopolymer is advantageous because it does not require post-processing and provides high diffraction efficiency, high reliability, and high resolution; it is the most suitable HOE material for the see-through AR display system. In particular, the diffraction efficiency is quite high for both monochromatic and full-color visualizations. Moreover, when the full-color HVGs are recorded onto a single HOE film and applied in the wedge-shaped holographic waveguide, the entire AR display system can be made even lighter. Currently, technology companies such as Sony, Konica-Minolta, DigiLens, and BAE Systems use the holographic waveguide structure in their see-through AR HMDs and HUDs, and Microsoft Research is working to release a near-eye see-through holographic display using a holographic waveguide in the near future.


Acknowledgments

This research was supported by the Government of Korea, under the ITRC (Information Technology Research Center) support program (IITP-2017-2015-0-00448) supervised by the IITP (Institute for Information and Communications Technology Promotion) and supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (No. NRF-2017R1A2B4012096).

References

  1. Melzer J, Moffitt K. Head-Mounted Displays: Designing for the User. Columbus, United States: McGraw-Hill Education; 1997
  2. Rash CE. Helmet-Mounted Displays: Design Issues for Rotary-Wing Aircraft. 2nd ed. Bellingham, United States: SPIE Press; 2001
  3. Wheatstone C. Contributions to the physiology of vision. Part the first. On some remarkable, and hitherto unobserved, phenomena of binocular vision. Philosophical Transactions of the Royal Society of London. 1838;128:371-394
  4. Kwon K-C, Im CY, Seo KS, Nam SM, Erdenebat M-U, Shim YB, Han Y-G, Kim N. Three-dimensional visualization system for ophthalmic microscopes using visible light and near-infrared illumination. Journal of Biophotonics. 2018:53. DOI: 10.1002/jbio.201600268
  5. Wikipedia. Head-up Display. 2009. Available from: https://en.wikipedia.org/wiki/Head-up_display
  6. Snyder AW, Love JD. Optical Waveguide Theory. Abingdon, United Kingdom: Chapman and Hall, Taylor & Francis Group; 1983
  7. Sarayeddine K, Mirza K. Key challenges to affordable see-through wearable displays: The missing link for mobile AR mass deployment. In: Proceedings of the SPIE Defense, Security, and Sensing; April 29–May 3, 2013. United States: SPIE; 2013. p. 87200D
  8. Kim N, Piao Y-L, Wu H-Y. Holographic optical elements and application. In: Naydenova I, Nazarova D, Babeva T, editors. Holographic Materials and Optical Systems. Rijeka, Croatia: InTech; 2017. pp. 99-131. DOI: 10.5772/67297
  9. Piao J-A, Li G, Piao M-L, Kim N. Full color holographic optical element fabrication for waveguide-type head mounted display using photopolymer. Journal of the Optical Society of Korea. 2013;17:242-248
  10. Piao M-L, Kim N. Achieving high levels of color uniformity and optical efficiency for a wedge-shaped waveguide head-mounted display using a photopolymer. Applied Optics. 2014;53:2180-2186
