Open access peer-reviewed chapter

Advancements in Optical See-through Near-Eye Display

Written By

Jufan Zhang, Yao Zhou and Fengzhou Fang

Submitted: 13 September 2022 Reviewed: 27 September 2022 Published: 22 October 2022

DOI: 10.5772/intechopen.108309

From the Edited Volume

Modern Development and Challenges in Virtual Reality

Edited by Mamata Rath and Tushar Kanta Samal


Abstract

With the development of optical design and manufacturing, the optical see-through near-eye display has become a promising research topic in recent decades, with applications in medical devices, education, aviation, entertainment, etc. Typical products include Head-mounted Displays (HMDs) and Augmented Reality (AR) glasses. The optical display system of AR devices mainly consists of a miniature projecting module and an optical imaging module. In this chapter, the display systems used by AR glasses on the market, including various mini-display screens and optical imaging elements, are systematically summarized. Therein, the differences in optical combiners are the key to distinguishing the various AR display systems. Thus, it is essential to weigh the advantages and disadvantages of each optical imaging technology applied in this area. Besides, the characteristics of the projectors are crucial to the quality of the images.

Keywords

  • see-through near-eye display
  • augmented reality
  • optics
  • metaverse
  • projector
  • freeform optics
  • waveguide
  • off-axis optics
  • birdbath optics

1. Introduction

The optical see-through near-eye display is a promising solution for various industrial sectors, such as education, entertainment, military, tourism, etc. Typical products include Head-mounted Displays (HMDs) and Augmented Reality (AR) glasses. Since Facebook announced its Metaverse vision, more and more companies have released such products. This technology is a medium and continuum that spans the real world and the virtual world [1, 2, 3], integrating the real environment with projected virtual objects. Technically, the optical module combines the imaging system and the projecting system, as shown in Figure 1, both of which directly influence the optical performance and imaging quality. The light from the projector is transferred by the imaging optics without blocking the view of the real world.

Figure 1.

Schematic diagram of the optical see-through near-eye display.

Over the past decades, there has been significant advancement in the see-through near-eye display. Although a number of technical drawbacks still limit its success in the consumer market, the achievements made in the enterprise market have demonstrated its value and high commercial potential. The latest technological progress brings new possibilities for a wider range of applications in the near future [4, 5, 6, 7]. This chapter focuses on the current mainstream optical imaging technologies and projecting technologies, along with typical applications, to offer the reader a general idea of the state of the art.

1.1 Applications

With the maturity of AR technology, AR is increasingly used in various industries, such as education, training, medical care, design, advertising, etc. AR has injected new vitality into education with its rich interactivity. Compared with static paper books, AR combines text and dynamic 3D images, which provides an immersive feeling and facilitates a quicker understanding of the information. AR enhances the clarity, intuitiveness, and perceptual impact of real situations, making situational learning more friendly, dynamic, and natural [8, 9, 10, 11]. For example, in medical and clinical training, the application of AR makes esoteric and profound medical theories livelier and more concrete, which greatly improves teaching efficiency and quality, especially in minimally invasive surgery [12, 13, 14, 15]. In tourism, such as museums, AR provides virtual text, pictures, videos, and other information for the introduction and description of exhibits. AR has also been applied to restoring and displaying cultural relics by virtually filling in the incomplete parts, which brings an immersive feeling to tourists [16, 17]. In addition, in industry, AR smart glasses collect and process data through the AR cloud to provide visual information for the technical support team, thereby realizing expert-level remote assistance. This makes the communication of industrial issues more direct, accurate, and efficient through visualization, eliminates the risk of qualified experts being unavailable in the field for urgent problems, and significantly saves cost and time. AR helps abandon complicated work manuals, flowcharts, walkie-talkies, etc., and frees the hands of workers and operators [18]. In the military, pilots can use HMDs to observe navigation information and even crucial information about the enemy. The Synthetic Training Environment based on AR systems helps soldiers train in a more immersive way by placing them in a more physically and psychologically stressful combat environment [19, 20, 21]. In the entertainment industry, AR is applied to create interactive games, such as racetracks and basketball courts, for which the camera can track locations or even body language to give a more accurate response [22, 23, 24].

Among the above application fields, the most commonly used hardware comprises head-mounted displays (HMDs) and wearable devices such as AR glasses. An HMD is worn on the head like a see-through helmet, which provides images in front of the eyes, and can facilitate many uses including aviation, gaming, medicine, and engineering. AR glasses are worn like regular glasses in front of the eyes, but function in a different way. Smart AR glasses are computer-enabled wearable devices that add virtual information to a user's real-world scene by superimposing computer-generated or digital information, such as 3D images, animations, and videos. The information can be retrieved from PCs, phones, or other devices over Wi-Fi, mobile data, or Bluetooth.

1.2 Key optical parameters

The optical performance is closely tied to the specific parameters of the display system. No single parameter determines the optical performance; rather, there is a trade-off among the following major parameters.

Field of view The field of view (FOV) refers to the solid angle subtended at the center of the eye pupil by the outline of the observed object. FOV includes the vertical, horizontal, and diagonal fields of view [25]. Since the size of a person's retina is limited, the corresponding viewing range of the human eye is also limited. FOV is the indicator that many developers are primarily concerned with.
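As a quick illustration of how FOV arises in a simple magnifier-type near-eye display, the sketch below (not from this chapter; the display size and focal length are assumed, illustrative values) estimates the full FOV from the microdisplay extent and the collimating focal length:

```python
import math

def full_fov_deg(display_extent_mm: float, focal_length_mm: float) -> float:
    """Full field of view (degrees) when a microdisplay of the given extent
    sits at the focal plane of an ideal collimating lens."""
    return math.degrees(2 * math.atan(display_extent_mm / (2 * focal_length_mm)))

# Example: a 12.7 mm (0.5-inch) diagonal microdisplay behind a 17 mm lens
print(round(full_fov_deg(12.7, 17.0), 1))  # diagonal FOV, about 41 degrees
```

Shortening the focal length widens the FOV but, in practice, also increases aberrations, which is part of the trade-off discussed here.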

Eye relief Eye relief of an optical display is the distance from the optical combiner to the exit pupil, within which the user's eye can obtain a full view and clear images. Most binocular or monocular near-eye displays require a minimum of 16 mm of eye relief [26]. It is an important parameter for wearing comfort.

Eye box The eye box is the space within which the images remain effectively viewable, so images within the eye relief can be observed under both angular and lateral movements of the human eye. It describes how far off-center the eye can be while still seeing the displayed content properly. Within the eye box, an observer at any position can reach the entire FOV; exceeding this area may result in distorted images, incorrect color rendering, or even no content [27]. A larger eye box allows the user greater freedom of head movement while still observing the whole visual image.

Chromatic aberration Chromatic aberration refers to the phenomenon that optical lenses cannot focus all wavelengths of light on the same point; it is caused by lens dispersion. It may appear as blurring or obvious color fringing around objects in the image, especially at high contrast. A perfect lens would focus all wavelengths to a single point with the smallest circle of confusion. Depending on the plane in which the dispersion appears, chromatic aberration can be divided into two types: longitudinal chromatic aberration and lateral chromatic aberration [28, 29, 30]. The optical elements in the combiner may introduce chromatic aberration, which affects the optical performance and imaging quality.
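For a rough sense of scale, the longitudinal chromatic aberration of a thin singlet can be estimated from its Abbe number. The sketch below uses illustrative assumed values (a 20 mm focal length and V = 64, typical of a BK7-like crown glass), not figures from this chapter:

```python
def longitudinal_chromatic_shift_mm(focal_length_mm: float, abbe_number: float) -> float:
    """Thin-lens estimate of the focal-length difference between the blue
    F line (486 nm) and the red C line (656 nm): delta_f ~ f / V."""
    return focal_length_mm / abbe_number

# Example: 20 mm singlet, Abbe number 64 -> ~0.31 mm of focal shift
print(round(longitudinal_chromatic_shift_mm(20.0, 64.0), 2))
```

A higher Abbe number (lower dispersion) shrinks the shift, which is why achromatic designs pair glasses of different dispersion.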

Distortion Lens distortion is a general term for the inherent perspective distortion of optical lenses. There are three kinds of distortion: pincushion distortion, barrel distortion, and linear distortion. Pincushion distortion is caused by the lens "shrinking" the picture toward the center. Barrel distortion is a barrel-shaped expansion caused by the physical properties of the lens and the combined structure of the lens assembly. Linear distortion is defined as a change in amplitude or phase with no new frequencies added [31, 32, 33].

Stray light For an imaging optical system, any undesired light that propagates to the detector face can be defined as stray light. Due to the multiple optical elements and complex architecture used in an integrated display system, stray light may be caused by diffraction, unwanted reflection, and scattering. In a sense, stray light has a veto effect on optical systems. If stray light affects the imaging quality, all ray paths should be traced back from the receiver for troubleshooting. Stray light cannot be completely eliminated, only suppressed to a certain extent [34, 35]. As long as stray light is controlled within a range imperceptible to human eyes, or within some acceptable or permissible extent defined by users, the stray light suppression is regarded as complete.

Brightness and transmittance Brightness refers to the amount of light in the virtual image displayed by the optical system. Sufficient brightness allows the image to be seen clearly even in direct sunlight, and it is one of the major challenges faced by current AR headsets. Most existing AR glasses are bright enough only for indoor use and are almost unusable outdoors. To alleviate this problem, some AR headsets use optical designs such as the birdbath to block ambient light, or use tinted lenses to improve the relative brightness of the optical module, but the associated adverse effect is a reduction of the light transmittance of the optical module. Light transmittance refers to how much ambient light the human eye can receive through the optical element [36, 37, 38, 39]. The ideal light transmittance is 100%, which is still difficult for existing AR optical technology to achieve. Low light transmittance may be acceptable to consumers in some specific application scenarios, but it is generally unacceptable in many professional/industrial scenarios because light transmittance has a great impact on job safety.
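Because each element in the see-through path attenuates ambient light, the overall transmittance of a stack is simply the product of the individual transmittances (ignoring inter-surface reflections). A minimal sketch with assumed, illustrative element values:

```python
from functools import reduce

def stack_transmittance(*element_t: float) -> float:
    """See-through transmittance of optical elements in series:
    individual transmittances multiply."""
    return reduce(lambda a, b: a * b, element_t, 1.0)

# Example: tinted lens (70%), beam-splitter plate (50%), cover window (92%)
print(round(stack_transmittance(0.70, 0.50, 0.92), 3))  # 0.322 -> ~32% of ambient light
```

This multiplicative loss is why adding a tinted lens or beam splitter to boost relative image brightness directly cuts the see-through transmittance.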

Resolution and contrast Resolution refers to the number of pixels a display can cover, and the optimal display resolution should be close to or slightly beyond the limits of human vision. There is no official definition or measurement of contrast or contrast ratio, but to most people it is a perception: simply a display's ability to produce both bright and dark pixels. With low contrast, bright content and dark content will not be displayed correctly. In an optical see-through AR display system, dark or black color is hard to render, so highly transparent areas may appear dark under low contrast. In short, the brighter the AR display, the higher the requirement for contrast. For an AR display, the color perceived by human eyes is also related to the real background on which the virtual image is superimposed. As with contrast, pixels vary in color quality depending on where they are on the display [40, 41]. For example, the same pixel color may look different on the left and right sides of the display with distinct image patterns, as well as depending on the location of the user's pupils.

Vergence accommodation Human perception of a three-dimensional environment can be divided into psychological and physiological perception. Psychological perception relies on visual cues such as shadow, occlusion, lighting, affine transformation, texture, and prior knowledge. The physiological perception of stereoscopic vision mainly includes focus blur, motion parallax, and binocular parallax. There are two main causes of vertigo: (1) the conflict between binocular parallax and focus blur in visual perception; and (2) the conflict between motion perception and visual perception [6, 42, 43, 44, 45, 46].

Size and weight Size is one of the biggest challenges of the see-through near-eye display. A larger FOV and eye box always mean a bigger size and more weight at the same time. A larger size usually implies inconvenience in wearing and tends to block sight. Besides, there is a limit to the weight that the human ears, bridge of the nose, and top of the head can support.

Delay All virtual images are produced by the projectors; thus, the response time of the hardware is very important for the reaction of the human brain and eyes. An imaging delay of less than 5 milliseconds is generally considered sufficient for optical see-through systems; a longer delay may cause dizziness.
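The effect of delay can be quantified as angular misregistration: during a head rotation, a world-locked virtual object lags behind by the head's angular velocity times the latency. A small sketch (the head rate and latency are illustrative values, not from this chapter):

```python
def misregistration_deg(head_rate_deg_s: float, latency_ms: float) -> float:
    """Angular error of a world-locked virtual object caused by display
    latency while the head rotates at the given rate."""
    return head_rate_deg_s * latency_ms / 1000.0

# Example: a brisk 100 deg/s head turn with 5 ms of latency
print(misregistration_deg(100.0, 5.0))  # 0.5 (degrees of lag)
```

Half a degree is roughly the apparent size of the full moon, which gives a feel for why latencies much above 5 ms become noticeable.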


2. Optical imaging technology

The optical display system of see-through near-eye devices is usually composed of a micro-projector and imaging components. In general, the display systems of AR glasses currently on the market combine various micro-projectors with imaging components such as prisms, freeform surfaces, birdbath optics, optical waveguides, etc. The difference in optical combiners is the key to distinguishing the display systems.

2.1 Conventional optics

2.1.1 Off-axis optics/birdbath optics

Driven by the high demands of modern air combat, more HMD optical systems have adopted folded/reflective structures. Such a structure can meet the technical requirements of large eye relief, large exit pupil diameter, and large FOV. These systems come in two basic types: coaxial (rotationally symmetric geometry) and off-axis (rotationally asymmetric geometry). Compared with the coaxial system, the off-axis system can reach a wider FOV; therefore, most HMD optical systems adopt the off-axis design [47, 48]. An example of the off-axis structure is illustrated in Figure 2, which reaches a 40° × 30° FOV, a 15 mm exit pupil diameter, a 26.4 mm focal length, and 25 mm eye relief. However, an off-axis optical system usually contains many optical elements, which means a larger size and heavier weight, so it is not suitable for long-time use.

Another conventional technology is the birdbath structure. A polarized beam splitter (PBS), which can be a cube or a film, splits the light beam from the light source toward the human eyes. Many current products apply this technology, such as Google Glass and Lenovo headsets, as shown in Figure 3. Figure 3a shows how Google Glass works; it mainly includes a projector and a beam splitter (BS) prism. When the projected light travels to the BS prism, part of it is reflected to the human eyes and produces virtual images, without blocking the light from the real world. Figure 3b shows how the Lenovo headset works. The structure applies a plate beam splitter that allows light from a top-mounted, downward-facing display to pass through it; the light is then reflected by a spherical mirror on the other side of the BS toward the viewer's eyes. The birdbath design can achieve a larger FOV and larger eye relief; meanwhile, a larger size is needed compared with off-axis technology [49, 50, 51]. As shown in Figure 4, the design space is defined by the thickness as a function of the eye box size, and the window limits the minimum and maximum values of the eye box [52]. With birdbath technology, the size and weight become proportional to the eye box and FOV, which also constrains some other optical parameters.
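The proportionality between size, eye box, and FOV can be sketched geometrically: the last optical surface must be large enough that every pupil position inside the eye box still sees the full FOV. The following is a simplified first-order estimate with assumed values, not a design formula from reference [52]:

```python
import math

def required_aperture_mm(eye_box_mm: float, eye_relief_mm: float, fov_deg: float) -> float:
    """Minimum clear aperture so every pupil position in the eye box receives
    the full FOV: the eye box plus the FOV cone projected over the eye relief."""
    return eye_box_mm + 2 * eye_relief_mm * math.tan(math.radians(fov_deg / 2))

# Example: 10 mm eye box, 20 mm eye relief, 40 deg FOV -> ~24.6 mm aperture
print(round(required_aperture_mm(10.0, 20.0, 40.0), 1))
```

Enlarging either the eye box or the FOV grows the required aperture, and with it the optic's volume and weight, which is the trade-off the design-space plot captures.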

Figure 2.

Optical system of off-axis system [44].

Figure 3.

Example of the birdbath structure in the see-through near-eye display. (a) The schematic diagram of Google Glass [51]. (b) The schematic diagram of Lenovo headsets.

Figure 4.

Typical design space for specific interpupillary distance (IPD) coverage and ID requirements [52].

2.1.2 Freeform optics

Freeform optics is defined as any non-rotationally-symmetric surface or micro-array surface, which is very different from spherical and aspherical geometry. Freeform lenses can enable unique optical performance, such as a low f-number, large eye relief, and wide FOV. One possible form of freeform optics is the decentered use of a rotationally symmetric lens to accommodate off-axis fields. There are three main ways to describe freeform surfaces: NURBS, XY polynomials, and radial basis function representations. The use of freeform optics reduces the number of components in an optical system, resulting in a smaller, lighter, and more efficient system [53, 54, 55, 56]. The biggest advantage of freeform surfaces is that they can achieve very good imaging quality due to their special geometry and high degree of freedom, as shown in Figure 5. However, compared with traditional symmetric lenses, the optical design and manufacturing processes are more complex and demanding [57, 58, 59, 60].

Figure 5.

Layout of the see-through HMD by coupling the FFS prism lens [53].

2.2 Waveguide optics

Optical waveguide technology is a distinctive optical component developed to meet the demands of the see-through near-eye display. Light travels at different speeds in different substances; thus, when light passes from one substance to another, it refracts and reflects at the interface between the two, and the angle of the refracted light changes with the angle of the incident light. When the incident angle reaches or exceeds a critical angle, the incident light is reflected back without refraction, which is called total internal reflection (TIR). According to the principle of TIR in geometric optics, light undergoes TIR at the interface between the waveguide and the air, which creates the condition for the light to bounce inside the waveguide and propagate forward without exiting it, as shown in Figure 6 [61]. By virtue of the waveguide transmission mode, the display and imaging system can be moved to the top of the forehead or to the ear side of the observer, which greatly reduces the obstruction of the outside view by the optical system and makes the weight distribution more ergonomic, thus improving the wearing experience of the device [62, 63, 64]. Optical waveguides can be divided into two types, geometric waveguides and diffractive waveguides, mainly based on the structures that couple light into and out of the waveguide.
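The TIR condition follows directly from Snell's law: light stays trapped in the guide when its internal angle exceeds the critical angle theta_c = arcsin(n_outside / n_guide). A minimal sketch, with an assumed waveguide index of 1.7 (typical high-index glasses span roughly 1.5 to 2.0):

```python
import math

def critical_angle_deg(n_guide: float, n_outside: float = 1.0) -> float:
    """Critical angle for total internal reflection at the
    waveguide/outside interface, from Snell's law."""
    return math.degrees(math.asin(n_outside / n_guide))

# Example: glass waveguide (n = 1.7) in air -> TIR above ~36 degrees
print(round(critical_angle_deg(1.7), 1))
```

A higher-index substrate lowers the critical angle, leaving a wider band of internal angles for TIR, which is one reason high-index glass enables larger-FOV waveguides.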

Figure 6.

The layout of the waveguide [61].

2.2.1 Diffractive waveguide

Diffractive optical waveguides include the surface relief grating (SRG) waveguide, fabricated by lithography, and the volume holographic grating (VHG) waveguide, fabricated by holographic interference, as shown in Figure 7. For the diffractive waveguide, the light source may be a light-emitting diode (LED), organic light-emitting diode (OLED), liquid crystal on silicon (LCOS), or laser scanning display (LSD). The light from the source is collimated and directed into the waveguide. The in-coupling diffractive optical element (DOE) redirects the light so that it propagates within the waveguide by TIR. When the light reaches the out-coupler, the TIR condition is broken and the light is coupled out toward the human eyes. The virtual images are projected at infinity due to the parallel collimated light [65, 66, 67, 68, 69]. The waveguide substrate is transparent, so it projects the virtual image without blocking the view of the real world.

Figure 7.

The schematic of the diffractive waveguides [69]. (A) The schematic diagram of the SRG waveguide. (B) The schematic diagram of the VHG waveguide.

An SRG is an optical element with a periodic, wavelength-scale relief structure on the surface of the material, which can spatially modulate light. A VHG consists of the light-dark interference fringes formed by holographic exposure inside the material. Both produce a periodic change of the refractive index in the material, generally on the scale of visible wavelengths (450-700 nm), which allows effective manipulation of light for display [70, 71]. A single wavelength can be divided into several diffraction orders by the grating, and each diffraction order continues to propagate in a different direction, as shown in Figure 8. The diffraction angle of each order is determined by the incident angle of the light and the period of the grating. By optimizing the grating parameters, such as the refractive index of the material, grating shape, thickness, duty cycle, etc., the diffraction efficiency of a chosen order can be maximized, so that most of the light keeps propagating along that direction after diffraction [72, 73]. The diffraction modes include reflection diffraction and transmission diffraction, as shown in Figure 9. For the same grating period, the diffraction angles of different wavelengths also differ. Usually, three wavelengths are used for imaging: the red, green, and blue (RGB) components.
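The wavelength dependence of the diffraction angle can be computed from the grating equation. The sketch below evaluates first-order in-coupling at normal incidence into a waveguide medium; the 400 nm grating period and n = 1.7 index are assumed, illustrative values, not specifications from this chapter:

```python
import math

def in_coupled_angle_deg(wavelength_nm: float, period_nm: float, n_guide: float) -> float:
    """Grating equation at normal incidence, diffracting into a medium of
    index n_guide: n * sin(theta) = m * lambda / d, with order m = 1."""
    return math.degrees(math.asin(wavelength_nm / (n_guide * period_nm)))

# Hypothetical 400 nm period grating on an n = 1.7 waveguide: each color
# is diffracted to a different internal angle.
for wl in (450, 532, 635):  # blue, green, red wavelengths in nm
    print(wl, round(in_coupled_angle_deg(wl, 400.0, 1.7), 1))
```

The roughly 28-degree spread between blue and red in this example shows why a single grating cannot treat all three colors identically, which underlies the angular and color non-uniformity of diffractive waveguides.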

Figure 8.

Diffraction grating [71].

Figure 9.

Angular selectivity and spectral selectivity of transmission and reflection VHG [74].

However, the diffraction efficiency for the same color fluctuates with the incident angle, resulting in different proportions of red, green, and blue light across the FOV, which is called the rainbow effect. This is caused by the inherent physical characteristics of diffraction gratings, and the color uniformity problem can only be mitigated by design, not completely eliminated. Besides, the optical efficiency of the diffractive elements is very low, with nearly 85% of the projected light lost, so only a small part reaches the human eyes. As a result, an extra shading lens is usually needed for the observer to see the virtual images clearly, and the contrast of the whole system is also reduced.
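The practical consequence of such low optical efficiency is a steep brightness budget: the microdisplay must emit far more light than finally reaches the eye. A rough sketch with assumed numbers (a 15% efficient combiner and a 500-nit target at the eye; both are illustrative, not measured values):

```python
def required_display_nits(target_eye_nits: float, optical_efficiency: float) -> float:
    """Luminance the microdisplay must emit so the eye receives the
    target luminance after losses in the combiner."""
    return target_eye_nits / optical_efficiency

# Example: 500 nits at the eye through a 15%-efficient diffractive combiner
print(round(required_display_nits(500.0, 0.15)))  # ~3333 nits from the display
```

Outdoor-readable targets of several thousand nits at the eye would push the display requirement toward tens of thousands of nits, which is why combiner efficiency is such a central constraint.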

2.2.2 Geometrical waveguide

In a geometrical waveguide, reflective coating films serve as the in/out-couplers for the propagating light. The light is totally internally reflected by the in-coupler and bounces within the waveguide. When the light hits an out-coupler, part of it is reflected out of the waveguide toward the human eyes, while the rest transmits through it and propagates on to the next out-coupler, as shown in Figure 10 [75]. The partially reflective mirror array (PRMA) expands the exit pupil and achieves uniform light for the virtual images.

Figure 10.

The schematic of the geometrical waveguide [75].

Since light is reflected out at each out-coupler, the light density arriving at each subsequent out-coupler differs; therefore, each out-coupler in the array needs a different reflection-to-transmission ratio (R/T) to ensure that the amount of light coupled out over the whole eye pupil range is uniform. The wavelength does not affect the imaging quality, but an incorrect R/T ratio would result in light and dark fringes. The PRMA replicates the pupil to increase the total exit pupil area; however, it reduces the amount of light at each exit pupil position, so the efficiency of the geometrical waveguide is lower than that of conventional optical systems. Compared with the diffractive waveguide, manufacturing of the geometrical waveguide is more complicated, as shown in Figure 11. It imposes rigorous requirements on each step of the process chain for high manufacturing precision, usually at the micro- and nanoscale.
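The R/T schedule for uniform out-coupling can be derived with a simple recurrence: if N mirrors should each emit an equal share of the in-coupled light, the k-th mirror (counting from the in-coupler) needs reflectivity 1/(N - k + 1). A minimal sketch of this reasoning, as a lossless idealization rather than a real coating design:

```python
def uniform_prma_reflectivities(num_mirrors: int) -> list[float]:
    """Reflectivity each partial mirror needs so that all out-couplers emit
    the same fraction (1/N) of the in-coupled light; the last reflects all."""
    return [1.0 / (num_mirrors - k) for k in range(num_mirrors)]

# Example: 4 mirrors need reflectivities of 25%, 33%, 50%, and 100%
print([round(r, 2) for r in uniform_prma_reflectivities(4)])  # [0.25, 0.33, 0.5, 1.0]
```

Checking the schedule: the first mirror reflects 25% of the light; the second reflects one third of the remaining 75%, again 25%; and so on, so every exit pupil position receives an equal share.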

Figure 11.

The manufacturing process of the geometrical waveguide.

Waveguide technologies have distinctive advantages in thickness and FOV, which has rapidly made them a mainstream solution for AR displays. Table 1 compares the diffractive waveguide and the geometrical waveguide.

Waveguide category | Diffractive waveguide (SRG) | Diffractive waveguide (VHG) | Geometrical waveguide
Companies | Microsoft, Magic Leap, Waveoptics, etc. | Digilens, Sony, etc. | Lumus, Lochn, Lingxi, etc.
In/out-coupler | SRG | VHG | Reflective film
FOV | 60° | 50° | 60°
Eye box | 12 mm × 11 mm | 12 mm × 10 mm | 12 mm × 10 mm
Thickness | 2.65 mm | < 2.5 mm | 1.5 mm
Eye relief | 18 mm | 19 mm | 16 mm
Manufacturing | Lithography, nanoimprint | Holographic interference | Coating, stacking, slicing, polishing, shaping

Table 1.

Comparison between diffractive waveguide and geometrical waveguide.

2.3 Light field technology

2.3.1 Pinlight technology

The pinlight display is a novel design that offers a wide FOV while supporting a compact form factor. Maimone et al. demonstrated its feasibility through software simulation and a prototype display [76]. The defocused point light sources are directly encoded, and the spatial light modulator (SLM) between the pinlights and the eye modulates every pixel of the light, as shown in Figure 12. The encoded light is projected into the human eye to create virtual images by the pinlight plane and the SLM. An array of small hexagonal pits captures totally internally reflected light and creates precise spots of light, and the viewer observes the integrated virtual images formed by the coded series of light points [76, 77]. However, the non-uniformity of spot shape and intensity remains complex and difficult to solve, and the eye box and eye relief are also limited.

Figure 12.

Schematic of pinlight technology [76].

2.3.2 Pinhole technology

Pinhole technology has been applied by LetinAR. The vertically polarized light from the display is reflected by the beam splitter toward the human eyes. The polarized light can enter the eye only after passing through a pinhole, and most of the light is blocked by the flat plate; thus, the light efficiency is decreased by the imaging principle itself. Pinhole technology can achieve a small thickness and a highly transparent element [78, 79]. The eye box is related to the number of pinholes, and the virtual images can only be observed within a very limited eye relief range (Figure 13).

Figure 13.

Schematic of pinhole technology. (a) Red lines are light from the projector; most rays are blocked by the polarizing plate, and only the light passing through the pinhole can reach the human eye. (b) Blue lines are light from the real world [79].

2.4 Comparison

Among the above technologies, FOV, eye relief, eye box, thickness, distortion, size, weight, and transmittance are the important optical parameters for evaluating imaging performance. There is a trade-off among these correlated parameters; thus, achieving perfect performance in all of them simultaneously is impossible. Different applications may focus on different key optical indicators based on their specific expectations and may sacrifice some secondary optical parameters. The difficulty of manufacturing determines whether a system can be mass-produced. Table 2 compares the above imaging technologies [80, 81, 82].

Optical category | Conventional optics | Waveguide optics | Light field optics
FOV | Small | Large | Medium
Eye relief | Large | Large | Small
Eye box | Small | Large | Small
Thickness | Thick | Thin | Thin
Distortion | More | Less | Less
Size and weight | Heavy | Light | Light
Transmittance | Low | High | High
Manufacturing | Easy | Harder | Harder
Company | Meta, Epson, Google Glass | Lumus, Lingxi, Microsoft | Creal

Table 2.

Comparison of the above imaging technologies.


3. Projecting technology

With the development of augmented reality graphics, the optical performance of the projector also affects the imaging quality of the virtual images. Several major parameters determine the performance of the projector screen: pixel count, resolution, pixels per inch (PPI), and contrast. The pictures on the projector screen are made of millions of pixels, each composed of three color sub-pixels: red, green, and blue (RGB). The variety of colorful pictures is produced by adjusting the RGB mixture. Resolution is the total number of pixels. PPI is the number of pixels per inch on the screen; the higher the PPI, the smaller the pixel size and the sharper the image. Screen contrast refers to the ratio between the brightness of white and black; the higher the contrast, the brighter and more colorful the images.
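The PPI relation above can be written directly: PPI is the pixel-count diagonal divided by the physical diagonal in inches. A small sketch with assumed microdisplay specs (the 1920 × 1080, 0.7-inch panel is hypothetical):

```python
import math

def pixels_per_inch(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """PPI from the pixel resolution and the physical diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Example: a hypothetical 1920x1080 panel on a 0.7-inch diagonal
print(round(pixels_per_inch(1920, 1080, 0.7)))  # ~3147 PPI
```

Microdisplays for near-eye use need PPI values in the thousands because the optics magnify each pixel across the FOV, whereas desktop monitors sit around one hundred PPI.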

3.1 LCD

A liquid-crystal display (LCD) is a flat-panel display technology commonly applied in televisions, computers, etc. The first mass-produced LCD panel technology was twisted nematic (TN). As shown in Figure 14, when there is no electric field across the liquid crystal module, the molecules in the LCD cell twist by 90 degrees. When light from the environment and the backlight passes through the first polarizer, it is polarized and then rotated by the liquid crystal molecular layer, so it is blocked when it reaches the second polarizer, and the viewer observes a black screen. When an electric field is applied to the liquid crystal molecules, they untwist; the polarized light then passes through the liquid crystal layer without rotation, transmits through the second polarizer, and the observed screen is bright. Because an electric field rather than a current drives the cell, the technology consumes little power. By controlling the voltage, the deflection angle of the liquid crystal layer can be adjusted and the brightness of each RGB sub-pixel controlled individually; by changing the brightness ratio, all colors can be realized by mixing RGB in different proportions. Because the liquid crystal molecules cannot block light completely, an LCD cannot show pure black. LCD technology has the great advantages of being light and thin with low power consumption [83, 84, 85]. However, its drawbacks include slow response time (especially at low temperatures), limited viewing angles, and the need for a backlight.

Figure 14.

Working principle of LCD [83].

3.2 OLED

A single organic light-emitting diode (OLED) forms a pixel on the screen; thus, millions of OLED dots work together for the display. As shown in Figure 15, there is an organic light-emitting layer between two electrodes. When electrons and holes recombine in the organic material, light is emitted. Each pixel of an OLED is composed of three sub-pixels, RGB, which light up when powered. The brightness of each sub-pixel can be controlled by adjusting the voltage [86, 87, 88], and the three colors can be mixed in different proportions to show the desired color. Since OLED pixels emit light themselves, no backlight is needed, and each pixel can be controlled independently, enabling pixel-level light control. Compared with the LCD, the OLED can display pure black, with no light leakage and excellent contrast. Besides, OLED screens have a very short response time when switching between colors, with almost no smearing. OLED screens are much thinner than LCD screens and can be bent considerably. They consume little power because each pixel is switched on and off independently rather than all pixels working together. However, the lifespan of OLED is generally shorter than that of LCD because of aging and burn-in problems due to the self-emitting mechanism. Moreover, there is obvious strobing at low brightness, resulting in visual fatigue from pulse-width modulation (PWM) dimming [89, 90]. Commercially, Epson uses its own OLED for AR glasses, and Sony's OLED is applied in Nreal glasses [91].

Figure 15.

Working principle of OLED [81].

3.3 QLED

The quantum-dot light-emitting diode (QLED) works on the principle of quantum dots: quantum dots are placed on the flat surface of a display, and a control circuit drives them to display the pictures, as shown in Figure 16. Quantum dots (QDs) have excellent optical properties, including continuously tunable emission peaks across the whole spectrum, high color purity, and good stability, making them excellent luminescent and photoelectric materials. The QLED display builds on these special properties of QDs to achieve high performance at low cost. Compared with OLED, QLED can provide higher brightness and a wider spectrum, with flexible sizes and a longer lifetime [92, 93, 94]. However, OLED retains advantages in the absence of light leakage, shorter response time, and wider viewing angles. Besides, OLEDs provide a purer black, better contrast, a lighter and thinner form factor, lower power consumption, and better performance at night.

Figure 16.

Working principle of QLED [92].

3.4 LCOS

Liquid Crystal on Silicon (LCOS) is a reflective display technology combining LCD and CMOS integrated circuits. The birefringence of the liquid crystal molecules is exploited for light control: the circuit switches the applied voltage to rotate the liquid crystal molecules, which modulates the polarization of the incident light. As shown in Figure 17, when the voltage applied to a pixel of the LC layer is zero, the light does not enter the projection light path and there is no light output, that is, the pixel presents a dark state. When a voltage is applied to the pixel, it enters the bright state, so the image is displayed on the screen. The voltage applied across the pixel affects the optical behavior of the liquid crystal molecules and thus determines the gray scale of the pixel. Its advantages are that the technology is mature and cheap, its pixel density is relatively high, and its overall energy efficiency is also relatively high [95, 96]. Its disadvantages mainly lie in its relatively low contrast, especially at large incident angles; it must be used in conjunction with a polarizing beam splitter (PBS), which limits the miniaturization of the overall optical system; and it cannot work at low temperatures [95, 97, 98]. Mini Glass and Magic Leap use LCOS in their AR glasses.
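The voltage-to-gray-scale relationship described above can be sketched with a deliberately simplified model. The assumptions here are not from the chapter: rotation is taken as linear in voltage up to 90°, the full-on voltage is an arbitrary 5 V, and transmission through the PBS follows a Malus-style sin² law.

```python
# Hedged sketch (assumed simplified model) of LCOS grey-scale control:
# the pixel voltage rotates the liquid crystal molecules, changing the
# polarisation of the reflected light; the PBS passes only one
# polarisation, so the rotation angle sets the pixel brightness.
import math

V_MAX = 5.0   # assumed full-on drive voltage (illustrative)

def grey_level(voltage: float) -> float:
    """Relative pixel brightness (0 = dark state, 1 = bright state)."""
    v = min(max(voltage, 0.0), V_MAX)
    rotation = (v / V_MAX) * (math.pi / 2)   # 0 .. 90 degrees
    return math.sin(rotation) ** 2           # fraction passed by the PBS

print(grey_level(0.0))             # dark state: no voltage, no output
print(grey_level(5.0))             # bright state: full rotation
print(round(grey_level(2.5), 3))   # intermediate grey
```

Real LCOS response curves are nonlinear and temperature-dependent (one reason the chapter notes poor low-temperature operation), but the sketch captures the chain described in the text: voltage, molecular rotation, polarization, gray scale.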

Figure 17.

Working principle of LCOS [95].

3.5 Comparison

Table 3 shows a performance comparison among existing projector technologies. However, the state of the art cannot yet achieve the ideal performance required for high imaging quality in the optical see-through near-eye display. Table 4 shows the performance gap between the ideal specifications and the current craft. There is still a long way to go.

|                               | LCD                        | OLED                           | QLED                              | LCOS                   |
|-------------------------------|----------------------------|--------------------------------|-----------------------------------|------------------------|
| Substrate                     | Glass                      | Glass/polymer                  | Glass                             | Glass                  |
| Material of luminescent layer | LED/CCFL                   | Organic light-emitting material | Quantum-dot luminescent materials | LED/CCFL               |
| Polarizer                     | Linear polarizer ×2        | Circular polarizer             | Linear polarizer ×2               | Linear polarizer ×2    |
| Backlight module              | Yes                        | No                             | Yes                               | Yes                    |
| Color filter                  | Yes                        | No                             | Yes                               | Yes                    |
| Light panel                   | Yes                        | No                             | Yes                               | Yes                    |
| Emitting principle            | Backlight (inactive)       | Organic layer (active)         | QD (inactive)                     | Backlight (inactive)   |
| Contrast                      | >10,000:1                  | >1,000,000:1                   | >1,000,000:1                      | >1500:1                |
| Angle of visibility           | <150°, chromatic aberration | ~180°, no aberration          | ~180°, no aberration              | ~180°, no aberration   |
| Power consumption             | High                       | Low                            | Low                               | High                   |
| Thickness                     | >1.2 mm                    | <1.5 mm                        | <1.5 mm                           | >1.2 mm                |
| Service life                  | Long                       | Short                          | Short                             | Long                   |
| Cost                          | Low                        | High                           | High                              | Low                    |
| Mobile phone screen           | Yes                        | Yes                            | Yes                               | No                     |
| Flat-panel screen             | Yes                        | Yes                            | Yes                               | Yes                    |
| Working temperature           | 20°C ~ 60°C                | −40°C ~ 80°C                   | 40°C ~ 80°C                       | None                   |
| Impact on environment         | Small                      | Big                            | Small                             | Small                  |
| TFTs needed per pixel         | 1                          | 2                              | 1                                 | 1                      |
| Toxic potentials              | No                         | Yes                            | No                                | No                     |
| Spatial color uniformity      | Great                      | Good                           | Great                             | Good                   |
| Light propagation             | Transmission               | Transmission                   | Transmission                      | Birefringence          |

Table 3.

Comparison among existing projector technologies.

| Key merits            | Ideal                                        | Status quo               |
|-----------------------|----------------------------------------------|--------------------------|
| Brightness            | 10⁵ ~ 10⁶ nits                               | 10⁴ nits                 |
| Contrast ratio (ANSI) | >300:1                                       | <100:1                   |
| Refresh rate          | >75 Hz                                       | 60 Hz                    |
| Resolution            | >60 pixel/DEG                                | 30 pixel/DEG             |
| Power consumption     | <50 mW                                       | 100 mW                   |
| Endurance             | −55°C ~ 100°C                                | −10°C ~ 60°C             |
| Form factor           | Driver integrated, panel no larger than screen | Panel about 2× screen area |
| Emitting angle        | Comply with exit pupil                       | Lambertian               |

Table 4.

Performance gap.
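The resolution row of Table 4 can be sanity-checked with a quick calculation: normal 20/20 visual acuity resolves roughly 1 arcminute, which corresponds to the table's ideal figure of 60 pixels per degree. The 50° field of view used below is an assumed example value, not a figure from this chapter.

```python
# Back-of-envelope check of Table 4's resolution gap. The 50-degree
# horizontal field of view is an assumed example; 60 pixel/DEG is the
# table's ideal (about 1 arcminute, matching 20/20 acuity) and
# 30 pixel/DEG is the table's status quo.

def pixels_needed(fov_deg: float, pixels_per_degree: float) -> int:
    """Pixels required across one axis to cover the field of view."""
    return round(fov_deg * pixels_per_degree)

FOV = 50.0  # assumed horizontal field of view, degrees

ideal = pixels_needed(FOV, 60)     # Table 4 ideal: >60 pixel/DEG
current = pixels_needed(FOV, 30)   # Table 4 status quo: 30 pixel/DEG
print(f"ideal: {ideal} px, current: {current} px across {FOV:.0f} deg")
```

Doubling the angular resolution doubles the pixel count per axis, i.e. quadruples the total pixel count, which compounds with the brightness and power gaps in the same table.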


4. Conclusions

In recent decades, the optical see-through near-eye display has undergone significant progress and facilitated various AR applications, especially in industrial practice. In this chapter, the display systems used by AR devices on the market, including various optical imaging systems and projecting systems, have been summarized. Among them, the difference in optical combinators is the key to distinguishing AR display systems. The advantages and disadvantages of each optical technology have been compared. Besides, the characteristics of the projectors have been introduced, and their influence on the imaging performance and quality of the images has been clarified. The gap to ideal performance is presented to guide the future development of relevant technologies.


Acknowledgments

This publication is partially supported by the Science Foundation Ireland under Grant Number 15/RP/B3208 and EIT Manufacturing 2022 KAVA 22125. For the purpose of open access, the author has applied a CC BY public copyright license to any Author Accepted Manuscript version arising from this submission. The acknowledgement also goes to the “111” Project by the State Administration of Foreign Experts Affairs and the Ministry of Education of China (No. B07014).

References

1. Berryman DR. Augmented reality: A review. Medical Reference Services Quarterly. 2012;31(2):212-218
2. Azuma RT. A survey of augmented reality. Presence: Teleoperators & Virtual Environments. 1997;6(4):355-385
3. Ong S, Yuan M, Nee A. Augmented reality applications in manufacturing: A survey. International Journal of Production Research. 2008;46(10):2707-2742
4. Zabels R, Osmanis K, Narels M, Gertners U, Ozols A, Rutenbergs K, et al. AR displays: Next-generation technologies to solve the vergence-accommodation conflict. Applied Science. 2019;9(15):3147
5. Hua H, Javidi B. Augmented reality: Easy on the eyes. Optics & Photonics News. 2015;26(2):26
6. Zhou Y, Zhang J, Fang F. Advances in the design of optical see-through displays. Advanced Optical Technologies. 2020;9:167-186
7. Rolland JP. Wide-angle, off-axis, see-through head-mounted display. Optical Engineering. 2000;39(7):1760-1767
8. Chi S, Yin Y, Yaoyuneyong G, Johnson E. Augmented reality: An overview and five directions for AR in education. Journal of Educational Technology Development & Exchange. 2011;119(4):119-140
9. Yuen SC-Y, Yaoyuneyong G, Johnson E. Augmented reality: An overview and five directions for AR in education. Journal of Educational Technology Development and Exchange (JETDE). 2011;4(1):11
10. Radu I. Augmented reality in education: A meta-review and cross-media analysis. Personal & Ubiquitous Computing. 2014;18(6):1533-1543
11. Lee K. Augmented reality in education and training. TechTrends. 2012;56(2):13-21
12. Moro C, Štromberga Z, Raikos A, Stirling A. The effectiveness of virtual and augmented reality in health sciences and medical anatomy. Anatomical Sciences Education. 2017;10(6):549-559
13. Tang S-L, Kwoh C-K, Teo M-Y, Sing NW, Ling K-V. Augmented reality systems for medical applications. IEEE Engineering in Medicine and Biology Magazine. 1998;17(3):49-58
14. Sielhorst T, Feuerstein M, Navab N. Advanced medical displays: A literature review of augmented reality. Journal of Display Technology. 2008;4(4):451-467
15. Kamphuis C, Barsom E, Schijven M, Christoph N. Augmented reality in medical education? Perspectives on Medical Education. 2014;3(4):300-311
16. Han DI, Jung T, Gibson A. Dublin AR: Implementing augmented reality in tourism. In: Information and Communication Technologies in Tourism. Cham: Springer; 2014. pp. 511-523
17. Kysela JÍ, Torková P. Using augmented reality as a medium for teaching history and tourism. Procedia – Social and Behavioral Sciences. 2015;174:926-931
18. Danielsson O, Holm M, Syberfeldt A. Augmented reality smart glasses in industrial assembly: Current status and future challenges. Journal of Industrial Information Integration. 2020;20:100175
19. Livingston MA, Rosenblum LJ, Julier SJ, Brown D, Baillot Y, Swan II, et al. An augmented reality system for military operations in urban terrain. In: Proceedings of the Interservice/Industry Training, Simulation, & Education Conference (I/ITSEC '02), Orlando, FL, December 2-5, 2002
20. Livingston MA, Rosenblum LJ, Brown DG, Schmidt GS, Julier SJ, Baillot Y, et al. Military applications of augmented reality. Handbook of Augmented Reality. 2011:671-706
21. Mejías Borrero A, Andújar MJ. A pilot study of the effectiveness of augmented reality to enhance the use of remote labs in electrical engineering education. Journal of Science Education and Technology. 2012;21(5):540-557
22. Piekarski W, Thomas B. ARQuake: The outdoor augmented reality gaming system. Communications of the ACM. 2002;45(1):36-38
23. Stapleton C, Hughes C, Moshell M, Micikevicius P, Altman M. Applying mixed reality to entertainment. Computer. 2002;35(12):122-124
24. Barakonyi I, Schmalstieg D. Augmented reality agents in the development pipeline of computer entertainment. In: International Conference on Entertainment Computing. Berlin, Heidelberg: Springer; 2005. pp. 345-356
25. Ren D, Goldschwendt T, Chang Y, Höllerer T. Evaluating wide-field-of-view augmented reality with mixed reality simulation. In: 2016 IEEE Virtual Reality (VR). IEEE; 2016. pp. 93-102
26. Lanman D, Luebke D. Near-eye light field displays. ACM Transactions on Graphics (TOG). 2013;32(6):1-10
27. Kim S-B, Park J-H. Optical see-through Maxwellian near-to-eye display with an enlarged eyebox. Optics Letters. 2018;43(4):767-770
28. Thibos L, Bradley A, Still D, Zhang X, Howarth P. Theory and measurement of ocular chromatic aberration. Vision Research. 1990;30(1):33-49
29. Campbell FW, Gubisch RW. The effect of chromatic aberration on visual acuity. The Journal of Physiology. 1967;192(2):345-358
30. Seidemann A, Schaeffel F. Effects of longitudinal chromatic aberration on accommodation and emmetropization. Vision Research. 2002;42(21):2409-2417
31. Teo PC, Heeger DJ. Perceptual image distortion. In: Proceedings of 1st International Conference on Image Processing. Vol. 2. IEEE; 1994. pp. 982-986
32. Schacter DL, Guerin SA, Jacques PLS. Memory distortion: An adaptive perspective. Trends in Cognitive Sciences. 2011;15(10):467-474
33. Nicholls JI. The measurement of distortion: Theoretical considerations. The Journal of Prosthetic Dentistry. 1977;37(5):578-586
34. Gu L, Cheng D, Wang Q, Hou Q, Wang Y. Design of a two-dimensional stray-light-free geometrical waveguide head-up display. Applied Optics. 2018;57(31):9246-9256
35. Zhou Y, Zhang J, Fang F. Stray light analysis and design optimization of geometrical waveguide. Advanced Optical Technologies. 2021;10(1):71-79
36. Stevens JC, Stevens SS. Brightness function: Effects of adaptation. JOSA. 1963;53(3):375-385
37. Lotto RB, Purves D. The effects of color on brightness. Nature Neuroscience. 1999;2(11):1010-1014
38. Arend LE, Spehar B. Lightness, brightness, and brightness contrast: 1. Illuminance variation. Perception & Psychophysics. 1993;54(4):446-456
39. Starck J-L, Murtagh F, Candès EJ, Donoho DL. Gray and color image contrast enhancement by the curvelet transform. IEEE Transactions on Image Processing. 2003;12(6):706-717
40. Hendrick RE, Mark HE. Basic physics of MR contrast agents and maximization of image contrast. Journal of Magnetic Resonance Imaging. 1993;3(1):137-148
41. Ji T-L, Sundareshan MK, Roehrig H. Adaptive image contrast enhancement based on human visual properties. IEEE Transactions on Medical Imaging. 1994;13(4):573-586
42. Zhou Y, Zhang J, Fang F. Vergence-accommodation conflict in optical see-through display: Review and prospect. Results in Optics. 2021;5:100160
43. Zhou L, Chen CP, Wu Y, Zhang Z, Wang K, Yu B, et al. See-through near-eye displays enabling vision correction. Optics Express. 2017;25(3):2130-2142
44. Lee S, Cho J, Lee B, Jo Y, Jang C, Kim D, et al. Foveated retinal optimization for see-through near-eye multi-layer displays. IEEE Access. 2017;6:2170-2180
45. Park J-H, Kim S-B. Optical see-through holographic near-eye-display with eyebox steering and depth of field control. Optics Express. 2018;26(21):27076-27088
46. Fontana U, Cutolo F, Cattari N, Ferrari V. Closed-loop calibration for optical see-through near eye display with infinity focus. In: 2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct). IEEE; 2018. pp. 51-56
47. Akşit K, Lopes W, Kim J, Shirley P, Luebke D. Near-eye varifocal augmented reality display using see-through screens. ACM Transactions on Graphics (TOG). 2017;36(6):1-13
48. Zheng Z, Liu X, Li H, Xu L. Design and fabrication of an off-axis see-through head-mounted display with an x–y polynomial surface. Applied Optics. 2010;49(19):3661-3668
49. Muensterer OJ, Lacher M, Zoeller C, Bronstein M, Kübler J. Google glass in pediatric surgery: An exploratory study. International Journal of Surgery. 2014;12(4):281-289
50. Knight HM, Gajendragadkar PR, Bokhari A. Wearable technology: Using Google glass as a teaching tool. Case Reports. 2015;2015:bcr2014208768
51. Deshpande S, Uplenchwar G, Chaudhari D. Google glass. International Journal of Scientific & Engineering Research. 2013;4(12):1-4
52. Kress B. Optical design of next-gen augmented reality: Ahead of chairing the AR/VR/MR industry days at SPIE photonics west, Microsoft's Bernard Kress considers the optimal way of designing optics for augmented and mixed reality headsets. Electro Optics. 2019;290:12-17
53. Wei L, Li Y, Jing J, Feng L, Zhou J. Design and fabrication of a compact off-axis see-through head-mounted display using a freeform surface. Optics Express. 2018;26(7):8550-8565
54. Fang F, Zhang X, Weckenmann A, Zhang G, Evans C. Manufacturing and measurement of freeform optics. CIRP Annals. 2013;62(2):823-846
55. Cheng D, Wang Y, Xu C, Song W, Jin G. Design of an ultra-thin near-eye display with geometrical waveguide and freeform optics. Optics Express. 2014;22(17):20705-20719
56. Wilson A, Hua H. Design and demonstration of a vari-focal optical see-through head-mounted display using freeform Alvarez lenses. Optics Express. 2019;27(11):15627-15637
57. Wang Y, Cheng D, Xu C. Freeform optics for virtual and augmented reality. In: International Optical Design Conference. Optica Publishing Group; 2017. p. JTu3A-1
58. Fuge M, Yumer ME, Orbay G, Kara LB. Conceptual design and modification of freeform surfaces using dual shape representations in augmented reality environments. Computer-Aided Design. 2012;44(10):1020-1032
59. Nikolov DK, Bauer A, Cheng F, Kato H, Vamivakas AN, Rolland JP. Metaform optics: Bridging nanophotonics and freeform optics. Science Advances. 2021;7(18):eabe5112
60. Huang H, Hua H. High-performance integral-imaging-based light field augmented reality display using freeform optics. Optics Express. 2018;26(13):17578-17590
61. Zhou Y, Zhang J, Fang F. Design of a large field-of-view two-dimensional geometrical waveguide. Results in Optics. 2021;5:100147
62. Lee Y-H, Yin K, Wu S-T. Reflective polarization volume gratings for high efficiency waveguide-coupling augmented reality displays. Optics Express. 2017;25(22):27008-27014
63. Erdenebat M-U, Lim Y-T, Kwon K-C, Darkhanbaatar N, Kim N. Waveguide-type head-mounted display system for AR application. State of the art Virtual Reality and Augmented Reality Knowhow. 2018;2018:41
64. Yoshida T, Tokuyama K, Takai Y, Tsukuda D, Kaneko T, Suzuki N, et al. A plastic holographic waveguide combiner for light-weight and highly-transparent augmented reality glasses. Journal of the Society for Information Display. 2018;26(5):280-286
65. Waldern JD, Grant AJ, Popovich MM. DigiLens switchable Bragg grating waveguide optics for augmented reality applications. In: Digital Optics for Immersive Displays. Vol. 10676. SPIE; 2018. pp. 108-123
66. Shin DH, Tibuleac S, Maldonado TA, Magnusson R. Thin-film optical filters with diffractive elements and waveguides. Optical Engineering. 1998;37
67. Roelkens G, Vermeulen D, Van Thourhout D, Baets R, Brision S, Lyan P, et al. High efficiency diffractive grating couplers for interfacing a single mode optical fiber with a nanophotonic silicon-on-insulator waveguide circuit. Applied Physics Letters. 2008;92(13):131101
68. Chen CP, Mi L, Zhang W, Ye J, Li G. Waveguide-based near-eye display with dual-channel exit pupil expander. Displays. 2021;67:101998
69. Akshaykore. Fundamentals of display technologies for Augmented and Virtual Reality. 2018. Available from: https://hackernoon.com/fundamentals-of-display-technologies-for-augmented-and-virtual-reality-c88e4b9b0895
70. Zhang Y, Fang F. Development of planar diffractive waveguides in optical see-through head-mounted displays. Precision Engineering. 2019;60
71. Palmer C, Loewen EG. Diffraction Grating Handbook. Newport Corporation; 2005
72. Palmer E, Hutley M, Franks A, Verrill J, Gale B. Diffraction gratings (manufacture). Reports on Progress in Physics. 1975;38(8):975
73. Botten I, Craig M, McPhedran R, Adams J, Andrewartha J. The dielectric lamellar diffraction grating. Optica Acta: International Journal of Optics. 1981;28(3):413-428
74. Kress BC, Meyrueis P. Applied Digital Optics: From Micro-Optics to Nanophotonics. John Wiley & Sons; 2009
75. Zhou Y, Zhang J, Fang F. Design of a dual-focal geometrical waveguide near-eye see-through display. Optics & Laser Technology. 2022;156:108546
76. Maimone A, Lanman D, Rathinavel K, Keller K, Luebke D, Fuchs H. Pinlight displays: Wide field of view augmented reality eyeglasses using defocused point light sources. In: ACM SIGGRAPH 2014 Emerging Technologies. 2014. p. 1
77. Kramida G. Resolving the vergence-accommodation conflict in head-mounted displays. IEEE Transactions on Visualization and Computer Graphics. 2015;22(7):1912-1931
78. Li Q, Deng H, Pang S, Jiang W, Wang Q. A reflective augmented reality integral imaging 3D display by using a mirror-based pinhole array. Applied Sciences. 2019;9(15):3124
79. Katsumata Y, Yamada W, Manabe H. Optical see-through head-mounted display with deep depth of field using pinhole polarizing plates. In: The Adjunct Publication of the 32nd Annual ACM Symposium on User Interface Software and Technology. 2019. pp. 102-104
80. Lee YH, Tan G, Yin K, Zhan T, Wu ST. Compact see-through near-eye display with depth adaption. Journal of the Society for Information Display. 2018;26(2):64-70
81. Yang J, Twardowski P, Gérard P, Fontaine J. Design of a large field-of-view see-through near to eye display with two geometrical waveguides. Optics Letters. 2016;41(23):5426-5429
82. Hu X, Baena FRY, Cutolo F. Alignment-free offline calibration of commercial optical see-through head-mounted displays with simplified procedures. IEEE Access. 2020;8:223661-223674
83. Chen H-W, Lee J-H, Lin B-Y, Chen S, Wu S-T. Liquid crystal display and organic light-emitting diode display: Present status and future perspectives. Light: Science & Applications. 2018;7(3):17168
84. Berreman D, Heffner W. New bistable cholesteric liquid-crystal display. Applied Physics Letters. 1980;37(1):109-111
85. Hasebe H, Takatsu H, Iimura Y, Kobayashi S. Effect of polymer network made of liquid crystalline diacrylate on characteristics of liquid crystal display device. Japanese Journal of Applied Physics. 1994;33(11R):6245
86. Geffroy B, Le Roy P, Prat C. Organic light-emitting diode (OLED) technology: Materials, devices and display technologies. Polymer International. 2006;55(6):572-582
87. Gather MC, Köhnen A, Meerholz K. White organic light-emitting diodes. Advanced Materials. 2011;23(2):233-248
88. Kay-khosa. OLED Display and It's Types. 2017. Available from: https://steemit.com/steemstem/@kay-khosa/oled-display-and-it-s-types
89. Tyan Y-S. Organic light-emitting-diode lighting overview. Journal of Photonics for Energy. 2011;1(1):011009
90. Kalinowski J. Organic Light-Emitting Diodes: Principles, Characteristics & Processes. CRC Press; 2018
91. Xu C. Nreal: Ready-to-wear mixed reality glasses. In: SPIE AVR21 Industry Talks II. Vol. 11764. SPIE; 2021. p. 1176409
92. Moynihan T. What are quantum dots, and why do I want them in my TV. WIRED.com. 2015
93. Jang HJ, Lee JY, Kwak J, Lee D, Park J-H, Lee B, et al. Progress of display performances: AR, VR, QLED, OLED, and TFT. Journal of Information Display. 2019;20(1):1-8
94. He L, Fei M, Chen J, Tian Y, Jiang Y, Huang Y, et al. Graphitic C3N4 quantum dots for next-generation QLED displays. Materials Today. 2019;22:76-84
95. Zhang Z, You Z, Chu D. Fundamentals of phase-only liquid crystal on silicon (LCOS) devices. Light: Science & Applications. 2014;3:e213
96. Lazarev G, Hermerschmidt A, Krüger S, Osten S. LCOS spatial light modulators: Trends and applications. Optical Imaging and Metrology: Advanced Technologies. 2012. p. 1
97. Jülch V. Comparison of electricity storage options using levelized cost of storage (LCOS) method. Applied Energy. 2016;183:1594-1606
98. Vettese D. Liquid crystal on silicon. Nature Photonics. 2010;4(11):752-754
