Open access peer-reviewed chapter

Novel Floating and Auto-stereoscopic Display with IR LED Sensors Interactive Virtual Touch System

Written By

Jian-Chiun Liou

Submitted: 04 November 2014 Reviewed: 24 June 2015 Published: 07 October 2015

DOI: 10.5772/61113

From the Edited Volume

Optoelectronics - Materials and Devices

Edited by Sergei L. Pyshkin and John Ballato

Abstract

A wide range of interactive virtual touch systems is under research and development. The trend is toward systems in which users need no special equipment and can interact with images directly, which makes everyday interaction far more convenient. We have been studying video-based interactive systems in which a virtual image appears to exist in the real world like a physical object. We developed a floating display and, based on an interactive video system, the principles needed to present more realistic auto-stereoscopic images.

Keywords

  • Virtual
  • Interactive
  • Floating image
  • Auto-stereoscopic

1. Introduction

This study presents an interface that requires neither keyboard input nor a high-precision interactive device. The goal is to project the image of a real object through an optical lens system so that a floating image forms more than 5 cm outside the system. About 50 cm in front of the viewer, a pop-up image can be observed within a viewing angle of 15°. Viewers operate the system through non-contact touch sensed by IR LED arrays, as shown in Figure 1A.

Conventional contact systems with touch-sensitive keys can become contaminated, and germs can potentially spread and multiply through repeated contact. Touch-sensing and multi-touch technologies have nonetheless found their way into all kinds of everyday electrical products. Because of privacy requirements, researchers have also addressed personal information security needs with a variety of system architectures [1-6].

In the sections that follow, we focus on electronic equipment commonly operated in public places, such as ATM keypads, healthcare equipment, and hospital elevator buttons. With our system, the user never touches the device, since the function keys can be located without physical contact. We also explore the wider potential of such non-touch systems. Touch devices are now ubiquitous, and control systems that accept non-touch input may become equally useful in daily life.

In current non-contact control systems, infrared transmitter-receiver modules are arranged as a matrix. The infrared detector arrays sense interruption of the beams, and the interrupted beam positions are used to locate the object. When an object interrupts the beams at a given position, the change in receiver current indicates a touch at the corresponding location [7-9].
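
As a rough illustration of this beam-interrupt scheme, the sketch below locates a touch from the boolean readouts of the row and column receiver modules. The function and data layout are hypothetical, not the chapter's actual firmware.

```python
# Hypothetical sketch: locating a finger on an IR emitter/receiver grid.
# `row_blocked` / `col_blocked` are boolean readouts from the receiver
# modules (True = beam interrupted); names are illustrative only.

def locate_touch(row_blocked, col_blocked):
    """Return (row, col) indices of the interrupted beams, or None."""
    rows = [i for i, blocked in enumerate(row_blocked) if blocked]
    cols = [j for j, blocked in enumerate(col_blocked) if blocked]
    if not rows or not cols:
        return None  # no beam interrupted in one axis -> no touch
    # Take the centre of the interrupted span as the touch point.
    return (sum(rows) // len(rows), sum(cols) // len(cols))

# Example: a finger blocks row beam 2 and column beams 4 and 5.
print(locate_touch([False, False, True, False],
                   [False, False, False, False, True, True]))  # (2, 4)
```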

Figure 1.

(A) Floating image with interactive touch system. (B) Multi-view auto-stereoscopic system.

The LCD flat panel has become the world's leading display product. Accordingly, LCD-based designs dominate stereoscopic systems: an optical device directs a different image of the scene toward each viewing zone, giving the viewer binocular disparity, as in Figure 1B, and hence the final 3D effect.

Multi-view 3D displays have been studied for some time. They usually follow either the integral imaging method [10-12] or the parallax barrier method [13]. All of these methods display the 3D image on a flat screen, so the image cannot be superimposed in real space.

Our interactive imaging study instead pops a floating 3D image up into real space, which the user can see without glasses. Projection methods that superimpose an image onto real space have been proposed [14-16]. With these methods, the image is projected onto a non-planar surface that the user can actually touch, so the user experiences interaction with real objects on the projection surface. However, such 2D projections cannot produce multi-view parallax images. In this study, our goal is a real-image, mid-air, interactive virtual touch system using IR LED sensors.

2. Architecture

Since the virtual image is formed in mid-air, the user can indicate a three-dimensional position by touching the air at the virtual panel directly. Infrared LED illumination passes through a cylindrical lens mounted in front of the virtual image, as shown in Figure 2.

The virtual image is divided into a plurality of sub-regions, which together form the operator interface. A touch in a sub-region is detected by the corresponding transmitter-receiver pair, and the operation function assigned to that block is performed. A synchronous processing system is electrically connected to the corresponding transmitter-receiver pairs; it enables the sensing devices arranged along the object's direction with a delay, and synchronizes the control settings for each unit.

To prevent mishandling, in which the same key is triggered continuously, the operator is asked to view the image on the virtual panel from within the correct viewing area. Outside that region, observers perceive only a partial or blind view of the image. We use a delay function and synchronization to control the virtual panel of the system.
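
The delay-and-synchronization idea for suppressing repeated triggers of the same key can be sketched as a simple hold-off debouncer. The class, its parameters, and the timing values are illustrative assumptions, not the chapter's actual control logic.

```python
# Hypothetical debounce sketch for the virtual panel: a key fires once
# when its beam becomes blocked, and the same key cannot retrigger until
# a hold-off delay has elapsed.

class KeyDebouncer:
    def __init__(self, holdoff):
        self.holdoff = holdoff     # minimum time (s) between triggers
        self.last_fire = {}        # key -> time of last trigger

    def update(self, key, blocked, now):
        """Return True exactly when `key` should fire at time `now`."""
        if not blocked:
            return False
        if now - self.last_fire.get(key, float("-inf")) < self.holdoff:
            return False           # still inside the hold-off window
        self.last_fire[key] = now
        return True

db = KeyDebouncer(holdoff=0.5)
# Continuous blocking at t = 0.0, 0.1, 0.6 s fires only at 0.0 and 0.6:
print([db.update("5", True, t) for t in (0.0, 0.1, 0.6)])  # [True, False, True]
```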

Figure 2.

Virtual touch system with IR LED device array.

The novel floating and auto-stereoscopic display with IR LED sensor interactive virtual touch system includes two major parts. The first is the floating image and auto-stereoscopic display. The second is the design and fabrication of the IR LED sensor multiplexer (mux) system.

2.1. Floating image and auto-stereoscopic display

In recent years, the three-dimensional television market has grown quickly, and with it interest in other three-dimensional floating image systems. The stereo image is shown in Figure 3.

For this reason, we constructed a floating color image projection device. According to the principles of geometric optics, a set of lenses or a concave mirror can project a true 3D image of a 3D object. We therefore replace the real 3D object with a display, so that the projected image can be changed at will. However, this yields a floating two-dimensional image rather than a three-dimensional real image. We thus propose the free-space strategy below for a wide-angle auto-stereoscopic floating picture.

Figure 3.

The floating image products with a virtual image.

We propose a floating image display apparatus, shown in Figure 4, with two main parts: (1) an image appearance system and (2) an auto-stereoscopic display.

Figure 4.

Sensor array in the image appearance system.

An optical projection system of this type comprises a projection lens, illumination optics in front of the projection lens, and a light modulator such as a digital micromirror device (DMD). The light modulator reflects light in a controllable manner into different directions: in the ON state the light cone is projected into the projection lens, while in the OFF state the light cone is directed away from it. The optical projection system used in this research is shown in Figure 5.

Figure 5.

The projection lens and one light modulator.

The microlens is designed, as shown in Figure 6, so that the width of the region observed through the microlens at a given viewing zone is equal to or smaller than one sub-pixel of the LCD pixel array. A high-resolution image can therefore be expected when the viewer is at the center of a viewing area and the rays pass through the lenticular center.

Figure 6.

Designed microlens.

The parameters of the image-generating auto-stereoscopic display include the viewing distance and the viewing area. In general, the pitch of the viewing zones is matched to the eye separation, and this together with the viewing distance determines the design of the display. In this apparatus in particular, the stereoscopic image is viewed through the projection lens shown in Figures 7, 8, 9, and 10, which illustrate the details of the theory and methods.
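
The relation between viewing distance, lens-to-pixel gap, and viewing-zone pitch can be illustrated with a back-of-envelope similar-triangles model. The functions and all numeric parameters below are assumptions for illustration, not the chapter's actual design values.

```python
# Back-of-envelope viewing-zone geometry for a lenticular multi-view
# display, tracing rays through the lens centre (similar triangles).
# All numbers are illustrative assumptions.

def viewing_zone_pitch(subpixel_pitch_mm, gap_mm, distance_mm):
    """Width of one viewing zone at the design viewing distance."""
    return subpixel_pitch_mm * distance_mm / gap_mm

def lens_pitch(n_views, subpixel_pitch_mm, gap_mm, distance_mm):
    """Lens pitch is made slightly smaller than n_views sub-pixels so
    the zones from every lens converge at the same viewing distance."""
    return n_views * subpixel_pitch_mm * distance_mm / (distance_mm + gap_mm)

p_sub, gap, D = 0.1, 5.0, 500.0   # mm (assumed values)
print(viewing_zone_pitch(p_sub, gap, D))   # 0.1 * 500 / 5 = 10.0 mm
print(lens_pitch(12, p_sub, gap, D))       # 12 * 0.1 * 500 / 505 ≈ 1.188 mm
```

With these assumed numbers, each of the 12 views occupies a 10 mm wide zone at 50 cm, so the two eyes (about 65 mm apart) fall into clearly separated views.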

Figure 7.

Twelve lens array.

A floating real image is formed between the projection lens set and the user. A depth detecting module identifies the user's position, and a control unit is electrically connected to the image generator and the projection lens set. Together, the depth detecting module, the image generator, and the projection lens set can adjust the position of the floating real image according to the user's position.

Figure 8.

Top view of twelve lens array.

Figure 9.

Keypad module.

Regarding the keypad module, note that by utilizing the control unit and the depth detecting module, the display apparatus can provide user-friendly operation and realistic interaction, as shown in Figure 10. Specifically, the control unit steers the movement of the image generator according to the user position reported by the depth detecting module, adjusting the relative positions of the image generator and the projection lens set, and thereby the position and size of the floating real image. The depth detecting module detects the position of the user, and may also detect the position of the user's body or of the fingers touching the floating real image.

The depth detecting module feeds back the detected position of the user. From it, the control unit calculates the position of the user, the required position of the floating real image, and its required size. The image generator and/or the projection lens set are then shifted by corresponding distances to achieve the needed image variation.
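
As a minimal sketch of this adjustment, assume a simple linear imaging model in which the longitudinal magnification is the square of the lateral magnification. The function name, the magnification value, and the distances are all hypothetical.

```python
# Illustrative control-step sketch: the depth module reports the user's
# position and the controller shifts the image generator so the floating
# image tracks the user. Gains and names are hypothetical.

def required_generator_shift(user_z_mm, image_z_mm, magnification=2.0):
    """Axial shift of the image generator needed to move the floating
    real image from image_z_mm to the user's depth (linear model)."""
    # For small displacements, longitudinal magnification ~ m^2, so an
    # image displacement dz needs a generator displacement dz / m^2.
    dz_image = user_z_mm - image_z_mm
    return dz_image / (magnification ** 2)

# User stands 60 mm beyond the nominal image plane at 50 mm:
print(required_generator_shift(110.0, 50.0))  # 60 / 4 = 15.0 mm
```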

Figure 10.

The stereoscopic images through the projected lenses.

2.2. Design and fabrication of IR LED sensors multiplexer (mux) system

We propose an auto-stereoscopic display and an interactive floating image device for a locker door, as shown in Figure 8. The pop-up appearance system has two main parts: (1) an image pop-up system and (2) a two-view auto-stereoscopic display. With these two parts, audiences can observe 3D images in the air, with disparity, over a wide viewing angle. The image floats in free space within reach of the viewer's hand, so touch and interactive technologies can operate in free space and enhance the system's capabilities. The new device has many applications, especially in products and information kiosks in public places.

The IR LED sensor system, shown in Figure 11, is a non-touch positioning device built from infrared (IR) LEDs and proximity sensors. The signals from a set of proximity sensors are used to determine the position of the hand by trilateration: the distances between the hand and the sensors are obtained from the proximity sensors. The positioning results were verified on mobile devices, so the system lets a user interact with a mobile device intuitively and in real time, without touching its panel or screen.
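
The trilateration step can be sketched as the standard linearised solution of the circle equations. The sensor coordinates and distance readings below are made-up test values, not measurements from the actual device.

```python
# Sketch of 2-D trilateration from three proximity-sensor distances:
# subtracting the first circle equation from the other two gives a
# 2x2 linear system for the hand position.
import math

def trilaterate(p1, p2, p3, d1, d2, d3):
    """Solve the linearised circle equations for the hand position."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Hand actually at (3, 4); distances to sensors at three corners:
d1 = math.dist((3, 4), (0, 0))
d2 = math.dist((3, 4), (10, 0))
d3 = math.dist((3, 4), (0, 10))
print(trilaterate((0, 0), (10, 0), (0, 10), d1, d2, d3))  # ≈ (3.0, 4.0)
```

With real proximity sensors the three distances are noisy, so in practice a least-squares fit over more than three sensors would be preferred; the closed form above shows the geometry only.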

Figure 11.

IR LED sensors system.

The multiplexer data control system built from shift registers provides the sequencing signal. The overall IR LED sensor architecture is shown in Figure 12. The floating image pops up in the space between the IR LED array and the sensor array.

Across the several mode designs, we have to suppress cross-talk noise between channels and limit energy consumption. In the first mode, we designed an "even group and odd group" circuit architecture, shown in Figure 13; its waveform is shown in Figure 14.

The second addressable design drives elements two apart in different time slots. The architecture outputs the cyclical control sequence 1st, 4th, 7th; 2nd, 5th, 8th; 3rd, 6th, 9th, as shown in Figure 15. The waveform is shown in Figure 16.

Figure 12.

IR LED Sensors multiplexer (mux) system.

Figure 13.

Even and odd group output in different time cycle.

Figure 14.

The waveform of even and odd elements.

Figure 15.

An interval of two elements on a different time driving.

Figure 16.

The waveform of interval of two elements.

Thirdly, elements three apart are driven in different time slots. The architecture outputs the cyclical control sequence 1st, 5th, 9th; 2nd, 6th, 10th; 3rd, 7th, 11th; 4th, 8th, 12th, as shown in Figure 17. The waveform is shown in Figure 18.
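
The three scan modes (even/odd grouping, interval of two, interval of three) all follow one pattern: fire every `step`-th element, then advance the starting phase by one. A minimal generator, assuming 1-based element numbering:

```python
# Generator for the interleaved scan orders described in the text.
# step=2 reproduces the even/odd grouping, step=3 the interval-of-two
# mode (1,4,7; 2,5,8; ...), step=4 the interval-of-three mode.

def interleaved_scan(n, step):
    """1-based firing order for n elements with the given stride."""
    order = []
    for phase in range(1, step + 1):
        order.extend(range(phase, n + 1, step))
    return order

print(interleaved_scan(9, 3))    # [1, 4, 7, 2, 5, 8, 3, 6, 9]
print(interleaved_scan(12, 4))   # [1, 5, 9, 2, 6, 10, 3, 7, 11, 4, 8, 12]
```

Consecutively fired elements are never physical neighbours, which is exactly what reduces the channel-to-channel cross-talk discussed above.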

Figure 17.

An interval of three elements on a different time driving.

Each shift register receives part of the timing counting signal and, combined with an odd-even selection mechanism or a specific number-sequence mechanism (e.g., elements 1, 4, 7), generates a set of enabling signals, as shown in Figure 19. The driving of each element circuit is thus controlled by the combination of an address signal and one set of enabling signals, so any element can be reached by pairing the two. This arrangement prevents erroneous trigger signals caused by mutual interference between the IR LED sensor circuits.
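
One way to model this address-AND-enable gating is the sketch below: an element fires only when both its address line and its group's enable line are asserted, so a stray pulse on one line alone cannot trigger a neighbouring channel. Names and the 0-based index mapping are illustrative (elements 1, 4, 7 in the text correspond to indices 0, 3, 6 here).

```python
# Illustrative decode of the "address signal AND enabling signal" scheme.

def fired_elements(address_bits, enable_bits, group_of):
    """Return indices of elements whose address line AND whose group's
    enable line are both asserted. group_of[i] maps element i to its
    enable-signal group."""
    return [i for i, addr in enumerate(address_bits)
            if addr and enable_bits[group_of[i]]]

# Nine elements in three groups (1,4,7 / 2,5,8 / 3,6,9 -> groups 0/1/2):
group_of = [i % 3 for i in range(9)]
address = [True] * 9             # all elements addressed this cycle
enable = [True, False, False]    # only group 0 enabled
print(fired_elements(address, enable, group_of))  # [0, 3, 6]
```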

Figure 18.

The waveform of the interval of three elements.

Figure 19.

A set of enabling signals.

3. Simulation results

The scanning sequence of the multiplexer is shown in Figure 20, where an interleaved data encoding sequence is labeled with red and blue arrows. The interference between neighboring elements is also simulated in Figure 20: the output voltage noise of the third element is less than half the level of the second element, and the error signal excited by the first element is significantly reduced.

Figure 20.

Simulated data encoding sequence for the IR LED sensor multiplexer.

The control circuit for the IR LED sensor elements verifies element addressing through multiplexer logic simulation. There are four scanning types: type 1 is the ascending sequence 1, 2, 3, ... up to element n; type 2 is the descending sequence n, (n-2), (n-4), ... down to element 2; type 3 is the sequence 1, 3, 5, ... up to the final element; and type 4 is the descending sequence (n-1), (n-3), (n-5), ... down to the final element. The driver design must be verified functionally in simulation, and we used an FPGA to verify it.
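
The four sequence types can be generated directly, assuming 1-based element numbers and an even element count n so that the descending sequences end at 2 and 1 respectively:

```python
# Generators for the four addressing sequences described in the text
# (1-based element numbers, even n assumed).

def scan_sequences(n):
    return {
        1: list(range(1, n + 1)),        # type 1: 1, 2, 3, ..., n
        2: list(range(n, 1, -2)),        # type 2: n, n-2, ..., 2
        3: list(range(1, n + 1, 2)),     # type 3: 1, 3, 5, ...
        4: list(range(n - 1, 0, -2)),    # type 4: n-1, n-3, ...
    }

print(scan_sequences(8))
# {1: [1,...,8], 2: [8, 6, 4, 2], 3: [1, 3, 5, 7], 4: [7, 5, 3, 1]}
```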

A novel multiplexer data registration system for the IR LED sensor elements has been designed and simulated. The controller can easily be adapted to different numbers of IR LED sensor elements without much hardware rearrangement. The simulated results show that the proposed multiplexer controller reduces the noise interference caused by the excitation of neighboring channels, as shown in Figure 21.

Figure 21.

Simulated voltage interference among the IR LED sensor elements.

A two-dimensional data control system is shown in Figure 22. Under the two-dimensional circuit configuration, a shifting sub-circuit uses a special control method to generate the enabling signals in a serial-in, parallel-out manner: the output of the n-th D flip-flop is connected to the trigger input of the (n+2)-th D flip-flop.
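
A highly simplified behavioural reading of this stride-2 connection is that the register acts as two interleaved shift chains, one over the even-indexed stages and one over the odd-indexed stages. The model below is an interpretation sketch under that assumption, not the chapter's actual circuit.

```python
# Behavioural model: with each stage feeding the stage two positions
# ahead, the register splits into an even chain and an odd chain, each
# shifting serial data in and presenting all outputs in parallel.

def shift_stride2(stages, serial_in_even, serial_in_odd, clocks):
    """Shift two serial bit streams into the even/odd stage chains and
    return the parallel outputs after `clocks` clock edges."""
    q = [0] * stages
    for t in range(clocks):
        # shift the even chain: ... q4 <- q2 <- q0 <- serial_in_even[t]
        for i in range(stages - 1, 1, -1):
            if i % 2 == 0:
                q[i] = q[i - 2]
        q[0] = serial_in_even[t]
        # shift the odd chain: ... q5 <- q3 <- q1 <- serial_in_odd[t]
        for i in range(stages - 1, 2, -1):
            if i % 2 == 1:
                q[i] = q[i - 2]
        q[1] = serial_in_odd[t]
    return q

# Walk a single 1 down each chain for three clocks on 8 stages:
print(shift_stride2(8, [1, 0, 0], [1, 0, 0], 3))  # [0, 0, 0, 0, 1, 1, 0, 0]
```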

Figure 22.

Two-dimensional data control system.

The multi-dimensional data control system is shown in Figure 23. Using the same architecture, the three-dimensional data control method connects the output of the n-th D flip-flop to the trigger input of the (n+3)-th D flip-flop.

Figure 23.

Multi-dimensional data control system.

4. Integrated system

Figure 24 shows a photograph of the floating-image virtual touch system, seen from the top. Figure 25 is a cross-sectional view of the floating image system. The virtual image is formed by a focusing lens.

Figure 24.

Top view of the floating image system.

Figure 25.

Cross view of the floating image system.

5. Discussion and result

The integrated system, comprising the infrared light sensor module, the floating 3D image architecture, and the image display, is shown in Figures 26 and 27. The image floats 20 cm away from the system, and viewers observe it best from within about 50 cm. In addition to displaying three-dimensional images, the device also serves as a virtual telephone keypad, as shown in Figure 28, so it can be used as an interactive user interface. Figure 29 shows the virtual system applied to a locker-room door.

Figure 26.

IR LED sensors module.

Figure 27.

Floating 3D image architecture.

Figure 28.

Virtual keypad for telephone.

Figure 29.

Virtual systems for the door locker.

References

  1. Han, J.Y., Low-cost multi-touch sensing through frustrated total internal reflection. ACM UIST, pp. 115-118 (2005).
  2. Kitamura, Y., Nakayama, T., Nakashima, T., and Yamamoto, S., The IllusionHole with polarization filters. ACM VRST, pp. 244-251 (2006).
  3. Chan, L.W., Chuang, Y.F., Chia, Y.W., Hung, Y.P., and Hsu, J., A new method for multi-finger detection using a regular diffuser. International Conference on Human-Computer Interaction (2007).
  4. Chan, L.W., Chuang, Y.F., Yu, M.C., Chao, Y.L., Lee, M.S., Hung, Y.P., and Hsu, J., Gesture-based interaction for a magic crystal ball. ACM VRST, pp. 157-164 (2007).
  5. Wilson, A., TouchLight: an imaging touch screen and display for gesture-based interaction. International Conference on Multimodal Interfaces, pp. 69-76 (2004).
  6. Morris, M.R., Morris, D., and Winograd, T., Individual audio channels with single display groupware: effects on communication and task strategy. ACM CSCW, pp. 242-251 (2004).
  7. Sugimoto, M., Hosoi, K., and Hashizume, H., Caretta: a system for supporting face-to-face collaboration by integrating personal and shared spaces. ACM CHI, pp. 41-48 (2004).
  8. Kitamura, Y., Osawa, W., Yamaguchi, T., Takemura, H., and Kishino, F., A display table for strategic collaboration preserving private and public information. International Conference on Entertainment Computing, pp. 167-179 (2005).
  9. Kakehi, T., Iida, M., Naemura, T., Shirai, Y., and Matsushita, M., Lumisight Table: an interactive view-dependent tabletop display. IEEE Computer Graphics and Applications, 25(1), pp. 48-53 (2005).
  10. Lippmann, G., La photographie intégrale. Comptes Rendus, 146, pp. 446-451 (1908).
  11. Lopez-Gulliver, R., Yoshida, S., Yano, S., and Inoue, N., gCubik: real-time integral image rendering for a cubic 3D display. ACM SIGGRAPH 2009 Emerging Technologies, New Orleans, USA, August 3-7 (2009).
  12. Koike, T., Sakai, H., Shibahara, T., Oikawa, M., Yamasaki, M., and Utsugi, K., Light field copy machine. ACM SIGGRAPH Asia 2009 Emerging Technologies, Yokohama, Japan, December 16-19 (2009).
  13. Sakamoto, K., Kimura, R., and Takaki, M., Parallax polarizer barrier stereoscopic 3D display systems. Proceedings of the 2005 International Conference on Active Media Technology (AMT 2005), pp. 469-474, May 19-21 (2005).
  14. Kondo, D., Goto, T., Kouno, M., Kijima, R., and Takahashi, Y., A virtual anatomical torso for medical education using free form image projection. Proceedings of the 10th International Conference on Virtual Systems and Multimedia, pp. 678-685 (2004).
  15. Cassinelli, A. and Ishikawa, M., Khronos projector. SIGGRAPH 2005 Emerging Technologies (2005).
  16. Kawaguchi, Y., The art of Gemotion in space. Proceedings of the Tenth International Conference on Information Visualization (IV 2006), pp. 658-663, July 5-7 (2006).
