Open access peer-reviewed chapter

Interaction Design in Virtual Reality Game Using Arduino Sensors

Written By

Juin-Ling Tseng and Chia-Wei Chu

Submitted: 14 May 2017 Reviewed: 15 September 2017 Published: 20 December 2017

DOI: 10.5772/intechopen.71016

From the Edited Volume

Simulation and Gaming

Edited by Dragan Cvetković

Abstract

Virtual reality (VR) is the use of computer simulation to produce a virtual world that provides users with a variety of sensory stimuli, enabling them to feel as though they are inside that world. Mature three-dimensional (3D) computer graphics technology can now present realistic 3D visual effects. However, system interaction is still carried out mainly through dedicated controllers, such as the Vive controller of the HTC Vive. To let users control a game more intuitively, this study employed the currently popular Arduino technology to design interactive control devices for virtual reality. The interaction design in this study is based on a virtual reality baseball game. To let users swing more intuitively in the baseball game, this study fitted an actual baseball bat with sensors, called the "Arduino baseball bat," as a replacement for the VR joystick. The implemented system is roughly divided into two components: a transmitter module on the bat and a receiver module connected to the server host. According to the results, our system can not only display realistic 3D visual effects, but the Arduino baseball bat can also provide intuitive, real-time game interaction for the user.

Keywords

  • virtual reality
  • interaction design
  • Arduino sensors
  • game design
  • Unity

1. Introduction

Today, most games require specific devices, such as personal computers, mobile phones, tablets, and video game consoles. Although the special effects and image processing of games designed for these devices are captivating, two major issues exist between the games and their users. First, users can watch game scenes only through a display screen, which makes it difficult to produce an immersive gaming experience. Second, users can manipulate games only through traditional input devices such as the joystick, keyboard, and mouse, which is not intuitive. To address these two issues, this study used a virtual reality (VR) headset and an Arduino somatosensory interaction mechanism for three-dimensional (3D) game development.

1.1. VR headset

Three-dimensional technologies [1, 2, 3, 4] are commonly used in a wide variety of applications, including movies [5], games [6], art [7], and education [8]. Among these, 3D VR is one of the most popular [9, 10]. To make users feel more immersed in a VR system, the system generally presents the required 3D scene through a VR headset. At present, Oculus Rift [11, 12], 3Glasses [13], and HTC Vive [14, 15] are the most popular VR headsets, as shown in Figure 1.

  1. Oculus Rift [11, 12]

    Oculus Rift is a VR head-mounted display (HMD) developed by Oculus VR. The Rift is equipped with a 7-inch screen and substantially reduces the latency and motion blur that, in the original prototype, were incurred by pixel switching and by rapid turning of the user’s head. The field of view (FOV) exceeds 90° horizontally, and the image resolution is 2160 × 1200. The Rift also provides a three-axis gyroscope, accelerometer, and magnetometer, allowing users to interact with the 3D scene content. Oculus Rift offers a wireless controller called Oculus Touch for manipulating 3D scene information. However, Oculus Touch is operated much like a conventional joystick and therefore still lacks intuitive control to a certain extent.

  2. 3Glasses [13]

    Compared with Oculus Rift, 3Glasses offers a wider field of view (110°) and a latency of less than 13 ms. Its resolution is as high as 2560 × 1440, meeting the standard of mainstream VR headsets. In terms of operation, plug and play is supported, and an intelligent light sensor adapts the screen brightness to the surroundings. The system also incorporates a gyroscope, albeit without a space-tracking function; therefore, users cannot move around freely to interact with the scene through 3Glasses.

  3. HTC Vive [14, 15]

    HTC Vive is currently a highly popular VR HMD. Its room-scale technology turns a real-world room into a 3D interactive virtual space in which users can move freely and use handheld controllers with motion tracking to manipulate 3D virtual objects. The whole system is equipped with more than 70 sensors, including a gyroscope, an accelerometer, and laser position sensors, for positioning and sensing. The spatial positioning technology of HTC Vive differs from the camera-based approach of Oculus Rift: the Lighthouse base stations emit light that is received by the sensors on the Vive headset and controllers, so the system can efficiently calculate the positions of the user and the handheld controllers in space. Oculus Rift uses a camera to detect the user’s position, which is prone to space limitations; for example, when the user moves too far away, the image becomes unclear and the position can no longer be identified. In contrast, the Lighthouse used by HTC Vive covers a larger range (up to 15 × 15 feet of space tracking). This implies that HTC Vive incorporates a space-tracking mechanism superior to that of Oculus Rift.

Figure 1.

Oculus Rift, 3Glasses, and HTC Vive.

1.2. Arduino somatosensory interaction

Our daily life involves various interactive devices, such as air-conditioner thermostats and car-reversing radars. An air-conditioner thermostat uses a sensor to detect the ambient temperature and automatically adjust the indoor temperature. A car-reversing radar uses a sensor to detect the distance to an object behind the car; when the car reverses too close to the object, a warning tone alerts the driver. In the past, developing such electronic devices was challenging; today, Arduino technology substantially lowers the threshold for development.

Arduino is an open-licensed technology for developing interactive devices, in which both the software (open source) and the hardware designs are open. The software development environment can be downloaded free of charge from the Arduino website, so the development process is straightforward and supported by numerous references. In a traditional hardware environment, microcontroller developers generally need a background in electronics, electrical engineering, or related fields, and it is time-consuming for non-specialists to gain a complete understanding of the development environment. The learning threshold for Arduino is comparatively low: people without a background in electronics, electrical engineering, or related fields can conveniently learn to develop Arduino-based interactive devices. Because Arduino is based on public sharing, most developers share their creative works, and a large number of creative projects are available on the Internet, enabling developers to complete their own works in a significantly shorter time.

The Arduino development system mainly includes Arduino development boards, Arduino development software, and Arduino extended components, as shown in Figure 2. They are briefly introduced as follows:

  1. Arduino development board

    The Arduino development board is the motherboard of the development system; common models include the Uno, Leonardo, Due, Mega, and Nano. Consider the Arduino Uno development board as an example. It includes an ATmega328 microcontroller chip, a USB port, and input/output pins. The ATmega328 microcontroller is the main processing chip. The USB port supplies power and is used to upload the program developed on the PC to the board. The input/output pins can be connected to the required sensors. Owing to its affordability and convenient development, the Arduino Uno is one of the most popular development boards.

  2. Arduino development software

    The Arduino Integrated Development Environment (IDE) is based on AVR-GCC and other open-source software; programs are written in C/C++ (the IDE itself is written in Java), which allows developers to get started conveniently. The IDE is available for free download from the Arduino website and can be used immediately after the downloaded package is decompressed; it runs on Windows, Mac OS X, and Linux. On the main interface of the Arduino IDE, the white area in the middle is used for program editing, whereas the black area at the bottom provides information tips. A program consists of the setup() and loop() functions. The setup() function is executed only once, when the program starts, so the code written in setup() is generally responsible for initialization. The loop() function is executed repeatedly and is the main programming area. After the programming is complete, the program can be compiled; if compilation succeeds, the program can be uploaded to the development board, and the sensors or indicators connected to the board reflect the results of program execution. A minimal sketch illustrating this structure is given after this list.

  3. Arduino extended components

    Arduino extended components refer to the expansion boards (shields) or sensors connected to the development board. The main function of a shield is to add a specific capability to the original development board. For example, if an Arduino is expected to connect to a WiFi network, a WiFi Shield can be stacked directly on the Arduino Uno. In a similar manner, the Ethernet Shield, GSM Shield, and Motor Shield satisfy various expansion requirements. The Arduino development board is equivalent to the motherboard of the entire Arduino system, whereas sensors such as the temperature sensor, three-axis accelerometer, gyroscope, force sensor, bending sensor, and button sensor are the key equipment for sensing environmental variations. Through these sensors, the Arduino development board can read information about the environment or objects, such as the ambient temperature, object movement or rotation, applied force, and degree of bending.
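
To make the setup()/loop() structure concrete, the following minimal sketch (our illustration, not taken from the chapter) configures the on-board LED in setup() and blinks it repeatedly in loop(), printing a message through the serial port each cycle:

    // Minimal Arduino sketch illustrating the setup()/loop() structure.
    // LED_BUILTIN refers to the on-board LED (pin 13 on the Uno).

    void setup() {
      pinMode(LED_BUILTIN, OUTPUT);   // runs once: configure the LED pin as an output
      Serial.begin(9600);             // open the serial port for information tips
    }

    void loop() {
      digitalWrite(LED_BUILTIN, HIGH);  // turn the LED on
      delay(500);                       // wait 500 ms
      digitalWrite(LED_BUILTIN, LOW);   // turn the LED off
      delay(500);
      Serial.println("blink");          // report each cycle to the IDE's serial monitor
    }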

Figure 2.

Arduino development board and its components.

2. Related works

2.1. VR headset-related work

The VR headset has become one of the most recent development trends in the information and communication technology (ICT) industry. At present, an increasing number of experts and researchers are devoting themselves to relevant studies and applications, including medical care and entertainment. The following sections provide a brief introduction to these applications:

2.1.1. Medical care

Juanes et al. [16] used the Oculus Rift headset to construct a 3D virtual environment with stereoscopic imaging effects for hospital operating rooms. This system provides users with an immersive experience and enables learners to become familiar with the operation of various devices and medical monitoring systems in operating rooms, providing more effective practical training. To achieve this, Juanes et al. used Maya to construct 3D models of the relevant operating-room equipment, the Oculus Rift SDK and Unity3D to build the control and display mechanisms, and Oculus Rift to present the resulting information, as shown in Figure 3.

Figure 3.

VR application in medical care.

2.1.2. Entertainment

To understand the impact of headsets on gamers, Tan et al. [17] used the first-person shooter Half-Life 2 as the experimental environment to explore the experience of ten players (P1–P10) wearing the Oculus Rift headset. Before the experiment, Tan et al. collected fundamental information about the ten players, including their age, gender, game interests, gameplay habits, and whether they had played Half-Life before. In the experiment, Tan et al. compared the Oculus Rift mode with the desktop computer screen mode and used a questionnaire to analyze the differences in the immersive experiences of the ten players in these two game modes. The questionnaire results show that the average immersion score for Half-Life played on a traditional desktop computer screen was 142.5, whereas that for Oculus Rift was 147.2. This implies that Oculus Rift produces a better immersive experience than a traditional desktop computer screen, as shown in Figure 4.

Figure 4.

VR application in entertainment.

2.2. Arduino somatosensory interaction

Arduino uses small development boards as implementation platforms and connects various sensors to them so that users can interact with real objects in the physical world. Arduino has a wide variety of applications, including medical care, 3D virtual environment interaction, and wearable devices. These applications are briefly introduced in the following sections.

2.2.1. Medical care

Dasios et al. [18] designed a remote monitoring system for elderly care using Arduino devices. Installed in an elderly person’s home, the system places Arduino detection nodes at routinely used locations such as the bedroom, living room, bathroom, and dining table, and adds a wearable node on the person’s body to detect the local temperature, humidity, light intensity, and the person’s activity status. The system uses a coordinator node to transmit the data from each detection node through the Internet to a server, where they are recorded in a database. Caregivers can monitor the elderly person’s living environment and activity status directly through web pages, which enables them to identify an abnormal situation expeditiously and undertake the related care activities, as shown in Figure 5.

Figure 5.

Arduino application in medical care.

2.2.2. 3D virtual environment interaction

Wang and Yu [19] proposed a project called Virtual Spine in 2013. Its main purpose is to help people avoid incorrect or overly static sitting postures over long periods, which can endanger physical health. The 3D Virtual Spine interactive system developed in this project is divided into a Sedentary-Sensory Unit, an Advisory Unit, an Interactive 3D Unit, and a Messaging Unit.

In the Sedentary-Sensory Unit, Wang and Yu installed Arduino pressure sensors on the seat to collect pressure data, which are transmitted to the Advisory Unit and the Interactive 3D Unit to calculate the pressure burden accumulated on the spine and to perform 3D rendering and interaction. A red area on the virtual seat indicates an area under relatively high pressure; accordingly, two red circular areas of higher pressure can be observed on the 3D virtual spine. The Advisory Unit uses the Messaging Unit to transmit this information instantly to users through media such as mobile devices, computer screens, or smart TVs, to prevent the physical injuries caused by long periods of sitting or incorrect sitting postures, as shown in Figure 6.

Figure 6.

Arduino application in 3D virtual environment interaction.

2.2.3. Wearable device

Sugathan et al. [20] developed a wearable health monitoring system in 2013, as shown in Figure 7, which primarily embeds Arduino and biosensors in clothing. When a patient wears this system, sensors start detecting the patient’s heart rate, electrodermal activity, and skin temperature noninvasively. By integrating these measurements, the patient’s overall condition can be understood instantly. Such patients may, for example, collapse because of heart disease, leading to lifestyle changes or physical deterioration. This system can instantly detect variations in the patient’s body and thus help avoid more serious illness.

Figure 7.

Arduino application in wearable devices.

3. System design

Although Arduino has been used in numerous applications, most of them use sensors to monitor the environment or the user’s physical condition, as in medical care, sports, and clothing-like wearable devices; 3D interactive applications have rarely been developed. In this study, VR technology was integrated with an Arduino somatosensory interaction mechanism, and a baseball game theme was used to develop a “wearable VR somatosensory interactive game system,” as shown in Figure 8.

Figure 8.

Wearable VR somatosensory interactive game system.

The interaction architecture of this system generally includes the batting-sensing module, batting-information transmission module, batting-information receiving module, game control reaction module, and VR display module, as shown in Figure 9.

  • Batting sensing module:

Figure 9.

System architecture.

The main function of this module is to use the MPU-6500 six-axis sensor to detect the batting data.

  • Batting information transmission module:

The main function of this module is to use the NRF24L01 communication device to transmit the batting data to the computer.

  • Batting information receiving module:

This module corresponds to the batting information transmission module. Therefore, this module also uses the NRF24L01 communication device to receive the batting data from the batting information transmission module.

  • Game control reaction module:

This module, deployed in the Unity3D software development environment, reads data from the batting-information receiving module to determine whether the bat has been swung and generates the corresponding control reactions in the game system (a minimal swing-detection sketch is given after this list).

  • VR display module:

This module is developed for the required VR devices. Because the VR devices targeted by this system are Oculus Rift and HTC Vive, the system supports the VR displays of both.
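
The chapter does not specify the exact rule the game control reaction module uses to decide that a swing has occurred, and the actual module is implemented in the Unity3D environment. The following plain C++ sketch only illustrates one plausible decision rule under that assumption: a swing is registered when the magnitude of the raw acceleration sample from the six-axis sensor exceeds a tuned threshold (the threshold value here is hypothetical).

    // Swing detection: a hypothetical threshold rule, not the chapter's actual logic.
    #include <cmath>
    #include <cstdint>
    #include <cstdio>

    struct MotionSample {            // one reading from the batting-sensing module
      int16_t ax, ay, az;            // raw accelerometer values
      int16_t gx, gy, gz;            // raw gyroscope values
    };

    // Hypothetical threshold in raw sensor units; it would be tuned experimentally.
    const double kSwingThreshold = 25000.0;

    bool isSwing(const MotionSample& s) {
      double magnitude = std::sqrt(double(s.ax) * s.ax +
                                   double(s.ay) * s.ay +
                                   double(s.az) * s.az);
      return magnitude > kSwingThreshold;   // a hard swing exceeds the threshold
    }

    int main() {
      MotionSample sample = {32000, 1200, -800, 150, 90, 30};  // made-up reading
      std::printf("swing detected: %s\n", isSwing(sample) ? "yes" : "no");
      return 0;
    }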

In view of the above interactive architecture, this study divided the implementation of the system into two parts: one is the design of the hardware VR bat, and the other is the development of the VR baseball game. The former is completed using the Arduino technology, whereas the latter is developed mainly using the Unity3D environment.

3.1. Hardware VR bat design

The system uses two sets of Arduino hardware to transmit and receive the batting information: one is located on the bat (transmitter end), and the other is connected to the computer (receiver end). The baseball game system is installed on the computer and reflects the batting actions in the game in real time, based on the information received by the Arduino. Users can watch the 3D VR scenes of the baseball game through Oculus Rift or HTC Vive for an immersive experience. To complete the Arduino bat, we installed an InvenSense MPU-6500 six-axis sensor and an NRF24L01 high-power wireless transceiver on an Arduino Nano development board. These devices, shown in Figure 10, are introduced as follows:

  • Arduino Nano:

Figure 10.

Arduino Nano, MPU-6500, and NRF24L01.

Compared with the Arduino Uno, the Arduino Nano replaces the DC power jack with a Mini-B USB interface for connecting to the computer. Apart from this change in appearance, all other interfaces and functions remain identical; the controller is likewise an ATmega168 or ATmega328.

  • InvenSense MPU-6500:

The InvenSense MPU-6500 replaces the older MPU-6050. It integrates a gyroscope and an accelerometer in a single package, which significantly reduces the required installation space. Compared with the MPU-6050, the MPU-6500 adds an SPI interface and reduces the energy consumption of motion sensing and tracking by roughly 60%. InvenSense’s motion processing library can process complex motion-sensing data, reduce the burden of motion processing on the operating system, and provide a structured API for application development.

  • NRF24L01:

The NRF24L01 is a wireless transceiver that operates in the universal 2.4–2.5 GHz ISM band. The transceiver includes a frequency synthesizer, an Enhanced ShockBurst™ protocol engine, a mode controller, a power amplifier, a crystal oscillator, a modulator, and a demodulator. Output power, channel selection, and protocol settings are configured through the SPI interface, and the current consumption is significantly low: for example, approximately 9.0 mA in transmission mode and 12.3 mA in receiving mode, with the lowest consumption in standby mode.

This system uses the Arduino IDE to develop the firmware inside the Arduino bat. Because the system must determine whether the user has swung the bat, the Arduino bat must read the bat rotation and acceleration data detected by the MPU-6500 and transmit them to the computer; the receiver, in turn, must be able to receive these data so that the game can determine whether the bat has been swung.
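
As a rough illustration of the transmitter-side firmware described above, the following Arduino sketch reads the six motion values from the MPU-6500 and broadcasts them through the NRF24L01. It is a reconstruction under assumptions, not the authors' code: it assumes the common i2cdevlib MPU6050 driver (register-compatible with the MPU-6500) and the Mirf library for the NRF24L01, and the pipe addresses "bat01" and "pc001" are hypothetical.

    // Transmitter-side firmware sketch (hedged reconstruction, not the authors' code).
    #include <Wire.h>
    #include <I2Cdev.h>
    #include <MPU6050.h>
    #include <SPI.h>
    #include <Mirf.h>
    #include <nRF24L01.h>
    #include <MirfHardwareSpiDriver.h>

    MPU6050 accelgyro;                 // six-axis sensor on the bat
    int16_t motion[6];                 // ax, ay, az, gx, gy, gz

    void setup() {
      Wire.begin();
      accelgyro.initialize();          // start the MPU-6500 over I2C
      Mirf.spi = &MirfHardwareSpi;     // configure the NRF24L01 over SPI
      Mirf.init();
      Mirf.setRADDR((byte *)"bat01");  // this node's receive address (hypothetical)
      Mirf.payload = sizeof(motion);   // 12-byte payload: six int16 values
      Mirf.config();
    }

    void loop() {
      accelgyro.getMotion6(&motion[0], &motion[1], &motion[2],
                           &motion[3], &motion[4], &motion[5]);
      Mirf.setTADDR((byte *)"pc001");  // address of the computer-side receiver
      Mirf.send((byte *)motion);       // broadcast the latest swing sample
      while (Mirf.isSending()) {}      // wait until the radio has finished
      delay(10);                       // roughly 100 samples per second
    }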

3.2. VR baseball game development

In addition to producing the 3D baseball scene, the following modules were developed for the VR baseball game system on the computer: the pitcher, batting, and score counter modules.

  • Pitcher:

This module includes ball-speed control and pitch-type control. The ball speed is generated randomly, and the pitch types are mainly the fastball and the curveball. According to the pitch type, the module automatically generates a motion trajectory along a Bezier curve (a simplified trajectory sketch is given after this list). To make pitching judgments more convenient, the system also marks the strike zone; when the baseball, following its trajectory, finally arrives at the strike zone, either a strike or a ball is called.

  • Batting:

This module consists of the game-side Arduino data reading module, the batting control module, and the bat collision detection module. The game-side Arduino data reading module reads the bat rotation and acceleration data from the receiver-end Arduino and sends them to the batting control module, which determines whether the bat should be swung in the game. When a swing occurs, the module calls the bat collision detection module to determine whether the bat collides with the baseball. Because the game-side Arduino data reading module is the main module connecting the Arduino device with the Unity game system, it is developed with the Arduino for Unity kit.

  • Score counter:

The system scores points by determining whether a home run has been hit. It therefore calculates the travel distance of each baseball after it is hit; when the distance exceeds the warning track, the system scores one point. To display the results, the system places a scoreboard in the baseball field showing the number of home runs and the distance each baseball has traveled.
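
The chapter states that the pitch trajectory follows a Bezier curve but does not give its order or control points. The short C++ program below sketches one possible interpretation: a quadratic Bezier curve from the release point to a random target near the strike zone, with a curveball simply using a larger lateral control-point offset. The in-game implementation is done in Unity3D; all coordinates and offsets here are illustrative assumptions.

    // Pitch-trajectory sketch (simplified illustration, not the chapter's Unity code).
    #include <cstdio>

    struct Vec3 { double x, y, z; };

    // Evaluate a quadratic Bezier curve B(t) = (1-t)^2 P0 + 2(1-t)t P1 + t^2 P2.
    Vec3 bezier(const Vec3& p0, const Vec3& p1, const Vec3& p2, double t) {
      double u = 1.0 - t;
      return { u * u * p0.x + 2 * u * t * p1.x + t * t * p2.x,
               u * u * p0.y + 2 * u * t * p1.y + t * t * p2.y,
               u * u * p0.z + 2 * u * t * p1.z + t * t * p2.z };
    }

    int main() {
      Vec3 release = {0.0, 1.8, 18.44};   // pitcher's release point (metres, illustrative)
      Vec3 target  = {0.1, 0.9, 0.0};     // random point near the strike zone
      bool curveball = true;
      // Curveballs get a larger lateral offset on the control point than fastballs.
      Vec3 control = {curveball ? 1.5 : 0.2, 2.2, 9.0};

      for (double t = 0.0; t <= 1.0; t += 0.1) {   // sample the flight path
        Vec3 p = bezier(release, control, target, t);
        std::printf("t=%.1f  x=%.2f y=%.2f z=%.2f\n", t, p.x, p.y, p.z);
      }
      return 0;
    }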

4. Implementation results

The test environment is an Intel Core i7-6700HQ CPU at 2.6 GHz, 8 GB of memory, an Nvidia GeForce GTX 960M, and Windows 10. However, for an enhanced experience, this study recommends that the graphics card perform at least at the level of an Nvidia GeForce GTX 980. The implemented functions and display results are as follows:

  • Hardware VR bat implementation results and its firmware development:

In this study, the Arduino Nano, InvenSense MPU-6500, and NRF24L01 were integrated on the transmitter end, as illustrated in Figure 11. On the receiving end, an Arduino Nano and an NRF24L01 were combined; the receiving end can receive bat information immediately after it is connected to the USB port of the computer.

Figure 11.

VR bat implementation results.

With regard to the firmware on the transmitter, the system uses the accelgyro object to read the acceleration values ax, ay, and az and the gyroscope values gx, gy, and gz on the x-, y-, and z-axes of the MPU-6500, and uses the Serial.write() instruction to transmit the data through the NRF24L01. On the receiver, the system uses the Mirf library to receive the data transmitted by the NRF24L01.
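
For the receiver side, the following Arduino sketch is a hedged reconstruction of the behavior described above: it waits for packets from the bat via the Mirf library and forwards the raw bytes to the computer with Serial.write(), where the game-side Arduino data reading module can parse them. The addresses, baud rate, and packet layout are assumptions, not the authors' published code.

    // Receiver-side firmware sketch (hedged reconstruction, not the authors' code).
    #include <SPI.h>
    #include <Mirf.h>
    #include <nRF24L01.h>
    #include <MirfHardwareSpiDriver.h>

    byte packet[12];                    // six int16 values: ax, ay, az, gx, gy, gz

    void setup() {
      Serial.begin(115200);             // serial link to the Unity game on the computer
      Mirf.spi = &MirfHardwareSpi;
      Mirf.init();
      Mirf.setRADDR((byte *)"pc001");   // must match the transmitter's target address
      Mirf.payload = sizeof(packet);
      Mirf.config();
    }

    void loop() {
      if (Mirf.dataReady()) {           // a new swing sample has arrived from the bat
        Mirf.getData(packet);
        Serial.write(packet, sizeof(packet));   // forward the raw bytes to the game
      }
    }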

  • VR baseball game development:

When the game starts, the virtual pitcher automatically detects whether the player is facing the pitcher. When the player wearing Oculus Rift or HTC Vive faces the pitcher, the pitcher begins pitching. Before the virtual pitcher throws a baseball, the system randomly sets a target point in the vicinity of the strike zone and calculates the required motion trajectory. When the ball is thrown, it advances along the specified trajectory at the specified speed. The red arrow in Figure 12 indicates the trajectory of the ball, and the red circle marks the ball moving along the trajectory.

Figure 12.

The strike zone and the ball’s motion trajectory.

Once the ball is thrown, the player is free to choose when to swing. When the virtual bat in the system collides with the ball, the system begins checking for the ball touching the ground. When the ball lands, the system calculates its travel distance; if the distance exceeds the warning track for home runs, one home run is accumulated. Each game consists of ten pitches, as shown in Figure 13.
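
The scoring rule described above reduces to a simple distance test. The following plain C++ sketch (an illustration, not the chapter's Unity code) computes the landing distance of each of the ten pitches from home plate and counts a home run whenever it exceeds an assumed warning-track distance of 110 m; the landing coordinates are made-up sample values.

    // Score-counter sketch: distance-based home-run counting (illustrative only).
    #include <cmath>
    #include <cstdio>

    const double kWarningTrackDistance = 110.0;   // hypothetical threshold in metres

    // Horizontal distance from home plate (origin) to the ball's landing point.
    double landingDistance(double x, double z) {
      return std::sqrt(x * x + z * z);
    }

    int main() {
      int homeRuns = 0;
      const double landings[10][2] = {              // ten pitches per game
        {30, 95}, {60, 100}, {10, 40}, {80, 90}, {5, 20},
        {70, 95}, {20, 60}, {90, 80}, {40, 50}, {100, 70}};

      for (int i = 0; i < 10; ++i) {
        double d = landingDistance(landings[i][0], landings[i][1]);
        if (d > kWarningTrackDistance) ++homeRuns;  // one point per home run
        std::printf("pitch %d: distance %.1f m\n", i + 1, d);
      }
      std::printf("home runs: %d\n", homeRuns);
      return 0;
    }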

  • VR display module:

Figure 13.

Counting the number of home runs and calculating their travel distance.

In this system, two VR display modules were developed, one for Oculus Rift and one for HTC Vive. For Oculus Rift, the system imported the Oculus SDK for Unity kit, through which a TrackCamera was developed to render the images for both eyes, as required by the Oculus Rift display. For HTC Vive, the system imported SteamVR for Unity and implemented a SteamVR_Camera to integrate SteamVR, so that the system’s images can be displayed on the HTC Vive headset, as shown in Figure 14. The player’s use of the system is illustrated in Figure 15.

Figure 14.

Oculus SDK for Unity kit and SteamVR for Unity.

Figure 15.

The player’s use of the system.

In this study, the Unity Profiler was used to measure the runtime performance of the system. During execution, the CPU-side frame rate stayed at 60 FPS or higher, whereas the GPU-side frame rate mostly reached approximately 100 FPS. This implies that the system achieves the timeliness required for interactive gameplay (30 FPS or higher), as shown in Figure 16.

Figure 16.

The performance of our system.

5. Conclusion

This study combined Arduino and VR technology and used the interactive design characteristics of Arduino to provide a more intuitive manipulation mode for VR games. The experimental results demonstrate that Arduino offers a wide variety of sensing devices that can be conveniently combined with VR games while providing adequate execution performance. It is therefore likely that, in the future, Arduino will enable more varied creative studies and combinations for VR developers. In addition, it is noteworthy that Arduino is a principal development technology of the Internet of Things (IoT), and both IoT and VR are expected to be major information technology trends over the next 10 years. Although this study focused only on VR games, this projection implies additional opportunities to integrate the two trends further, which is highly likely to provide a flourishing creative space for future studies.

Acknowledgments

We would like to thank Axis 3D Technology, Inc., for the industry cooperation. This work was supported by the Ministry of Science and Technology, Taiwan, under contract nos. MOST 105-2622-E-159-001-CC3 and MOST 106-2622-E-159-002-CC3. We would also like to thank the Regional Industry-University Cooperation Center, Ministry of Education, Taiwan, for its support.

References

  1. Tseng JL. Development of a low-cost 3D interactive VR system using SBS 3D display, VR headset and finger posture motion tracking. International Journal of Advanced Studies in Computer Science and Engineering. 2016;5(8):6-12
  2. Xu S, Lyu W, Li H. Optimizing coverage of 3D wireless multimedia sensor networks by means of deploying redundant sensors. International Journal of Advanced Studies in Computer Science and Engineering. 2015;4(9):28-33
  3. Tseng JL. An improved surface simplification method for facial expression animation based on homogeneous coordinate transformation matrix and maximum shape operator. Mathematical Problems in Engineering. 2016;2016. Article ID: 2370919. DOI: 10.1155/2016/2370919
  4. Tseng JL. Surface simplification of 3D animation models using robust homogeneous coordinate transformation. Journal of Applied Mathematics. 2014;2014. Article ID: 189241. DOI: 10.1155/2014/189241
  5. Karunaratne S, Yan H. 3D animated movie actor training using fuzzy logic. In: IEEE Computer Graphics International. 2001. p. 23-30. DOI: 10.1109/CGI.2001.934654
  6. Bangquan L, Yun M. A facial animation based on emotional model for characters in 3D games. In: IEEE International Conference on Computer Science and Information Processing. 2012. p. 1304-1307. DOI: 10.1109/CSIP.2012.6309101
  7. Janarthanan V. Innovations in art and production: Sound, modeling and animation. In: IEEE Ninth International Conference on Information Technology: New Generations. 2012. p. 879-882. DOI: 10.1109/ITNG.2012.80
  8. Aoki M, Koning W, Miyai A, Kamihira T. 3D animation education in the US and Japan: Different environments, similar issues. In: ACM SIGGRAPH Asia Sketches. 2011. Article No. 34. DOI: 10.1145/2077378.2077421
  9. Laver KE, George S, Thomas S, Deutsch JE, Crotty M. Virtual Reality for Stroke Rehabilitation [Internet]. 2015. Available from: http://www.cochrane.org/CD008349/STROKE_virtual-reality-for-stroke-rehabilitation
  10. Cheng LP, Roumen T, Rantzsch H, Köhler S, Schmidt P, Kovacs R, Jasper J, Kemper J, Baudisch P. TurkDeck: Physical virtual reality based on people. In: Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology. 2015. p. 417-426. DOI: 10.1145/2807442.2807463
  11. Gordon R. Oculus Rift Will be Cheaper Thanks to Facebook; Aiming for 2015 Release [Internet]. 2014. Available from: https://gamerant.com/oculus-rift-cheaper-after-facebook-buyout/
  12. Wikipedia. Oculus Rift [Internet]. 2017. Available from: https://en.wikipedia.org/wiki/Oculus_Rift
  13. Chang CM, Hsu CH, Hsu CF, Chen KT. Performance measurements of virtual reality systems: Quantifying the timing and positioning accuracy. In: Proceedings of the ACM on Multimedia Conference. 2016. p. 655-659. DOI: 10.1145/2964284.2967303
  14. Soffel F, Zank M, Kunz A. Postural stability analysis in virtual reality using the HTC Vive. In: Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology. 2016. p. 351-352. DOI: 10.1145/2993369.2996341
  15. McGhee J, Bailey B, Parton RG, Ariotti N, Johnston A. Journey to the centre of the cell (JTCC): A 3D VR experience derived from migratory breast cancer cell image data. In: SIGGRAPH ASIA 2016 VR Showcase. 2016. Article No. 11. DOI: 10.1145/2996376.2996385
  16. Juanes JA, Gómez JJ, Peguero PD, Lagándara JG, Ruisoto P. Analysis of the Oculus Rift device as a technological resource in medical training through clinical practice. In: ACM Proceedings of the 3rd International Conference on Technological Ecosystems for Enhancing Multiculturality. 2015. p. 19-23. DOI: 10.1145/2808580.2808584
  17. Tan CT, Leong TW, Shen S, Dubravs C, Si C. Exploring gameplay experiences on the Oculus Rift. In: ACM Proceedings of the 2015 Annual Symposium on Computer-Human Interaction in Play. 2015. p. 253-263. DOI: 10.1145/2793107.2793117
  18. Dasios A, Gavalas D, Pantziou G, Konstantopoulos C. Wireless sensor network deployment for remote elderly care monitoring. In: Proceedings of the 8th ACM International Conference on PErvasive Technologies Related to Assistive Environments. 2015. DOI: 10.1145/2769493.2769539
  19. Wang SJ, Yu D. Virtual-Spine: The collaboration between pervasive environment based simulator, game engine (mixed-reality) and pervasive messaging. In: Proceedings of the 7th International Conference on Pervasive Computing Technologies for Healthcare. 2013. p. 45-48. DOI: 10.4108/icst.pervasivehealth.2013.252108
  20. Sugathan A, Roy GG, Kirthyvijay GJ, Thomson J. Application of Arduino based platform for wearable health monitoring system. In: 2013 IEEE 1st International Conference on Condition Assessment Techniques in Electrical Systems. 2013. p. 1-5. DOI: 10.1109/CATCON.2013.6737464
