Open access peer-reviewed chapter

A Wearable Force-Feedback Mechanism for Immersive Free-Range Haptic Experience

Written By

Peter Kudry and Michael Cohen

Submitted: 29 June 2023 Reviewed: 17 July 2023 Published: 29 September 2023

DOI: 10.5772/intechopen.1002679

From the Edited Volume

Applications of Augmented Reality - Current State of the Art

Pierre Boulanger


Abstract

This chapter presents the development of a wearable force-feedback mechanism designed to provide a free-range haptic experience within the spectrum of Extended Reality (XR). The proposed system offers untethered six degrees-of-freedom and small- to medium-scale force-feedback, enabling users to immerse themselves in haptic interactions within virtual environments. The hardware comprises a modified 3D Systems Touch haptic device, driven by software that allows for ambulatory exploration of various haptic aspects. Two experiments were conducted to evaluate the precision, ergonomics, stability, usability, user experience, and performance of the system. Despite indications of software and hardware deficiencies, the results highlight the potential of combining haptic force-feedback and ambulatory XR to enhance immersion in free-range virtual environments. Furthermore, the integration of Mixed Reality pass-through enables users to seamlessly merge real-world environments with augmenting virtual elements. This extension contributes to the exploration of new possibilities for immersive and interactive experiences within mixed reality applications. Future research can delve deeper into the prototype’s potential, further unlocking opportunities for haptic-enabled ambulatory XR experiences and pushing the boundaries of immersive technologies.

Keywords

  • haptic interface
  • wearable computer
  • virtual reality
  • augmented reality
  • mixed reality
  • human-computer interaction (HCI)
  • ambulatory application
  • force-feedback
  • perceptual overlay
  • tangible user interface (TUI)
  • spatial computing
  • multimodal interaction
  • extended reality (XR)
  • mixed reality (MR)
  • virtual environments (VE)

1. Introduction

In this section, we survey the evolution of extended reality (XR), discuss its present trends, highlight areas that are comparatively less developed within this paradigm, emphasize the significance of further advancements, and provide an overview of the alternate solution we have developed through our research.

1.1 History of XR

The origins of contemporary virtual reality (VR) can be traced back to the early twentieth century with the “Link Trainer,” the first flight simulator, developed in 1929 and patented in 1931 [1]. However, VR hardware primarily targeted commercial use, and early attempts to bring this technology to the general public for entertainment purposes encountered commercial failures. Disappointments include the Sega VR-1 and Nintendo Virtual Boy, released in 1994 and 1995, respectively. This situation has changed significantly in recent years, as marked by the debut of the Oculus Rift in the early 2010s, which sparked a renaissance in VR.

Similarly, in the augmented reality (AR) domain, there were already several established AR solutions by 2010, but the ARToolKit, developed in 2000 [2], can be considered the precursor to modern AR. By 2020, use of VR and AR, collectively referred to as extended reality (XR), had expanded across various domains, including entertainment, education, and industrial training. XR technologies have become widely accessible to the general public through integration with popular gaming consoles, smartphones, and stand-alone headsets, offering immersive experiences at a fraction of the cost compared to earlier VR waves [3, 4]. Such rapid technological advancements have accelerated numerous trends within a relatively short period.

1.2 Ongoing and emerging trends in XR

In recent years, integration of VR and XR across various industries, including marketing, design, and retail, has caused a notable shift in dominance within specific demographic segments [5]. This shift is evident in the current landscape of XR experiences, which predominantly cater to industrial applications rather than gaming and entertainment. The 2020 XR Industry Insight report reveals that about two-thirds of AR development companies are focused on industrial applications, while consumer products account for only about one-third [6]. This trend can be attributed to the unique ability of XR technologies to virtualize environments and scenarios that are typically expensive or hazardous.

The healthcare industry has embraced XR technologies, which are utilized in therapeutic interventions for phobias and anxiety disorders, as well as in assisting individuals with autism in developing social and communication skills. AR, in particular, holds immense potential as a valuable visual aid for surgical procedures.

These ongoing and future trends in XR are propelled by advancements in hardware. XR devices are continuously shrinking in size, becoming more portable, and gaining enhanced processing power, thereby improving accessibility and ergonomics. The shift toward self-contained and untethered devices, exemplified by products like the Meta Quest, contributes to an enhanced user experience [7]. Another significant advancement is the widespread deployment of 5G mobile networks, offering the potential for improved VR streaming and seamless collaboration within virtual environments (VEs) by delivering higher bandwidth and lower latency.

In summary, the application of VR and XR has resulted in a transformative shift. Industrial applications currently dominate XR experiences, with healthcare serving as a prominent example of XR technology adoption. Ongoing advancements in hardware, including smaller, more powerful, and self-contained devices like the Apple Vision Pro, and the promise of mobile broadband connectivity, contribute to the continuous growth and development of XR applications.

1.3 Importance of haptic feedback in immersive environments

Significant advancements have occurred in haptics for VR since the release of the first Oculus Rift development kit a decade ago. One notable improvement is the transition from localized, three degrees-of-freedom models to room-scale, six degrees-of-freedom tracking. This advancement allows complete tracking of user movement, including wireless tracking of hand-held controllers. Modern controllers now include capacitive touch sensors that detect finger movements, enhancing user interaction with virtual environments. The tactile representation of buttons on controllers reinforces immersion by providing a coherent sensory experience. However, in terms of haptic feedback from virtual objects, progress has been limited, as most controllers still rely on vibration feedback. Although solutions such as gloves and full-body suits provide more realistic haptic feedback, they are not widely accessible due to their cost and specialized markets [8, 9].

Sense of presence, the subjective perception of truly being in a virtual environment, is a crucial aspect of immersive VR experiences [10, 11]. While realistic visual environments can foster suspension of disbelief, a lack of coherent haptic feedback when interacting with the VE disrupts the illusion. Passive haptics, which involve using physical props placed in the real space, can partially address this issue. For example, placing a box in the real space allows a user to lean on it in the game, creating the haptic illusion of taking cover. However, this approach is impractical as it requires adapting the real space to match the virtual environment. Another solution, known as active haptics, utilizes robotics to dynamically position props based on user actions. Although this approach overcomes the flexibility issue, it may introduce latency, which can be problematic in fast-paced gaming scenarios [12, 13].

Despite the challenges, several studies have shown that haptic feedback improves interaction, spatial guidance, learning, and sense of presence in VR environments [9, 14, 15]. However, current systems still struggle to provide high-quality force or tactile feedback [16]. Achieving a relevant and believable combination of small-scale haptic feedback, such as texture simulation, and medium-scale force-feedback poses challenges in terms of form-factor, ease of deployment, and precision [17]. These challenges are particularly evident in wearable solutions that aim to immerse users in a virtual environment through VR. In conclusion, haptic feedback, along with the stimulation of other senses beyond visual and auditory, significantly impacts quality of immersion in VR. The greater the variety of sensory feedback, the deeper the immersion [18, 19].

1.4 Recent haptic feedback solutions for immersive experiences

Despite challenges in implementing haptic feedback devices, researchers are actively working on solutions to overcome these obstacles and drive advancements in this field. This section provides a brief description of various devices that have recently addressed some of the aforementioned issues.

1.4.1 Cross-field haptics: push-pull haptics combined with magnetic and electrostatic fields (2015)

The concept of cross-field haptics involves the use of multifield physical quantities to replicate various textures. One approach is to utilize magnetorheological fluid (MRF) positioned between an array of electromagnets and conductive electrodes. The behavior of MRF is influenced by the magnetic field generated by these layers. When no magnetic field effects are present, MRF behaves as a Newtonian fluid, but its viscosity changes in response to variations in a field, transforming it into a non-Newtonian fluid. In simpler terms, the viscosity of the MRF can be adjusted based on the strength of a magnetic field produced by coils, allowing for simulation of various textures [20].

1.4.2 Magic table: deformable props using Visuo haptic redirection (2017)

The Magic Table employs haptic retargeting using a single physical object to represent multiple objects in a VE [21]. Specifically, it utilizes body warping and the technique of redirected walking. Body warping refers to a perceived change in the shape of objects, and redirected walking introduces additional translations and rotations to a head-mounted display (HMD), causing users to physically traverse paths that differ from their virtual perception.

1.4.3 HaptoBend: utilizing shape-change to enhance virtual reality (2017)

HaptoBend is a passive haptic feedback device that enables users to experience the transformation of a flat 2D object into a multi-sided 3D object through bending [22]. It uses four rigid sections with hinged connections, allowing users to deform it into preferred physical shapes and interact with virtual objects in Unity for use in VR.

1.4.4 AirPiano: enhancing music playing experience in virtual reality with mid-air haptic feedback (2017)

AirPiano is a unique haptic feedback device that replicates the experience of playing a piano by creating touchable keys in free space using ultrasonic vibrations [23]. While this specific simulation is not generalizable, it demonstrates the potential of using ultrasonic acoustic waves for haptic feedback in specialized applications.

1.4.5 CLAW: a multifunctional handheld VR haptic controller (2018)

CLAW is a virtual reality controller that enhances traditional controller capabilities by providing force-feedback and actuated movement specifically to the index finger [24]. It aims to provide feedback for three different types of interactions: touching, grasping, and triggering. Vibrations are used to simulate textures when touching, and a servomotor is employed for the grasping and triggering actions.

1.4.6 Haptic revolver: touch, shear, texture, and shape rendering on a VR controller (2018)

The Haptic Revolver shares similarities with the CLAW in terms of features but employs a different approach to achieve the same functionality. It utilizes an actuated wheel that moves up and down beneath the user’s finger to create contact with a virtual surface [25]. As a user’s finger glides along a surface, the wheel spins to generate shear forces and motion feedback. Unlike the CLAW, the Haptic Revolver offers the advantage of user-interchangeable wheels, which can provide various textures, shapes, edges, and active elements to enhance the haptic experience.

1.4.7 Wearable fingertip haptic device for remote palpation: Characterization and interface with a virtual environment (2018)

The wearable fingertip haptic device consists of two primary subsystems that enable simulated palpation (feeling a shape) of virtual objects, conveying their presence and surface stiffness [26]. The first subsystem includes an inertial measurement unit, which tracks the finger’s motion and adjusts the linear displacement of a pad toward the fingertip. The second subsystem controls the pressure in a variable compliance platform using a motorized syringe, allowing for simulation of surface stiffness when touched.

1.4.8 Interactive sculpting using augmented-reality, mesh morphing, and force-feedback: Force-feedback capabilities in an augmented reality environment (2018)

This interactive sculpting prototype integrates force-feedback capabilities with AR to facilitate realtime morphing of geometric surfaces. Its purpose is to provide designers with a direct way to modify component shapes through interaction with virtual representations [27]. The system utilizes a radial basis function (RBF) morpher for realtime computations. It incorporates a camera as an input device and an HMD with OLED screens as an output, effectively creating an AR helmet. The Geomagic Touch X device functions as a haptic interface, enabling users to manipulate virtual objects and experience force-feedback along three spatial directions using its motors.

1.4.9 Muscleblazer: force-feedback suit for immersive experience (2019)

The Muscleblazer suit is a lightweight exo-suit designed for force-feedback in VR [28]. It utilizes solenoid valves and micro-controller boards to activate Pneumatic Gel Muscles (PGMs) and generate flexible and lightweight forces. In conjunction with a VR game, users can engage in shooting enemies using an HTC VIVE controller, the PGMs providing haptic display upon being shot. The suit is adaptable to both VR and AR environments, enabling wireless communication for the generation of force-feedback effects.

1.4.10 Wireality: enabling complex tangible geometries in virtual reality with worn multi-string haptics (2020)

Wireality aims to overcome various challenges in haptic feedback technology. It is a self-contained wearable system that enables precise positioning of individual hand joints in three-dimensional space, using retractable wires that can be programmatically locked [29]. This allows for realistic interactions with intricate geometries, such as wrapping fingers around objects like railings. The device is lightweight, comfortable, and durable, while also being affordable, with a production cost of less than US$50.

1.4.11 Wrist-worn prototypes for XR input and haptics by Meta (2021)

Facebook is actively exploring haptic technology through wrist-worn input devices. These devices are still in the early prototype stage, but Facebook envisions using electromyography (EMG) sensors to track button presses and enable keyboard-less typing by sensing electrical signals in a user’s arm. Previous prototypes from Facebook have included wristbands with inflatable bladders to apply pressure on one’s wrist and vibrating actuators for vibro-tactile feedback. Facebook’s heavyweight involvement in haptics suggests significant advancements in the field may be on the horizon [30].

1.4.12 QuadStretch: a forearm-wearable multi-dimensional skin stretch display for immersive VR haptic feedback (2022)

QuadStretch is a newly developed forearm-worn device designed for VR interaction. It features a compact and lightweight design and utilizes counter-stretching to stretch a user’s skin [31]. This is achieved by moving a pair of tactors, secured to the skin surface with an elastic band, in opposite directions. QuadStretch offers various VR interaction scenarios that showcase its unique characteristics, including intensity-based activities such as boxing and pistol shooting, passive tension and spatial multi-dimensionality in activities like archery and slingshot, and continuity in complex movements like flying and climbing.

1.4.13 Free-range haptic immersive XR (2022)

While most solutions in this section do not provide unrestricted mobility and positional displacement for haptic interfaces with force-feedback in virtual environments, our own project addresses this limitation. We have developed hardware by modifying the 3D Systems Touch haptic device, integrated with software using the Unity game engine and low-level device drivers [32]. This combination enables medium-scale force display in mobile XR applications, allowing enhanced interaction and mobility within virtual environments.


2. Materials and methods

2.1 Problem description

In Ref. [16], a paradigm shift in human-computer interaction was described, highlighting the emergence of devices like the force-feedback haptic stylus. The 3D Systems Touch haptic devices are particularly relevant, offering precise tracking and potential support for major 3D design tools such as Blender, Maya, and 3DS Max. These stylus devices, designed for desktop use, are ideal for immersive CAD experiences due to their accurate force-feedback and high precision.

While there are other wearable force-feedback solutions available, such as the TactGlove [33], SenseGlove Nova [34], and Power Glove [35], stylus-based interfaces have the advantage of being grounded and providing force-feedback in space. However, they impose constraints on the user’s arm movement as the stylus interacts with virtual surfaces. This limitation does not apply to other devices that offer the ability to sense size, shape, stiffness, and motion of virtual objects, but lack positional translation.

Wireality (§1.4.10), a device comparable in functionality to the Touch stylus, was released in 2020. While Wireality effectively addresses many challenges outlined regarding the importance of haptic feedback, its precision may not meet the requirements of 3D modeling and sculpting, and it does not support texture simulation or vibrotactile feedback.

2.2 Prototype design overview

Hence, we took on a mission of transforming a stylus device designed for desktop use into a wearable haptic interface capable of delivering untethered six degrees-of-freedom in a VR setting. This endeavor spawned various challenges, including addressing the ergonomics of a wearable harness, managing weight and power considerations, and overcoming limitations inherent to the haptic device itself [32, 36].

2.2.1 Hardware

Harness—Upon careful consideration and subsequent dismissal of over-the-shoulder arrangements, the decision was made to develop an adjustable platform where the stylus base could be mounted in front of the user. The key component is a user-worn vest, which must possess sufficient strength to bear the weight of all the hardware without compromising comfort or impeding natural movement. A tactical vest initially designed for survival games emerged as a versatile choice. To accommodate the necessary modifications, mounting clips for two sheets of 3-mm-thick aluminum were added to both the front and back sections of the vest, as shown in Figure 1.

Figure 1.

Tactical vest customized for supporting force-feedback device platform: Strap positions, front (A) and back (B).

Front and rear plating—Various alternatives for the plating material, such as acrylic or thinner steel, were considered. However, these materials presented drawbacks: Acrylic and similar options were found to be lightweight but flexible, while thicker steel plates were heavy and rigid. Neither of these extremes was suitable for the intended use case. In contrast, aluminum has properties that strike a balance between the two, so it was deemed the appropriate choice.

The aluminum plate on the back serves the purpose of mounting a full-size 15.6'' laptop, which powers the Meta Quest 2 HMD in “Quest Link” mode. This configuration ensures optimal performance, as the Quest 2 is based on the Android mobile OS platform with hardware that imposes performance limitations. Since Open Haptics, the software development kit (SDK) for 3D Systems devices, is not compatible with Android, it is necessary to drive the haptic device separately.

3D Systems Touch platform—A perpendicular attachment is made on the front aluminum plate, utilizing another 3-mm-thick plate as the base for the stylus assembly, which acts as a cantilever for the wearable device. In the middle of this front aluminum plate, a channel is cut along its length, allowing the stylus to be adjusted closer to or further away from the user’s torso. This ergonomic feature accommodates users of varying height and arm length, ensuring a comfortable fit. To ensure electrical insulation between the aluminum and electronic components of the assembly, the stylus device itself is mounted on a sheet of laser-cut 3 mm-thick acrylic.

Stylus assembly modifications—Modifications were also necessary for the stylus assembly specifically tailored to this application. Originally, the base of the Touch stylus contained weights to prevent it from tipping during desktop use. These weights were removed, as the cantilever provides secure enough mounting for the user to perceive no vertical flexing of the shelf and almost no horizontal flex. Further adjustment included moving the Touch main board from the bottom of the stylus’s base assembly to the bottom of the shelf while preserving the adjustment feature. These (warranty-voiding) modifications and platform structure are shown in Figure 2.

Figure 2.

Force-feedback device modifications: (A) device platform; (B) acrylic sheet under the base; (C) control board—bottom view; and (D) inside the modified device.

Power delivery—The wearable assembly comprises three primary devices, necessitating the development of suitable power delivery systems. The power supplies for the HMD and laptop were utilized in their original arrangement, as these devices come equipped with their own internal batteries. However, power for the stylus servomotor is supplied by an external power bank securely attached to the user’s waist. Figure 3 shows an abstract representation of the entire system, while Figure 4 shows actual use.

Figure 3.

Hardware assembly as worn—Side view (profile).

Figure 4.

Hardware assembly as worn—Photo.

2.2.2 Software

The implemented software is divided into two main subsystems, as shown in Figure 5.

Figure 5.

Wireline software architecture.

Virtual reality—The immersive environment is streamed from the laptop, mounted on the rear of the vest, to the HMD. As mentioned earlier, the Meta Quest 2 is a standalone device based on the Android platform. However, due to incompatibility with the Open Haptics SDK, the Quest must be operated in Link mode, which transforms it into a tethered HMD. Unlike some other HMDs, the Quest 2 utilizes inside-out tracking, eliminating the need for stationary “lighthouse” sensors in the user’s space, as required by older devices such as the HTC Vive or Oculus Rift.

The VR environment is created in the Unity game engine, leveraging the Oculus XR plugin and the Mixed Reality Toolkit (MRTK). The Oculus XR plugin handles lower-level functionality such as stereoscopic rendering for the HMD, the Quest Link feature (which essentially turns the Quest into a thin client of the PC), and input subsystems that provide controller support and HMD tracking [37]. The MRTK encapsulates these low-level features and extends them with hand-tracking capabilities and other features such as gesture-specified teleportation, ray-cast reticle operation, and physics-enabled hand models [38].

Haptic force-feedback—Software control of the 3D Systems stylus is facilitated through the Open Haptics for Unity plugin, which allows for integration of various 3D haptic interactions in the Unity environment. The plugin comprises several components, including the Quick Haptics micro API, Haptic Device API (HDAPI), Haptic Library API (HLAPI), Geomagic Touch Device Drivers (GTDD), and additional utilities [39, 40].

The structure of the Open Haptics plugin for Unity differs from the native version. Instead of using Quick Haptics for haptics and graphics, this package employs the OHToUnityBridge DLL to establish communication between the Unity controller, the HapticPlugin script written in C#, and the HD/HL APIs written in the C programming language. Analysis of the dependencies of OHToUnityBridge.dll revealed that this library directly invokes the HD, HL, and OpenGL libraries, without relying upon Quick Haptics [39, 40].
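To make this layering concrete, the following is a minimal sketch of how a Unity-side C# script might exchange data with a native haptics bridge through P/Invoke. The library name (HapticBridge) and entry points (GetStylusPose, SetAnchorForce) are illustrative assumptions for exposition, not the actual OHToUnityBridge.dll interface.

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

// Hypothetical illustration of a managed-to-native haptics bridge via P/Invoke.
// The library name and entry points are assumptions for exposition only;
// they do not reproduce the actual OHToUnityBridge.dll interface.
public class HapticBridgeSketch : MonoBehaviour
{
    [DllImport("HapticBridge")] // assumed native bridge library
    private static extern bool GetStylusPose(double[] position, double[] rotation);

    [DllImport("HapticBridge")] // assumed entry point for commanding forces
    private static extern void SetAnchorForce(double x, double y, double z);

    private readonly double[] pos = new double[3];
    private readonly double[] rot = new double[4]; // quaternion

    void Update()
    {
        // Pull the latest stylus pose reported by the native servoloop...
        if (GetStylusPose(pos, rot))
        {
            transform.localPosition = new Vector3((float)pos[0], (float)pos[1], (float)pos[2]);
            transform.localRotation = new Quaternion((float)rot[0], (float)rot[1], (float)rot[2], (float)rot[3]);
        }

        // ...and push back a restoring force derived from Unity-side collisions.
        Vector3 force = ComputePenaltyForce();
        SetAnchorForce(force.x, force.y, force.z);
    }

    private Vector3 ComputePenaltyForce()
    {
        // Placeholder: a full implementation would compute this from penetration
        // depth and the material stiffness of touched colliders.
        return Vector3.zero;
    }
}
```

In the actual plugin, the HapticPlugin script plays this mediating role, with the native servoloop running at a much higher rate than Unity’s frame loop.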

2.2.3 Environment description

The rest of this section outlines the four primary components of the engineered prototype [32, 36].

  1. Scene loader—The Unity scene labeled as the “scene loader” is not directly accessible for selection by the user. Instead, it functions as a container, encompassing not only objects from the sub-scenes but also serving as a wrapper for the application’s scene management system.

  2. Scene selector—As illustrated in Figure 6, upon launching the demo application, the initial “splash” scene known as the “Scene Selector” is loaded additively into the Scene Loader (a minimal sketch of this additive loading scheme follows this list). This scene includes various selectors that gather information regarding the intended use of the main sub-experiences. The user is presented with pillars topped with buttons, a lever, and a canvas containing touchable User Interface (UI) buttons.

    The lever serves the purpose of indicating the user’s chirality, determining the dominant hand to be used with the haptic stylus. On the canvas, there are interactable entries representing previously saved states of sculpting and carving sessions. When a session concludes, its state is saved and can later be reloaded, allowing the user to resume exploration.

    The left button, labeled “Haptic Sandbox,” is used to load one of the sub-experiences, while the button labeled “Sculpting & Carving” is used to load the other sub-experience, whose purpose is further explained below. In this scene, the user relies on hand tracking, gestures, and physics to interact with the virtual environment. Additionally, a palm-up “put-myself-there” gesture teleports the user within the designated play area.

  3. Haptics sandbox—As shown in Figure 7, the “haptics sandbox” offers the user a range of haptic simulations comprising five distinct stages, showcasing different capabilities of the haptic mechanism.

    The first stage presents a block shape-sorting experience. Users can teleport to an area in front of a table with various shapes to be inserted into corresponding cutouts in a pre-cut receptacle. Each block has a unique shape and a unique correct cutout, resembling a children’s toy. Positioned comfortably in front of the desk, users can grasp the physical haptic stylus, mirrored by a congruent shape in the virtual space. Real physical movements are then projected into the virtual environment. When users palpate a “touchable” object in the virtual space, the haptic device responds by mechanically locking or constraining movement and rotation, simulating contact with a physical object. Additionally, blocks can be picked up by pressing a button on the stylus, virtually grabbing them, and lifting them. Simulation of weight and mass of virtual objects is achieved by the haptic plugin’s ability to translate Unity material physical properties into attributes and parameters processed by the haptics engine, which controls the servomotors in the assembly.

    Another stage features two balloon-like sculptures. One sculpture consists of outer and inner spheres. The outer layer represents a relatively weak material that can be punctured with a certain amount of force, akin to popping a balloon with a pin. Once the outer layer is pierced, the stylus encounters a solid, impenetrable inner sphere, allowing users to feel its shape across the surface. When users want to pull back out of the outer layer, they must exert the same amount of force as when popping in. The other sculpture, instead of resisting touch, attracts the stylus to its surface and restricts stylus movement to match its shape. This sensation is akin to dragging a magnetic stick over a metallic surface. Sliding the stylus tip across the surface is effortless, but to detach it, a certain amount of force must be applied to overcome the “stickiness.” If the force exceeds the virtual-magnetic attraction, the stylus breaks away from the spherical shape.

    At the next stage table, users are presented with two angled boards, representing virtual materials simulating glass and wood. This experience highlights the haptic device’s ability to simulate textures and smoothness. By teleporting to the desk area, users can use the stylus to touch these two boards, comparing the tactile sensations.

    Lastly, a table featuring five differently colored capsules is presented to the user. Each capsule represents a distinct tactile effect, allowing users to experience elastic springiness, viscosity, vibration, constant force, and friction effects. As a bonus feature, users also have the opportunity to palpate the Stanford bunny, a commonly used model for benchmarking 3D software due to its representative nature.

  4. Sculpting and carving—The second available space for the user to enter is the “Sculpting and Carving” scene. Users have two options: They can either select a previously saved instance of the scene on the canvas and load it by pressing the physics-reactive button, or if no saved instance is selected and the button is pressed, the simulation starts from a fresh state with no preloaded model. Upon loading, users are initially presented with a floating panel, an empty plane, and a button pillar that allows them to return to the scene selection.

    The floating panel consists of four buttons, each corresponding to one of four basic shapes: cube, cylinder, sphere, or plane. These shapes can be manipulated in terms of scale and orientation using bimanual manipulation techniques interpreted by the MRTK, described earlier in §2.2.2. These manipulations are performed through ray-casting, whereby users align a reticle over an object and then interact by grabbing a corner for scale manipulation or the center of an edge for horizontal or vertical axis rotation. If an object is positioned too far beyond the near field to be directly grabbed by hand, users can resort to ray-casting-based interactions by pinching their fingers when a reticle beamed from one’s hand collides with the object. These interaction methods are illustrated in Figure 8. Additionally, every object is touchable, and the haptic stylus can trace the shape of each object. However, for positional translation over long distances, users must be proximate to the object, as the stylus’s mechanical limitations restrict its arm from achieving large positional movements.

    In addition to the objects, a plane located in front of the button pillar can be shaped using the haptic stylus. When users teleport close to the plane, they can place the stylus on top of it, press the primary selection button, and directly manipulate its surface, as shown in Figure 9.
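Returning to the scene management described in item 2 above, the following is a minimal sketch of the additive loading scheme. The scene names (“SceneSelector”, “HapticSandbox”, “SculptingCarving”) are assumptions; the idea is that the persistent Scene Loader stays resident while sub-scenes are loaded and unloaded additively around it.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Minimal sketch of the additive scene management described in item 2 above.
// Scene names are assumptions; the persistent Scene Loader scene stays
// resident while sub-scenes are loaded and unloaded additively around it.
public class SceneLoaderSketch : MonoBehaviour
{
    private string currentSubScene = "SceneSelector"; // assumed splash-scene name

    void Start()
    {
        // Load the splash scene additively into the always-resident loader scene.
        SceneManager.LoadScene(currentSubScene, LoadSceneMode.Additive);
    }

    // Called by the pillar buttons to switch between sub-experiences,
    // e.g. SwitchTo("HapticSandbox") or SwitchTo("SculptingCarving").
    public void SwitchTo(string subScene)
    {
        StartCoroutine(SwapScenes(subScene));
    }

    private IEnumerator SwapScenes(string subScene)
    {
        // Unload the current sub-scene, then bring in the requested one.
        yield return SceneManager.UnloadSceneAsync(currentSubScene);
        yield return SceneManager.LoadSceneAsync(subScene, LoadSceneMode.Additive);
        currentSubScene = subScene;
    }
}
```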

Figure 6.

Scene selector (splash scene).

Figure 7.

Haptics sandbox scene.

Figure 8.

Immersive haptic modeling: (A) object instantiation; (B) ray-casting-based interaction; (C) object scaling; and (D) collision-based interaction.

Figure 9.

Virtual sculpting.


3. Experimental validation

3.1 Pilot experiment

In this section, we outline two conditions under which participants in a subjective experiment experienced haptic feedback. The purpose of these tests was to identify shortcomings in the current implementation and to assess whether the ambulatory design enhances perceived immersion.

3.1.1 Conditions: Seated (with desktop PC monitor) and mobile VR (with HMD)

The baseline condition involved connecting the haptic device to a standard desktop computer with a monitor. In this seated experience, participants played a game of Jenga where they could pick up and remove small blocks. These blocks were affected by physics, including simulated gravity, so pushing the virtual tower with the stylus would cause it to react accordingly, such as shifting or toppling.

In contrast, the room-scale (ambulatory) condition involved each participant donning our harness (with guidance) for the immersive experience and its various segments, as described in §2.2.3.

3.1.2 Procedure and controls

In the baseline condition, participants were initially introduced to the haptic device, including its controls and expected behavior. They were then shown a demonstration of a game of Jenga. This task had no time limit or specific quantitative objective; it aimed to familiarize each participant with the concept of using haptic feedback devices in a desktop setting.

In the room-scale condition, testers were first introduced to the combined VR headset and force-feedback stylus used in the previous segment. They were then guided through the haptic sandbox, briefly introducing each segment. Similarly, when transitioning to the sculpting and carving scene, subjects received instructions on controls and capabilities, and were given the opportunity to explore its features. There were no time limits, quantitative objectives, or specific goals for these tasks.

Throughout the experiment, each participant experienced the same desktop scene on the same computer, with the same monitor, and in the same seating position. Additionally, every ambulatory tester wore the same harness, laptop, and VR headset, and all had the opportunity to explore the same sections of the haptic sandbox in the room-scale condition.

3.1.3 Participants

A total of 8 individuals took part in the pilot experiment, with ages ranging from 19 to 35. Participants comprised six males and two females. Their levels of experience with VR varied, ranging from no experience at all to owning an HMD and occasionally using it. This diversity allowed us to observe how intuitive the experience was and to determine if inexperienced users required more guidance than experienced users. Regarding haptic devices, participants had varying levels of experience, but the majority had no experience or had only tried them a few times. Regardless of prior experience, the need for guidance was similar due to the novelty of the featured system. This suggests that participants required more instruction to operate the haptic stylus than to use the headset alone. However, once basic interactions were explained, no further guidance was needed.

3.1.4 Data acquisition and composition

Following the experience with both setups, participants were given a questionnaire to complete. The questionnaire consisted of assertions related to their impressions, and participants were asked to indicate their level of agreement on a quantified Likert scale. Furthermore, participants were asked a few additional questions regarding prior experience with haptic devices, virtual reality, and CAD software.

3.1.5 Results

Participants’ quantified impressions of the immersiveness of the desktop experience varied, but we were able to conclude that users generally had neutral or slightly positive feelings of immersion while playing the simulated Jenga game. Some concerns were raised about the weight of the setup, and several participants indicated a comfortable wearability time of a quarter to half an hour. Female participants specifically noted that the front of the harness should be softer by adding more padding between the vest and the aluminum plating. Overall, the response was more neutral than negative. It was encouraging to confirm that the simulation of tactile feeling was considered close to a realistic sensation. Most users agreed that the simulation provided an experience comparable to natural touch sensations. However, some users expressed disappointment with the device intensity, reporting that certain effects were not strong enough to be on par with real-life sensations.

When participants were asked about immersiveness of the VR mode, the feedback was overwhelmingly positive. Despite the weight and occasional discomfort of the harness, the sensation of immersion was not diminished. Comparing feedback on the form-factor immersiveness between the immersive and the desktop modes clearly showed that the combination of VR and haptic force-feedback contributed to overall immersion. Most users found the combination of inputs usable but somewhat tricky. A rotational reset function had been provided to help users realign the virtual stylus with its real-world affordance after teleportation. However, many participants noted that the congruence between the virtual stylus and its real-world counterpart sometimes drifted, and the rotational reset function could not be fully relied upon. Hardware limitations of the stylus and perceived inadequate intensity of haptic effects resulted in no participant choosing “Natural” as their impression.

Overall, hand-tracking was regarded as quite accurate, but the transition between using the haptic stylus and returning to using one’s dominant hand for gestures required users to hide their hand and then look at it again to resume the hand-tracking mode. Although participants considered this awkwardness as something they would “get used to,” it will be addressed in future versions. All participants agreed that this type of device arrangement could be used for CAD applications. When asked to rate overall experience on a scale from 1 to 10, participants provided quite positive feedback. The quality of our proof-of-concept received an average score of 8.5 out of 10. Any score below 5 would be considered unsuccessful, so achieving “success” with our prototype was gratifying. However, there is still ample room for improvement. The feedback we received in the form of complaints, compliments, and suggestions for future refinement and expansion was invaluable [32, 36].

3.2 Performance experiment

In a subsequent experiment conducted several months after the previous one, our focus shifted from characterizing absolute performance to confirming that the ambulatory performance was at least on par with the performance measured under the fixed condition. The hardware used in this follow-up experiment remained the same, and no subjects participated in both experiments.

3.2.1 Conditions: Seated (with desktop PC monitor) and mobile VR (with HMD)

The baseline condition closely resembles that discussed in §3.1.1, with the inclusion of the block-sorting game as outlined in §2.2.3. However, the Unity application was re-engineered to ensure precise correspondence with the room-scale condition. The composition of this scene is shown in Figure 10.

Figure 10.

Performance experiment scene: both desktop and immersive conditions.

As mentioned in §3.1.1, the room-scale setup involved equipping each participant with our harness. However, instead of allowing them to independently explore the features, they were instructed to complete a set of predetermined tasks, as described below.

3.2.2 Procedure and controls

For the Jenga game, participants were instructed to remove as many blocks as possible from the tower within a 4-minute time limit without causing it to topple. If the tower toppled, they had the option to reset and start over. The highest number of removed blocks achieved from any number of attempts was recorded, along with the number of resets.

Similarly, in the shape-sorting game, testers were allotted a 4-minute time interval to sort the complete set of shapes. The score was incremented only if all shapes were successfully sorted, preventing players from selectively sorting only easier shapes.

Participants were divided into two groups: One group experienced only the ambulatory segment, while the other group exclusively engaged in the desktop experience. This partition was implemented to avoid learning effects and biased results favoring either of the two conditions based on a tester’s increased experience through the experiment.

The measured segment lasted approximately 10 minutes, excluding introduction of the experiment to each participant, the warm-up session, and questionnaire completion. The warm-up session took about 2 minutes for each segment (4 minutes in total), and answering the questionnaire required up to 10 minutes. Overall experiment duration ranged from 20 to 30 minutes per participant.

In the desktop segment, each participant used the same computer and monitor and maintained the same seating position (with only seat height adjusted to align the monitor with each tester’s eye level).

During the warm-up period of the ambulatory segment, our focus was primarily on adjusting the harness to ensure participant comfort and prevent any discomfort that could potentially affect results. Additionally, it provided an opportunity for the subject to become familiar with the new interface.

3.2.3 Participants

A total of eight adult participants volunteered for the ambulatory experiment. Among them, 3 (37.5%) were aged between 18 and 25, while the remaining 5 (62.5%) were aged between 26 and 35. In terms of gender distribution, 6 (75%) were males and 2 (25%) were females.

Similarly, eight adults took part in the desktop version of the experiment. Among them, 7 (87.5%) were aged between 18 and 25, and 1 (12.5%) was aged between 26 and 35. In this group, 5 (62.5%) were males and 3 (37.5%) were females.

All participants received compensation for their participation, receiving ¥1000 (approximately $8) for a half-hour session. All participants were right-handed and had a background in Computer Science or Software Engineering, making them well-versed in standard human-computer interaction practices. However, some participants had no previous experience with VR. They were introduced to the basic concepts and usage of VR by allowing them to walk in Meta Home, and they were shown how to enable the pass-through “Guardian” boundary functionality of the Quest 2 HMD (analogous to “Chaperone” on SteamVR systems), which provides an optical representation of one’s physical environment using camera capture and video see-through, reassuring them about the minimal risk of accidental collision with real objects.

3.2.4 Data acquisition and composition

Data was collected by the experimental supervisor through direct observation using a stopwatch and Google Forms to record scores. In addition to the recorded data, each subject was asked to complete a User Experience Questionnaire (UEQ) after each measured segment. The UEQ consisted of 26 pairs of bipolar dichotomies, such as “complicated–easy” and “inventive–conventional.” Participants indicated their evaluation of the User Experience (UE) on a quantized Likert scale ranging from 1 to 7 [41, 42].

Furthermore, subjects were also asked to respond to 36 questions selected from the multidimensional Intrinsic Motivation Inventory (IMI) scale. Participants confirmed or contradicted IMI statements, such as “I was pretty skilled at this activity” and “This activity was fun to do,” by indicating a level of agreement on a 7-step scale ranging from “not true at all” to “very true” [43].

The combination of observed data, scores, UEQ responses, and IMI assessments provided a comprehensive overview of the participants’ experiences and subjective evaluations.

3.2.5 Results

Results of the UEQ were analyzed and categorized into six dimensions: Attractiveness, Dependability, Efficiency, Novelty, Perspicuity, and Stimulation. Scores were derived from a zero-centered seven-point scale (−3 to +3). Statistical analysis using the ANOVA method with the ‘ez’ library in R revealed no significant differences between the desktop and ambulatory versions of our application. However, when benchmarked against data from 21,175 individuals in 468 studies on various products, the desktop segment scored slightly lower than the ambulatory segment. Detailed results can be found in Table 1A and Figure 11.

(A) UEQ results (p < 0.05)
Attractiveness: F(1, 14) = 0.3378; p = 0.5703
Dependability: F(1, 14) = 0.0068; p = 0.9356
Efficiency: F(1, 14) = 0.0333; p = 0.8578
Novelty: F(1, 14) = 0.0000; p = 1.0000
Perspicuity: F(1, 14) = 0.7221; p = 0.4098
Stimulation: F(1, 14) = 0.2355; p = 0.6350

(B) IMI results (p < 0.05)
Effort and Importance: F(1, 14) = 0.0029; p = 0.9574
Interest and Enjoyment: F(1, 14) = 0.0204; p = 0.8886
Perceived Choice: F(1, 14) = 0.1639; p = 0.6917
Perceived Competence: F(1, 14) = 1.4299; p = 0.2516
Pressure and Tension: F(1, 14) = 1.4246; p = 0.2525
Value and Usefulness: F(1, 14) = 0.2056; p = 0.6572

(C) Performance results (Pr < 0.05)
Max. number of removed and stacked Jenga blocks: χ² = 0.1763; Pr(>χ²) = 0.6746
Number of Jenga trials (resets + 1): χ² = 0.0044; Pr(>χ²) = 0.9471
Number of filled shape sorter boards: χ² = 1.9884; Pr(>χ²) = 0.1585

Table 1.

Experiment results: (A) user experience questionnaire; (B) intrinsic motivation inventory; and (C) performance.

Figure 11.

Performance experiment (desktop and ambulatory conditions)—Compiled UEQ results of zero-centered seven-point scale with ordinate axis truncated to positive interval [0,3]. Error bars correspond to confidence interval of 95%.

The IMI was used to assess intrinsic motivation across six subscales: Effort and Importance, Interest and Enjoyment, Perceived Choice, Perceived Competence, Pressure and Tension, and Value and Usefulness. ANOVA analysis of the experimental results indicated no significant differences between the desktop and ambulatory conditions. Results are presented in Table 1B and Figure 12.

Figure 12.

Performance experiment (desktop and ambulatory conditions)—IMI results; error bars correspond to confidence interval of 95%. A single irrelevant question and an irrelevant scale from IMI were excluded from the analysis.

Performance measurements were analyzed using the ANOVA function in R and the Type II Wald χ² test for logistic regression analysis. Three performance metrics were considered: the maximum number of removed and stacked Jenga blocks, the number of Jenga trials (including resets), and the number of filled shape sorter boards. Analysis confirmed that the performance measurements, shown in Table 1C, did not exhibit any significant differences between the two conditions, as indicated by Pr(>χ²) values of 0.6746 (χ² = 0.1763), 0.9471 (χ² = 0.0044), and 0.1585 (χ² = 1.9884), respectively. Since each Pr(>χ²) value is greater than the significance threshold of 0.05, we conclude that there were no significant differences in performance between the two conditions [32, 36].

3.3 Implemented enhancements and extensions

Based on the experiments summarized in §3.1.5 and §3.2.5, several limitations were identified that negatively impact user experience with the prototype. These observations provide valuable insights into areas that require improvement. Although the haptic feedback enhanced immersiveness in this specific use case, it did not improve performance. This indicates that improving performance and user experience could further deepen the sense of immersion and presence. Addressing these concerns would involve improving comfort, enhancing realism of touch sensations, refining input usability, ensuring alignment accuracy, and facilitating smoother transitions between different interaction modes within the system experience.

3.3.1 Reducing weight

Improvement efforts focused on enhancing comfort aspects of the prototype and addressing software-related issues mentioned earlier. Weight reduction was a key objective during this phase. The main goal was to minimize the number of components attached to the harness. We reduced the weight significantly by relocating the laptop computer and the rear support plate, which were originally positioned at the back of the user. Our specific use case required maintaining untethered six degrees-of-freedom movement within the virtual space. Therefore, we needed to shift from wired to wireless connectivity between the Quest 2 HMD and 3D Systems Touch haptic interface.

3.3.2 Establishing wireless connectivity

The transition from Quest Link to Air Link, where an HMD acts as a thin client for a PC, was a relatively straightforward process. However, it required upgrading our development environment, Unity editor, and its Oculus XR libraries to newer versions to ensure smooth and reliable functionality with Air Link. Untethering the haptic device from wired to wireless architecture presented significant challenges. Initially, we explored the option of converting the connection from USB to Bluetooth at the hardware level, but ultimately decided to use USB over a wireless network. To achieve this, we utilized a VirtualHere server running on a Raspberry Pi 3, equipped with a Wi-Fi adapter capable of 5 GHz connection. VirtualHere allows the network to serve as a conduit for transmitting USB signals, effectively allowing USB over IP. This USB server solution is well suited for distributed deployment of USB devices over a local area network (LAN), without the need for a wireline physical connection to a client machine. The USB device behaves as if it were directly connected to, in this case, a laptop computer, even though it is physically plugged into a remote server (Raspberry Pi 3). Consequently, existing drivers and software function seamlessly without requiring special modifications.

However, there is slight latency that impacts our implementation under certain conditions. As explained in §2.2.2, Unity utilizes the OHToUnityBridge.dll library, while vanilla applications from 3D Systems directly interface with the device through the HD/HL framework. When the stylus is driven directly using HD/HL libraries, the latency and reliability of haptic feedback are indistinguishable from when the device is physically connected to the target machine. However, when using an application built in Unity, the additional runtime overhead of the translation layer, which involves converting native API calls through middleware to express physical properties of virtual objects using the stylus, introduces occasional noticeable delays in force-feedback response and subtle stylus jitter. Occurrence of these issues depends upon various factors, such as the environment and Wi-Fi signal quality, which may be beyond our control. Furthermore, even without perceptible issues, a slight degradation in the servoloop frequency, which facilitates bidirectional communication between the application and the device, can be observed when comparing wired and wireless communication. We plan to address these concerns and enhance the interface in future updates by exploring ways to minimize middleware overhead.

3.3.3 Power delivery

Deployment and management of wireless communication between the host laptop, client HMD, and stylus device allows for a reduction in power delivery requirements for all devices involved. Previously, we were limited by the built-in battery of our host machine, which had a capacity of 90 Wh. Through various power-saving strategies, such as CPU undervolting and throttling clock speeds of the CPU and GPU, we could achieve a screen-on time of approximately 60 to 90 minutes. However, by eliminating the need for the user to carry a laptop computer, we can disable all power-saving measures, improve rendering performance, and achieve higher texture quality and visuals within the limitations of the Air Link function of the Meta Quest 2.

The power delivery for the stylus device as described in §2.2.1 remains unchanged, utilizing an external power bank. However, the power bank now needs to supply power to the Raspberry Pi 3 as well, which reduces the stylus device’s power-on time. Previously estimated at approximately 12 hours, system-on time can now be estimated to be a maximum of about 10 hours. Since a 10-hour run-time of this system exceeds presumed continuous session duration for a single user, we decided to utilize the same external power source to extend the battery life of the HMD as well. Considering the average power consumption of the Quest 2 (4.7–7 W), Raspberry Pi 3 (1.3–3.7 W), and the 3D Systems Touch (18–31.5 W), the entire system can be expected to run on a single charge of a 20 Ah battery for approximately 4.5 to 8 hours, effectively quadrupling minimum system up-time. The comfortable usability time, even after the weight reduction and ergonomic enhancements described in §3.3.1, extends only to about 2 hours. Therefore, power requirements do not present a significant limitation for this system. All the hardware changes mentioned above and the overall hardware assembly are illustrated in Figure 13.

Figure 13.

Wireless system assembly diagram—“elevation” (profile).

3.3.4 Enhancing haptic feedback intensity

As previously mentioned, there were concerns regarding the perceived intensity of certain haptic effects. To address this issue, specific adjustments were implemented to improve simulation of physical properties for virtual objects. The simulation process is handled by Unity’s physics engine, which compiles and renders simulated properties into forces exerted by the stylus. To create a more realistic experience when interacting with virtual objects using the stylus, various properties were reviewed and modified.

These properties encompass elements such as the perceived weight of objects, their drag, bounciness, friction coefficients of different materials, smoothness, stiffness, and intensity of damping when the stylus comes into contact with an object. By fine-tuning these properties, our aim has been to enhance the immersive nature of the tactile experience, making it closely resemble real-life interaction. These adjustments enable users to perceive and interact with virtual objects in a manner that aligns with expectations, ultimately providing a more satisfying and engaging haptic and overall experience.
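As a rough illustration of this tuning pass, the sketch below normalizes Unity-side physics properties into a per-object set of haptic rendering parameters. The HapticSurfaceParams struct, the field ranges, and the mapping constants are hypothetical placeholders rather than the Open Haptics plugin’s actual interface.

```csharp
using UnityEngine;

// Hypothetical mapping from Unity physics properties to haptic rendering
// parameters; the struct, ranges, and constants are illustrative assumptions,
// not the Open Haptics plugin's actual interface.
public struct HapticSurfaceParams
{
    public float stiffness;       // 0..1, resistance when the stylus presses in
    public float damping;         // 0..1, how quickly contact oscillations decay
    public float staticFriction;  // 0..1, resistance at rest
    public float dynamicFriction; // 0..1, resistance while sliding
    public float mass;            // kg, used to convey weight when grabbing
}

public static class HapticMaterialMapper
{
    // Derive haptic parameters from a collider's physic material and rigidbody.
    public static HapticSurfaceParams FromUnityObject(Collider collider, Rigidbody body)
    {
        PhysicMaterial m = collider.sharedMaterial;
        return new HapticSurfaceParams
        {
            // Rough proxy: bouncier materials are rendered as slightly softer.
            stiffness = m != null ? 1f - Mathf.Clamp01(m.bounciness) : 0.8f,
            damping = 0.3f,
            staticFriction = m != null ? Mathf.Clamp01(m.staticFriction) : 0.5f,
            dynamicFriction = m != null ? Mathf.Clamp01(m.dynamicFriction) : 0.4f,
            mass = body != null ? body.mass : 0f
        };
    }
}
```

In practice, the tuning described above adjusted these same kinds of properties (weight, drag, bounciness, friction, stiffness, and damping) directly on the Unity objects rather than through a separate mapping component.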

3.3.5 Alignment and reset function for stylus to avatar

Previously, there were challenges in maintaining consistent alignment and synchronization between the user’s virtual avatar and the stylus device. When utilizing locomotion features, such as hand-gesture-initiated teleportation, slight drift or rotational misalignment could occur between the virtual representation of the stylus and its physical counterpart. In the implementation described in §2.2.2, the MRTK was utilized to enable hand-tracking and gesture-operated teleportation. However, teleportation caused the avatar to rotate independently around its vertical (yaw) axis, separate from the orientation of the stylus attached to the avatar. To address this discrepancy, a reset function was introduced, allowing users to realign the stylus and avatar (by simultaneously pressing two buttons on the stylus).

To accommodate the switch to the latest version of the Oculus XR Plugin, departure from the outdated MRTK was necessary. This switch involved reintegration of features previously provided by MRTK into the application. The newer XR Interaction Toolkit was employed for this purpose, aiding in the implementation of the updated locomotion system. From the user’s perspective, the locomotion function operates in a seamless manner, while resolving the issue of independent rotation between the avatar and stylus after each teleportation. The updated system ensures that the user and their stylus face the same direction as prior to initiating each teleportation event.
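A minimal sketch of the realignment idea follows, assuming the stylus anchor and headset live under the same XR rig and that a hypothetical RealignToHeadset() method is invoked after each teleport: it removes the yaw difference between the headset and the stylus anchor so both face the same direction, as the updated locomotion system does implicitly.

```csharp
using UnityEngine;

// Hypothetical post-teleport realignment: rotate the stylus anchor about the
// vertical (yaw) axis so it faces the same way as the headset. Names are
// assumptions; this is not the project's exact implementation.
public class StylusRealigner : MonoBehaviour
{
    public Transform headset;      // HMD (center-eye) transform
    public Transform stylusAnchor; // virtual mount point of the haptic stylus

    // Invoke after each teleportation event (e.g., from the locomotion provider).
    public void RealignToHeadset()
    {
        // Project both forward vectors onto the horizontal plane.
        Vector3 headForward = Vector3.ProjectOnPlane(headset.forward, Vector3.up);
        Vector3 stylusForward = Vector3.ProjectOnPlane(stylusAnchor.forward, Vector3.up);

        // Yaw offset between the headset and the stylus anchor.
        float yawDelta = Vector3.SignedAngle(stylusForward, headForward, Vector3.up);

        // Rotate the anchor about the headset position so the offset vanishes.
        stylusAnchor.RotateAround(headset.position, Vector3.up, yawDelta);
    }
}
```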

3.3.6 Transition between stylus use and hand-tracking

The integration of the XR Interaction Toolkit in Unity applications improved hand-tracking and controller tracking, addressing the problem of inconsistent transitions between stylus and hand-tracking for the user’s dominant hand. These adjustments resulted in smoother tracking accuracy and more natural hand gestures. All the aforementioned software changes and the overall software architecture are illustrated in Figure 14.

Figure 14.

Wireless software architecture.

3.3.7 Mixed reality pass-through

The Meta Quest 2’s Mixed Reality (MR) pass-through feature offers an advanced capability that enables users to integrate their real physical environment into a virtual reality experience. Utilizing built-in cameras of the headset, this feature captures live stereoscopic video of the real world and composites it around the virtual environment. This integration allows users to perceive and interact with ambient surroundings while wearing the VR headset. As a result, virtual objects can occupy the user’s physical environment, providing a blended virtual and real experience.

The MR pass-through feature not only allows users to see their real surroundings but also enables them to navigate their physical space, avoid obstacles, and interact with real objects while immersed in the virtual world. It improves situational awareness, enhancing user safety and reducing collisions with physical objects. Furthermore, it allows real-world elements to be incorporated into virtual experiences for augmented reality effects.
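As a setup sketch (assuming the Oculus Integration package for Unity; the prototype’s actual configuration may differ), pass-through can be composited as an underlay behind the virtual scene by clearing the eye camera to a fully transparent color, which is what makes the camera feed appear as a surrounding “skybox.”

using UnityEngine;

// Setup sketch: start the pass-through feed, composite it behind all virtual
// geometry, and clear the camera to a transparent color so the real-world
// imagery shows wherever nothing virtual is drawn.
public class PassthroughSkybox : MonoBehaviour
{
    void Start()
    {
        OVRManager.instance.isInsightPassthroughEnabled = true;

        var layer = gameObject.AddComponent<OVRPassthroughLayer>();
        layer.overlayType = OVROverlay.OverlayType.Underlay;

        var cam = Camera.main;
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = new Color(0f, 0f, 0f, 0f);
    }
}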

When combined with haptic force-feedback devices such as our prototype or the devices mentioned in §1.4, the MR pass-through feature of the Meta Quest 2 goes beyond traditional VR experiences, offering a multimodal encounter that merges the virtual and physical realms. This combination provides users with heightened sensory stimulation, allowing them to enjoy the realism and interactivity of a virtual environment while remaining fully aware of and engaged with their physical surroundings. The result is a captivating and immersive experience that extends the boundaries of what is possible in purely virtual reality.

Figure 15 shows a monocular view of the MR pass-through from within the XR environment.

Figure 15.

The MR scene (monocular left half of binocular view) showcases virtual shape sorter and Jenga desks overlaid upon photographic pass-through imagery rendered as a skybox.


4. Future work

Currently, the haptic stylus’s base is not explicitly tracked; instead, it is positioned based on the user’s height and arm length. The distances from the user’s torso (X-axis) and chin (Y-axis) parameterize the Unity scene, and the virtual representation of the stylus is offset from the anchor point of the headset. As a result, the position of the stylus’s base is referenced to the HMD’s position within the scene and is confined to the play-space limited by the Quest 2’s Guardian mechanism. This setup poses a limitation: when a user leans sideways without moving their hips, the virtual stylus moves along with the head, while the real stylus remains in place, creating a tracking disconnect (a simplified sketch of this placement is given at the end of this section). To address this limitation in the future, an additional pair of cameras for image or object recognition could be utilized. By incorporating technologies such as OpenCV or other image-processing frameworks, we could analyze the real space surrounding the user and estimate the true position of the haptic interface. This improvement would enhance overall tracking accuracy and provide a more realistic experience for users.

Furthermore, the current system only allows tactile perception of virtual elements that are pre-made and part of the scene. Future improvements involving depth cameras or other environment-scanning technologies could enable real-time rendering of “aftermarket” real-world objects into simplified virtual representations. This extension would create elements within the realm of augmented virtuality, blending real and virtual objects. This concept has been partially tested by scanning a laboratory environment using an iPad Pro and its LiDAR sensor, followed by post-processing in Blender and import into a Unity scene. The potential outcome of such developments is the ability to immerse the user in a “portable room” regardless of actual physical location. This would open up exciting possibilities for various applications, such as remote collaboration, training simulations, and interactive experiences that combine the virtual and physical worlds.
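For illustration, the HMD-referenced placement described above might look roughly like the sketch below; the proportions and offsets are placeholders, not the values used in the prototype, and they show why leaning the head moves the virtual base even though the physical device stays put.

using UnityEngine;

// Illustrative sketch of the untracked base placement: the virtual base follows
// the headset anchor with offsets derived from the user's height and arm length,
// which is why it moves with the head even when the physical device does not.
public class StylusBasePlacement : MonoBehaviour
{
    public Transform hmdAnchor;     // head anchor of the XR rig
    public Transform stylusBase;    // virtual counterpart of the device base

    public float userHeightM = 1.75f;   // entered per user (placeholder values)
    public float armLengthM = 0.65f;

    void LateUpdate()
    {
        // Keep only the yaw of the head so tilting does not rotate the base.
        Vector3 flatForward = Vector3.ProjectOnPlane(hmdAnchor.forward, Vector3.up).normalized;

        float forward = armLengthM * 0.6f;   // roughly a forearm in front of the torso (X)
        float down = userHeightM * 0.12f;    // roughly chin level below the headset (Y)

        stylusBase.position = hmdAnchor.position + flatForward * forward + Vector3.down * down;
        stylusBase.rotation = Quaternion.LookRotation(flatForward, Vector3.up);
    }
}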


5. Conclusions

In addressing the challenges outlined in §2.1, we developed a free-range haptic interface that enables untethered six degrees-of-freedom in virtual reality, providing small- and medium-scale force-feedback. Through the transformation of a 3D Systems Touch haptic device into an ambulatory version and the development of supporting software, we achieved literally tangible results in haptic technology. This innovation allows for immersive experiences in fields such as CAD, gaming, and virtual simulations, presenting previously intangible phenomena in a palpable manner. The integration of spatially flexible force-feedback displays offers new possibilities, such as ambulatory interaction with extended springs or realistic simulation of organ transplants through haptic force-feedback in open space.

To evaluate the effectiveness of our solution, we conducted two experiments. The first assessed the precision, ergonomics, stability, and usability of our hardware and software, revealing certain deficiencies. Despite these limitations, the overall results indicated the potential of combining haptic force-feedback and ambulatory VR to enhance immersion in free-range virtual environments. The second experiment focused on user experience and performance, comparing the ambulatory setup to the traditional stationary version. No significant differences were found across the measured dimensions. Considering the limitations and challenges identified in the first experiment, this absence of significant differences should not be regarded as a negative outcome. Instead, it suggests that the ambulatory setup could surpass the traditional desktop version in terms of user experience and performance once the hardware and software issues are addressed.

Moreover, we made substantial improvements to the wearable force-feedback mechanism and incorporated an MR pass-through feature. These enhancements encompassed weight reduction, wireless connectivity, power delivery, haptic feedback intensity, stylus alignment, and smooth transitions between stylus use and hand-tracking. The introduction of MR pass-through has been particularly impactful, as it allows users to merge real-world environments with augmenting virtual elements. This integration softens the boundaries between the virtual and physical realms, creating a coherent multimodal experience. The prototype, refined with these advancements, holds encouraging potential for further exploration in MR applications, presenting new opportunities for interactive immersive experiences.


Acknowledgments

We express our gratitude to Julián Alberto Villegas Orozco, Camilo Arévalo Arboleda, Georgi Mola Bogdan, Isuru Jayarathne, Wen Wen, James Pinkl, and our experimental subjects for their valuable support and feedback.


Abbreviations

API: application programming interface
AR: augmented reality
CAD: computer-aided design
DLL: dynamic-link library
EMG: electromyography
HCI: human-computer interaction
HDAPI: Haptic Device API
HLAPI: Haptic Library API
HMD: head-mounted display
IMI: Intrinsic Motivation Inventory
LAN: local area network
MR: mixed reality
MRF: magnetorheological fluid
MRTK: Mixed Reality Toolkit
PGM: pneumatic gel muscle
RBF: radial basis function
SDK: software development kit
TUI: tangible user interface
UE: user experience
UEQ: User Experience Questionnaire
UI: user interface
USB: Universal Serial Bus
VE: virtual environment
VR: virtual reality
XR: extended reality

