
Haptic virtual reality assembly – Moving towards Real Engineering Applications

Written By

T. Lim, J.M. Ritchie, R. Sung, Z. Kosmadoudi, Y. Liu and A.G. Thin

Published: 01 April 2010

DOI: 10.5772/8695

From the Edited Volume

Advances in Haptics

Edited by Mehrdad Hosseini Zadeh


1. Introduction

The use of virtual reality (VR) in interactive design and manufacture has been researched extensively, but its practical application in industry is still very much in its infancy. Indeed, one would have expected that, after some 30 years of research, commercial applications of interactive design or manufacturing planning and analysis would be widespread throughout the product design domain. Similarly, investigations into virtual environments (VE) for assembly and disassembly tasks have been carried out for many years. Given the availability of moderately priced high-performance computing technology, many of these virtual manufacturing interfaces, which stimulate only the visual senses, have made actual physical contact during product development an increasingly rare occurrence.

“We’re losing that tactile feel that we had before, and now we’re trying to bring it back.” Mike Levin, Vice President, Immersion Corporation (Immersion Corporation, 2008).

The first haptic device was developed and commercialised in the early 1990s (Salisbury et al., 1995). Today, haptics exists in many forms, from electronic handheld devices to tele-operated robots. Yet outside of the research and engineering community, haptics remains a virtually unknown concept.

How will haptics and VR change the way we interact with the virtual world, and how will they influence the way application developers and users (e.g. engineers, doctors, gamers) embrace the digital era? Already, entertainment and emerging online social networks have richly rendered 3D environments such as Second Life (Linden Lab, 1999). What these environments lack, though, is the ability to navigate, manipulate and feed back 3D information kinaesthetically.

Virtual reality is a better understood concept with equally extensive research behind it. However, one of the major but less well known advantages of VR technology pertains to data logging. For engineering purposes, logging the user provides rich data for downstream use: to automatically generate designs or manufacturing instructions, analyse design and manufacturing tasks, map engineering processes and, tentatively, acquire expert knowledge (Ritchie et al., 2006). The authors feel that the benefits of VR in these areas have not been fully disseminated to the wider industrial community, and, with the advent of cheaper PC-based VR solutions, a wider appreciation of the capabilities of this type of technology may encourage companies to adopt VR solutions for some of their product design processes. It is envisaged that the notion of unobtrusive logging can similarly be applied to other domains.

This chapter will describe applications of haptics in assembly demonstrating how user task logging can lead to the analysis of design and manufacturing tasks at a level of detail not previously possible; as well as giving usable engineering outputs. The study involves the use of a haptic feedback device (Phantom, Sensable Technologies, 1993) and a 3D system to analyse and compare this technology against real-world user performance. Through detailed logging of tasks in a haptic VR environment the study shows considerable potential in understanding how virtual tasks can be mapped onto their real world equivalent as well as showing how haptic process plans can be generated. The chapter also investigates methods to quantify how the provision of haptic feedback affects user performance, the enhancements from a physiological perspective and whether, through an association with game-based approaches, the working environment can be made more engaging. The chapter concludes with a view as to how the authors feel that the use of haptic VR systems in product design and manufacturing should evolve in order to enable the industrial adoption of this technology in the future.


2. Background

Various researchers have investigated sense-of-presence measurements, simulation validity and human performance in an effort to assess the effectiveness of force-feedback VR applications.

A classic example relates to peg-in-hole insertion operations. Insertion operations are an important aspect of assembly. Tight tolerances between the two objects involved in the insertion, and the associated positioning accuracies, require some level of compliance, trajectory and force control. Ho and Boothroyd (1979) studied the insertion of a peg into a hole and the placing of a part with a hole onto a peg. Their objective was to elicit chamfer designs that would minimise insertion times and, hence, overall assembly times. Rosenburg (1994) carried out an empirical study in which participants were asked to execute a peg insertion task through a telepresence link with force feedback. Five different haptic overlays were tested, including virtual surfaces, virtual damping fields, virtual snap-to planes and snap-to lines. The results indicated that human performance was significantly degraded when telepresence manipulation was compared to direct in-person manipulation. However, by introducing abstract haptic overlays into the telepresence link, operator performance could be restored closer to natural in-person capabilities. The use of 3D haptic overlays was also found to double manual performance in the standard peg-insertion task.

In the mid-1990s, commercial force feedback interfaces appeared, such as the Phantom arm (Massie & Salisbury, 1994), which allows user interaction with virtual environments through a stylus. Gupta et al. (1997) investigated the benefits of multimodal simulation using VE technology for part handling and insertion compared to conventional table-based methods, as presented by Boothroyd et al. (2002). Their results showed that assembly task completion time increased in proportion to the complexity of the assembly operations required. However, the measured times were roughly double those required to carry out the operation in the real world. Although they employed two haptic arms, their study was restricted to 2D simulations of the insertion operation. Significantly for the work reported in this chapter, the authors speculated that one of the contributory factors to task completion time was the lack of co-location.

For human-computer interaction (HCI), Fitts' Law (Fitts, 1954) has generally been used as a quantitative means of measuring the performance of human motor control in simple tasks. Fitts derived a quantitative predictor of the movement time needed for the successful completion of 2D targeting peg-in-hole-type tasks. There was, however, no consideration of shape at any stage.

Bayazit et al. (2000) reported that the lack of truly cooperative systems limits the use of haptic devices involving human operators and automatic motion planners. They presented a 'hybrid' system that uses both haptic and visual interfaces to enable a human operator and an automatic planner to solve a motion planning query cooperatively. By manipulating a virtual robot attached to the Phantom haptic device, a sequence of paths was generated and fed to the planner. Haptic interaction comprised tracking user motion, detecting collisions between the haptic probe and virtual objects, computing reaction forces, and rendering forces. An obstacle-based probabilistic roadmap method was used in conjunction with a C-Space toolkit to filter the haptically generated paths and generate collision-free configurations for the robot.

Unger et al. (2001) described an experimental arrangement for comparing user performance in real and virtual 3D peg-in-hole tasks. The task required inserting a square peg into a square hole via a six-degree-of-freedom magnetic levitation haptic device with visual feedback. The goal was to understand human manipulation strategies. Their results indicate that the haptic senses can discriminate between very fine forces and positions; however, overall task performance was found to be best with real objects.

The sensory feedback capability of haptics lends itself naturally to tasks that require manual manipulation. Adams et al. (2001) conducted experiments to investigate the benefits of force feedback for VR training of assembly tasks. Three groups of participants received different levels of training (virtual with haptics, virtual without haptics, and no training) before assembling a model biplane in a real world environment. Their results indicated that participants with haptic training performed significantly better than those without.

The Haptic Integrated Dis/Re-assembly Analysis (HIDRA) system is a test bed application focused primarily on the simulation of assembly procedures with force feedback (Coutee et al., 2001). The intention was to provide a development perspective relevant to haptically enabled simulations. The research efforts of Seth et al. (2005) fall into a similar assembly/disassembly category of analysis via visualisation with haptic force feedback. These reported examples are particularly useful for applications that provide tactile information regarding assemblability at the design stage. However, there is little evidence of data being logged in order to output assembly instructions.

In a similar vein to the research here is the work by Gerovichev et al. (2002) on the evaluation of visual and haptic feedback for training needle insertion tasks in medicine. Promising results showed that the addition of force cues, along with real-time visual feedback, improved performance.

Yoshikawa et al. (2003) presented a methodology for observing human skill in a virtual space configured with haptic technology. A comparison between a real world square peg-in-hole task and a 2D simulation was performed. The virtual space incorporated dynamics and surface friction characteristics. Results indicated that the stability of the haptic system can be improved with analogue circuitry so that human skills are better represented in the virtual environment. Bashir et al. (2004) developed a system to assess the influence of equipment and its arrangement on force, visual and augmented feedback, and how this influences task completion times. Their experiment involved picking up an object, placing it against a vertical surface and inserting it into a hole with a sliding motion. The effects of mass and clearance were not considered. Their results indicated completion times prolonged by 45% with force feedback compared to the real tasks.

Amirabdollahian et al. (2005) underlined the advantage of using assistive technologies to measure the effectiveness of a medical therapy regime. The peg-in-hole haptic assessment was the study of choice for quantifying upper limb skills. The set-up consisted of a large virtual table with two identical cylindrical holes and a cylindrical peg that was to be inserted repeatedly by alternating between the two holes with a Phantom device. Physical properties of the peg and hole, such as diameter, peg weight and height, were taken into account. Position, velocity and reaction forces were logged at a sampling rate of 1000 Hz. Inconclusive results were obtained, but further clinical trials are being undertaken to investigate the usefulness of the haptic system as a means of assessing human performance, in particular arm skills and coordination.

Recent research points towards developing architectures for collaborative haptic virtual environments (CHVEs). The Collaborative Haptic Assembly Simulator (Iglesias et al., 2006) is one reported work that investigates assembly/disassembly simulation of mechanical components in a collaborative virtual environment. The system has the potential to manage large assemblies; unfortunately, it does not appear to store and manage the history of movements. A review of the application of haptics in nano robotics illustrates the advancement of VR and haptics (Ferreira & Mavroidis, 2006). However, only the exploratory influence and the associated sensory advantages of tactile feedback are reported.

There have been numerous studies evaluating the performance of haptic technologies in interaction with VR, the optimisation of kinaesthetic device design (Fritschi et al., 2008) and human haptic perception (Bresciani et al., 2008). However, there has been limited exploration into measuring the experience of haptic interaction. Haptic systems may provide force feedback and inherit the dynamics and movements of the tool they simulate, but whether this provides the same feeling as it would in reality needs further investigation. Comparing similar tasks in VR and real life on the basis of human biomechanical reactions can lead to scientifically more accurate and realistic haptic simulation (Kocherry et al., 2009).

While most of the published work on VR applications with force feedback shows the benefits of haptics, it does not discuss the automatic generation of qualitative information derived from assembly plans (syntax or semantics) developed within simulations in the virtual environment. Generally, haptics remains a facilitator in guiding spatial exploration rather than an output of task planning and, in more general terms, manufacturing information. Extrapolating the cognitive procedures relating to assembly tasks (and even tacit exploration of the virtual components) during user interaction will provide information to improve a product's design for manufacture and assembly (DFMA). User interactions captured through the haptic VR system provide a rich source of data that reflects knowledge, experience and intent. By analysing this information, optimisations can be made to procedural tasks and training strategies early in the development phase while making users aware of any faults.

The logging and reuse of associated information as an engineering task analysis tool within haptic VR environments is central to this work; indeed, the application of these methods is similar to a number of engineering task analysis applications covering both design and manufacturing assembly processes as well as early knowledge acquisition (Ritchie et al., 2006). A pilot study by Lim et al. (2006), while statistically inconclusive, showed that Design for Assembly (DFA) features had an impact on task completion time in the virtual environment. The initial study also exposed several inadequacies of the test bed, for example in its functionality and ease of use, and in aspects of human factors such as the associated cognitive and physiological perspectives of how people perceive shapes in stereo and non-stereo modes. With this in mind, the objective was to carry out a human factors evaluation of a haptic assembly system via a comparative assessment of virtual and virtual/haptic peg-in-hole assembly tasks against real world benchmarked equivalents. Further, measuring the physiological response using electromyography (EMG) can aid the understanding of kinaesthetic responses between haptic VR and real world tasks, to potentially advance the state of the art and achieve a better and more accurate correspondence to reality for haptic-based systems.


3. Motion Chronocyclegraphs

As advances in technology have allowed more complex geometries to be manufactured, so too has the complexity of component assembly increased. Therefore, to automate assembly procedures it is useful to gain cognitive insight into the human operator. A method that allows such analysis is to track user-object interaction. The data obtained can then be plotted as a time-dependent profile describing motion together with position, orientation and velocity.

Figure 1.

Gilbreth's Time and Motion Study (courtesy of Johnson & Ogilvie, 1972).

Therbligs are a set of symbols developed by Frank Gilbreth (Price, 1990; Johnson & Ogilvie, 1972) during the early 20th century to study assembly motions, where each symbol represents the mental and physical processes involved during an assembly task. Figure 1(a) shows the 18 therblig units, which represent the set of fundamental motions required to perform a manual operation: Search; Find; Select; Grasp; Hold; Position; Assemble; Use; Disassemble; Inspect; Transport loaded; Transport unloaded; Pre-position for next operation; Release load; Unavoidable delay; Avoidable delay; Plan; and Rest. As therbligs map onto each individual operation task, unneeded movements can be eliminated by analysing the therblig units associated with a process, making any task more efficient. For example, when numerous 'delay' therbligs are associated with a particular assembly operation, the efficiency of that specific task will have to be improved.
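To make the mapping concrete, the following sketch enumerates the therbligs and assigns a coarse therblig to a single logged interaction sample. This is illustrative only: the `touching`, `grasping` and `speed` fields and the motion threshold are assumptions, not part of Gilbreth's scheme or of the HAMMS implementation described later.

```python
from enum import Enum

class Therblig(Enum):
    """The 18 Gilbreth therbligs listed above."""
    SEARCH = 1; FIND = 2; SELECT = 3; GRASP = 4; HOLD = 5
    POSITION = 6; ASSEMBLE = 7; USE = 8; DISASSEMBLE = 9; INSPECT = 10
    TRANSPORT_LOADED = 11; TRANSPORT_UNLOADED = 12; PRE_POSITION = 13
    RELEASE_LOAD = 14; UNAVOIDABLE_DELAY = 15; AVOIDABLE_DELAY = 16
    PLAN = 17; REST = 18

def classify(sample, moving=0.01):
    """Map one logged sample to a coarse therblig (illustrative rules)."""
    if sample.grasping:   # object in hand: transport or hold
        return Therblig.TRANSPORT_LOADED if sample.speed > moving else Therblig.HOLD
    if sample.touching:   # probing a part without grasping it
        return Therblig.INSPECT
    return Therblig.SEARCH if sample.speed > moving else Therblig.REST
```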

Gilbreth also devised chronocyclegraphs for motion studies (Price, 1990). The method involves attaching a light source to the hands of a person performing an assembly task. Using long-exposure photography of the whole assembly process, a result such as that seen in Figure 1(b) is obtained. This displays the path, known as the chronocyclegraph, that the user's hands have moved through during the assembly task, and helps identify areas of inefficient movement. By letting the light source flash at a known frequency, the velocity and acceleration of the hand movements can also be determined.

Indeed, there is much to be gained from Gilbreth's seminal work on time and motion study. However, no literature shows that these methods have been applied in any VR and/or haptic engineering environment.


4. Experimental methodology

Assembly is, classically, one of the most extensively studied manual processes in manufacturing. One strand of this work aims to quantify assemblies by analysing the sequence of operations required to build a component. For example, Figure 2 illustrates the relative time required for the manual insertion of a peg into a hole for various geometries. Taking the insertion of a cylindrical peg into a plain round hole as the baseline time (i.e. 100%), Haeusler (1981) reports a German study that estimated the relative times required to assemble different geometries; for instance, the insertion of a round peg into a chamfered hole was estimated to take only 57% of the time required to complete the "nominal" case.

Figure 2.

Design for assembly (DFA). The diagram shows pegs and holes with varying feature shapes and relative assembly times (Haeusler, 1981).

DFA methodologies have quantified the relative times of real-world assembly tasks: grasping, acquisition, manipulation and insertion. Could haptic assembly performance be benchmarked against these previously quantified assembly times? This forms the rationale for the peg-in-hole assembly task in this study. The assembly process can be subdivided into three states: picking, placing and motion.

The peg-in-hole task requires inserting a peg into a block with a hole. Participants performed this both as a virtual 3D task, in which they were provided with different combinations of visual and force cues (through a haptic device), and as the real world equivalent task.

The primary objective was to investigate simulator fidelity by comparing the time taken to carry out six different assembly tasks in a haptic VE with the following characteristics: rigid body dynamics, stereo display and haptic feedback, and thus to assess the relative impact of each technology on user cognitive and physiological performance. The peg-in-hole experiment was planned as a precursor to a more challenging assembly involving a geared pump comprising five components.

4.1. Virtual task

Table 1 presents the experimental design for the virtual task. A binary label indicates whether collision detection and stereovision were switched on (1) or off (0). There are six primary experiments, ordered such that tests are performed with/without collision detection (C/D) and with/without stereovision. The key feature of this experimentation was that the participants were totally unaware of any geometrical differences between components during the experimental programme.

Experiment ID | C/D | Stereo | Block Chamfer | Peg Chamfer | People x Reps
1 | 1 | 1 | 1 | 1 | 11 x 3
2 | 0 | 1 | 1 | 1 | 11 x 3
3 | 1 | 0 | 1 | 1 | 12 x 3
4 | 1 | 1 | 0 | 0 | 11 x 3
5 | 0 | 1 | 0 | 0 | 12 x 3
6 | 1 | 0 | 0 | 0 | 11 x 3

Table 1.

Experimental design.

Allocation:

  1. Participants were randomly allocated to experiments 1, 2 or 3.

  2. Subsequently each was also randomly allocated to 4, 5, or 6.

  3. Each experiment had an equal number of participants.

Training:

  1. Each participant was given five minutes to familiarise themselves with the virtual environment and haptic feedback.

  2. The shape manipulation requirements were demonstrated.

  3. Each participant practised picking and placing a peg into a hole using the haptic device.

  4. Each participant pressed the picked object against a stationary object to experience haptic force feedback.

Instruction:

  1. Each participant was asked to move a peg from its starting position to the block and insert it.

  2. The process was repeated three times.

  3. Each participant was then asked to complete a questionnaire after the experiment.

Analysis:

  1. Log files recorded timestamps from the pickup of each peg until its last release. This provided a task completion time (TCT) in seconds for each repetition (see the sketch after this list).

  2. Log file names were uniquely defined for each participant.

  3. An error analysis was performed.

  4. Video recording for each participant was taken to give more insight into both behaviour and errors.

  5. Statistical analysis was subsequently used to investigate the null hypothesis that: “The variability in performance between task pairs would be similar.”
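By way of illustration, step 1 can be implemented as a simple scan over the logged pickup/release events. This is a minimal sketch: the event names and tuple layout are assumed for illustration and are not the actual HAMMS log schema.

```python
def task_completion_times(events):
    """Compute a TCT per repetition from an ordered event stream.

    `events` is assumed to be a list of (timestamp_s, name) tuples,
    where name is "pickup" or "release" (illustrative names only).
    """
    tcts, t_pick = [], None
    for t, name in events:
        if name == "pickup" and t_pick is None:
            t_pick = t                     # start of a repetition
        elif name == "release" and t_pick is not None:
            tcts.append(t - t_pick)        # TCT in seconds
            t_pick = None
    return tcts
```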

4.2. Real task

A similar number of participants were tasked to perform manual peg-in-hole insertions. Four different insertion routines were carried out in the following order:

  1. Flat Peg/Flat Hole (FPFH) – Insert a flat-end peg into a block with a chamferless hole.

  2. Chamfered Peg/Flat Hole (CPFH) – Insert a peg with a 45° conical chamfered end into a block with a chamferless hole.

  3. Chamfered Peg/Chamfered Hole (CPCH) – Insert a peg with a 45° conical chamfered end into a block with a corresponding chamfered hole.

  4. Flat Peg/Chamfered Hole (FPCH) – Insert a flat-end peg into a block with a 45° conical chamfered hole.

Each routine was repeated six times. The setup (Figure 3) ensured that the distance between the peg and hole was fixed (at 40mm from the centre of the peg to the centre of the hole in the block) and that each participant was positioned according to their comfort of reach and eye level.

Figure 3.

Real world experimental setup. The schematic is presented on the left, where sensors 1 and 2 time the peg insertion and retraction cycles.

Timings were taken only when the peg was displaced from its holder and stopped once the subject released upon a successful insertion. Figure 4 details the dimensions for the block and peg used in the virtual world and for the real world experiment.

Figure 4.

Experimental block and peg (all dimensions in millimetres). The dimensional view on the left shows the peg and the block with its cross-sectional view along AA. The physical test block and pegs (diameters left to right - 14.85 mm, 14.48 mm and 13.54 mm) are shown on the right.

Advertisement

5. Implementation

The Haptic Assembly, Manufacturing and Machining System (HAMMS) was developed as a test bed to investigate and measure user interactions and responses while performing various engineering tasks in a haptic VR environment. The hardware comprises a Phantom haptic device for interaction with the virtual environment, along with a pair of stereoscopic glasses (MacNaughton, 2008) for stereo viewing when required. The system's hardware and architecture are presented in Figure 5 and comprise the following components:

  1. Haptics Interface: Sensable Technologies OpenHaptics Toolkit (Sensable Technologies, 1993), which provides device control for the Phantom Desktop and Omni and supports polygonal objects, material properties and force effects.

  2. Graphics Interface: The Visualization ToolKit (VTK, 1998) is used for rendering graphics, image processing and visualization.

  3. Physics Interface: AGEIA PhysX (AGEIA PhysX, 2008) technology provides the physics engine, which includes an integrated solver for fluids, particles, cloth and rigid bodies.

Figure 5.

HAMMS hardware and dependencies.

Central to HAMMS is the physics engine, which enables rigid body simulations in real time. State changes within the physics environment update the haptic rendering and vice versa. As haptic rendering relies on real-time collision feedback from the physics engine, it is important that, where possible, convex hulls and/or primitive shapes are used to represent the objects in the physics environment. The most important issue to address is the synchronisation between the haptic and physics loops: the physics loop runs at approximately 30-60 Hz, while the haptic loop must run at about 1000 Hz to create realistic sensations. To avoid instabilities in force rendering, the input device and any rigid objects are uncoupled. Instead, the system uses the changing states in the physical simulation to influence the forces associated with the haptic rendering. The resulting events are then visualized through VTK.
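The following sketch illustrates this uncoupling as a two-rate scheme in Python-style pseudocode (HAMMS itself is built on OpenHaptics, PhysX and VTK): the haptic loop renders forces at roughly 1 kHz from the most recent physics snapshot rather than being driven rigidly by the much slower physics integration. The callbacks `step_physics`, `read_device` and `render_force`, the spring constant and the sleep-based pacing are all illustrative assumptions.

```python
import threading
import time

latest_state = {"peg_pos": (0.0, 0.0, 0.0)}   # snapshot shared between loops
state_lock = threading.Lock()

def physics_loop(step_physics, hz=60):
    """Advance the rigid-body world at 30-60 Hz and publish a snapshot."""
    while True:
        new_state = step_physics(1.0 / hz)    # returns e.g. {"peg_pos": ...}
        with state_lock:
            latest_state.update(new_state)
        time.sleep(1.0 / hz)

def spring_force(cursor, state, k=200.0):
    """Illustrative virtual spring pulling the cursor towards the proxy."""
    (px, py, pz), (cx, cy, cz) = state["peg_pos"], cursor
    return (k * (px - cx), k * (py - cy), k * (pz - cz))

def haptic_loop(read_device, render_force, hz=1000):
    """Render forces at ~1 kHz from the latest snapshot; the device is
    never coupled directly to the rigid bodies in the simulation."""
    while True:
        cursor = read_device()
        with state_lock:
            state = dict(latest_state)        # cheap copy, no long blocking
        render_force(spring_force(cursor, state))
        time.sleep(1.0 / hz)
```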

HAMMS logs data for each virtual object in the scene, including the devices used for interaction. The basic logged data comprises position, orientation, time stamps, velocity and an object index (or identifying number). Figure 6 illustrates the colour-coded therblig units adopted by HAMMS and their association with the logged data. By parsing the logged data text files, an assembly procedure can be formulated automatically.

Figure 6.

HAMMS colour coded therbligs. Large spheres signify a start event while small spheres represent motion. Green indicates search, find or rest. Blue represents selection and inspection. Red identifies control events such as grasping, holding, translation, dis/assembly operations. Note: The shadowed cylinder in the middle shows the original position of the translated cylinder to the right.

To visualize the data stream, large spheres signify the start of an event, while smaller contiguous spheres indicate the direction, speed and location of exploration or controlled displacements. Velocity changes are indicated by the separation of the spheres, i.e. sparsely spaced spheres equate to higher velocity. The line joining all spheres is referred to here as the motion-time-line (MTL).
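As an illustration of how an MTL can be derived from the logs, the sketch below parses position samples and computes the spacing and speed between consecutive samples, which is what the sphere separation encodes. The whitespace-separated column layout is an assumption, not the actual HAMMS file format.

```python
import math

def parse_log(lines):
    """Parse logged samples into (time_s, position, object_id) tuples.

    Assumes columns: time, x, y, z, object index (illustrative layout).
    """
    samples = []
    for line in lines:
        t, x, y, z, obj = line.split()[:5]
        samples.append((float(t), (float(x), float(y), float(z)), int(obj)))
    return samples

def mtl_spheres(samples):
    """Sphere spacing and speed between consecutive samples: at a fixed
    logging rate, wider spacing directly encodes higher velocity."""
    spheres = []
    for (t0, p0, _), (t1, p1, _) in zip(samples, samples[1:]):
        dist = math.dist(p0, p1)
        speed = dist / (t1 - t0) if t1 > t0 else 0.0
        spheres.append({"pos": p1, "spacing": dist, "speed": speed})
    return spheres
```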


6. Experimental Results and Analysis

A peg-in-hole experiment was performed to investigate how users responded to a simple assembly procedure in a haptic virtual environment. This experiment was designed as a precursor to a more challenging pump assembly. Two sets of experiments were prepared: a real world set-up and a virtual reality set-up in which the participant is given the impression of "forces" through a haptic device. The primary objective was to compare two different assembly tasks in a virtual environment with the following characteristics: rigid body dynamics, stereo display and haptic feedback, and thus to assess the relative impact of the technology against real world equivalent tasks and its influence on how a task is completed. As mentioned previously, the participants were not informed about chamfers being on the peg or hole.

6.1. Virtual vs. Real

A total of 34 participants were recruited for the virtual and real world peg-in-hole tasks. Each subject was randomly allocated to two of the experiments listed in Table 1, and the results are charted in Figure 7.

Figure 7.

Virtual and real world Peg-in-Hole task completion time (TCT) comparison. TCT for each experiment is indicated at the top of each column.

Chamfered Peg / Chamfered Hole (CPCH): The results indicate the effects of augmenting virtual environments with force cues during an assembly task. Comparisons of stereo viewing show a gain of approximately 1.6 seconds in overall task completion time (TCT). Where stereo was present but collision detection was not, participants took longer to align and insert the peg into the hole. This is reflected in the results (see SNCD in Figure 7).

Flat Peg / Flat Hole (FPFH): This experiment presents a similar trend to the CPCH experiment. Though less pronounced, comparisons against stereo viewing again show a reduction in TCT (approximately 1.2 seconds).

Interestingly, the TCTs for both SNCD experiments are nearly identical. This was because subjects spent the majority of their time aligning the peg in the hole. Since no force cues were present to indicate that the peg was in contact with the sides of the hole, the subjects relied heavily on visual perception.

Importantly, the virtual and real world experiments illustrate that shape is one predominant factor in reducing TCT. The perception of size and shape, together with the influence of visual stimuli on haptic perception, exposes the dominance of vision over the other senses. It is also evident that the manual manipulation of objects in virtual environments improves with haptic feedback.

Damping effects: In order to consider the haptic equipment's actual operational influence on performance, in a manner similar to that detected by Bashir et al. (2004), it was decided to investigate its inherent damping during operation. The architecture used for HAMMS meant that, to approximate 'reality', phenomena such as gravity, restitution, material properties and friction were provided via a physics engine, which added an operational delay into the system. This added further load to the system and, in the current development of HAMMS, generated a damped effect during object interaction (i.e. when using the haptic device to move an object). No motion damping effect was present while the user was merely touching virtual shapes; the haptic cursor response was instantaneous. Taking this into account, a series of displacement trials was carried out to establish a damping metric (Figure 8).

Figure 8.

Peg displacement test to determine haptic motion damping.

A total of 30 repetitions was executed and the average taken. The procedure involved picking up the peg, bringing it forward and placing it on a pedestal. The objects were reset automatically to their origins after each placement. The movements were carried out as quickly as possible. Each cursor path was recorded by logging the cursor position and system time whenever a haptic device button message was processed by the operating system.
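A minimal sketch of how such a damping metric might be derived is given below. The chapter reports only the resulting offset (approximately 2 seconds), so the comparison of object-laden moves against free-cursor moves over the same path is an assumption made for illustration.

```python
def damping_metric(loaded_times, free_times):
    """Mean extra time per repetition when dragging an object compared
    with moving the free cursor over the same path.

    Both arguments are lists of per-repetition durations in seconds,
    e.g. from the 30 displacement trials described above (the pairing
    scheme is illustrative, not the chapter's exact procedure).
    """
    mean = lambda xs: sum(xs) / len(xs)
    return mean(loaded_times) - mean(free_times)

# corrected_tct = raw_tct - damping_metric(...)  # offset found to be ~2 s
```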

Figure 9 presents the results after the effects of haptic damping (approximately 2 seconds) were removed from the original TCT. While the corrected figures are still at least twice the real world experiment times, there is a close correlation to real world assembly experience. This also supports the findings of Gupta et al. (1997), mentioned previously, who measured times roughly double those in the real world. The results also show that the Haeusler factor (Haeusler, 1981) of 57% for chamfered peg/hole versus flat peg/hole equates to 61% in the haptic domain and 78% in the real world (a worked check of these factors follows Figure 9). Improvements to the system's memory management and more efficient rigid body dynamics algorithms could further improve realism.

Figure 9.

Effect of virtual environment damping on task completion time (TCT). TCT for each experiment is indicated at the top of each column. The non-damping times have been superimposed to indicate the difference in TCT.
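For illustration, the chamfer factors quoted above can be reproduced from the damping-corrected mean TCTs that appear in Table 2 (the stereo plus collision detection rows); the snippet below is a worked check, not part of the original analysis.

```python
# Damping-corrected mean TCTs in seconds, taken from Table 2.
virtual = {"CPCH": 1.730, "FPFH": 2.821}   # stereo + collision detection
real    = {"CPCH": 0.840, "FPFH": 1.072}

for world, tct in (("virtual", virtual), ("real", real)):
    factor = tct["CPCH"] / tct["FPFH"]     # chamfered relative to flat
    print(f"{world}: chamfered/flat = {factor:.0%}")   # 61% and 78%
```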

6.2. Physiological effects of haptic VR interface

Haptic displays provide users with the sense of touch and the ability to manipulate objects. The haptic modality is a correlation of tactile and kinaesthetic perception (Fritschi et al., 2008). In order for a user's grasp to be translated into free motion in the virtual mechanism, dynamic forces need to be calculated and returned to the user. An 'ideal' haptic device has a number of design criteria characterising its performance: dynamic properties, stiffness, output capability, workspace and extensibility (Ueberle et al., 2004).

In an effort to provide more intuitive interfaces, particularly for tasks involving the exploration or manipulation of geometric entities, it is necessary to investigate users' sense of presence. It has been shown that users within virtual environments can exhibit parallax issues when exploring virtual geometric spaces, even though they would easily have understood a similar space in the real world (Bakker et al., 1993; Robinson et al., 2007). Research in perceptual-motor coordination suggests why this effect might occur (Beal & Loomis, 1995). For example, physically walking through a complex environment allows a person to keep track of their position and of where other key locations are with respect to their own position within that environment (Brooks, 1992). Similarly, being able to physically touch and manipulate an object makes its shape and structure much more vivid than passively observing the same object (Burdea, 1996). Human visual sensors are spatially localised (Derrington et al., 2004); in contrast, haptic and kinaesthetic sensations are intrinsically three-dimensional. Thus, studying how mechanical-VR interfaces can support human-scale workspaces, human-level dexterity and haptic interaction while providing quantitative 3D precision becomes more compelling. The experiment here investigates the sense of motion or manipulation and the ability of users to interact mechanically with computational artefacts via haptic VR.

Figure 10 shows the results of tactile feedback and how it has influenced the user’s intent. Object control time-lines are represented by a series of red spheres. Blue spheres indicate that the user was exploring the shape (touching or finding a picking location). Green spheres indicate that the user was wandering (i.e. no interaction with virtual objects).

Figure 10(b) clearly shows that without haptic feedback, tacit knowledge regarding the location of the peg as it passes through the hole is lacking, as indicated by the sparsely separated object control lines (red spheres). However, when force cues are available, the user passes the peg through the hole more accurately, as shown in Figure 10(c) and (d); the closely converging control lines as the peg enters the hole indicate this. Note also how the user gains confidence about the environment (or workspace) when tactile information is available: compared to the closely spaced red control spheres in Figure 10(b), those shown in Figure 10(c) and (d) are well separated, indicating that the user's motion and confidence have improved.

Figure 10.

Peg-in-hole motion chronocyclegraph. Frames b, c and d show four successive pick and place motions.

The vortex (twist) of the control lines indicates how the user is trying to orient the peg for a successful insertion. The amount of wavering in Figure 10(b), compared to the precise motions attributable to the augmented tactile information of the latter experiments, clearly shows that the user has learnt to exploit force cues to complete the task.

The effects of stereovision can clearly be observed in Figure 10(d). The start (pick) event and the entry (insertion) event are markedly improved, as indicated by consistent picking and direction of motion. With stereovision the learning process is fast-tracked, as better depth perception reduces the guesswork during picking.

6.3. Haptic influence on motor control

The human body has a number of different sensory systems, namely somatic (touch-pressure, posture and movement, temperature, pain), vision, hearing, chemical (taste, smell) and vestibular (motion and position of the head). VR by and large focuses on providing visual stimuli to the user whilst trying to avoid conflict with vestibular sensations, which would result in dizziness or, in extreme cases, motion sickness. Haptic devices are used in conjunction with VR with the intention of giving the user tactile feedback by providing touch and pressure stimuli to the digits and hands. However, little is known about the nature of the user's experience from a kinaesthetic point of view.

Video games are rich multimedia environments that seek to provide engaging interactive experiences. Until relatively recently the predominant focus has been on the richness of the graphical game play environment. Whilst games developers and manufacturers have always been developing peripherals to provide more engaging ways to interact with games than the simple joystick, it is only in the last few years that so-called "body-movement controlled video games" or "ExerGames" have become a commercial success. The Nintendo Wii console is worth considering in this context, in that games for this console compensate for a relative lack of graphical computing power with varied and rich sound feedback and the physical interaction required to play games on the console. Research into players' experiences shows that such an approach can facilitate an engaging experience without the highest possible degree of realism (Thin et al., 2009).

The term VR overtly implies an attempt to closely mimic reality. However, even in current systems, there is not a direct correspondence. The user of a VR environment can normally go beyond the confines of their human body in the equivalent real world and move through a virtual environment much more freely and easily, often at speed and defying gravitational and anatomical constraints. A shift in focus towards considering the nature of the user’s interaction and experience opens up new avenues of development such that by augmenting or enhancing the VE, the user would experience something that is more intuitive and responsive to their needs and intentions. Examples of such augmentation would include “snap to” functionality, visual cues and guides, split screens presenting alternative views or specific details and proximity awareness through the use of sound (haptic feedback does not need to be confined to tactile sensations).

The nature of the interactive experience, and how meaning is constructed by the user, will in part depend on how "natural" or otherwise their experience of the VR system feels. This is not to say that a user cannot learn to use new systems and approaches and fit in to them, but in order to gain widespread acceptance and adoption it is desirable to try to fit within the range of a given user's existing experiences. Psychophysiological measurement techniques could potentially provide a way to evaluate how a user is sensing and responding to a given environment. Such techniques can give measures of a user's intentions via the electromyographic activity of skeletal muscles (EMG), brain activity (electroencephalography, EEG) and eye movement tracking (electrooculography, EOG). Insight into a user's arousal and/or stress levels could also be gained (e.g. from heart rate, temperature, blood pressure or galvanic skin response). Such physiological signals are likely to result in patterns of responses to different situations that are characteristic of certain subject responses, actions and intentions.

A preliminary investigation was undertaken in order to assess the EMG response during both real world and virtual reality versions of the peg-in-hole task. Subjects sat with the arm flexed and the elbow supported. EMG measurements were made of triceps and biceps activity whilst the movement of the arm was tracked using an electronic goniometer (Figure 11). The signals were acquired using a specialised physiological recording system comprising a set of optically isolated analogue-to-digital converters under software control, sampled at 1 kHz. Data were stored and analysed using digital chart recording software. Subjects performed the peg-in-hole task for one minute, during which they performed 30 insertions and removals at a uniform rate. The real and virtual world tasks were performed in a randomised order. A screenshot from the physiological recording system is shown in Figure 12; a sketch of the RMS envelope computation follows its caption.

Figure 11.

Investigating haptic influence on motor control through biometric data logging. The experimental setup is shown on the left. The right image shows the goniometer attached to the subject’s arm.

Figure 12.

Screenshot from the physiological recording system showing the joint angle (goniometer) and the raw and RMS EMG signals from the biceps and triceps muscles.
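For reference, an RMS envelope of the kind displayed by the recording software can be computed from a raw trace as sketched below. The 50 ms window is a common choice for surface EMG and is an assumption here, as the chapter does not state the window its software used.

```python
import numpy as np

def rms_envelope(emg, fs=1000, window_ms=50):
    """Moving-RMS envelope of a raw EMG trace sampled at `fs` Hz."""
    emg = np.asarray(emg, dtype=float)
    emg = emg - emg.mean()                       # remove any DC offset
    n = max(1, int(fs * window_ms / 1000))       # window length in samples
    kernel = np.ones(n) / n
    return np.sqrt(np.convolve(emg ** 2, kernel, mode="same"))
```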

Representative EMG activity from the triceps muscle recordings during the real and virtual peg-in-hole tasks is shown in Figure 13. Whilst these EMG graphs are only preliminary data, there would appear to be differences between the two sets of traces, both in the overall patterns and in the degree of noise. The nature of haptic devices is such that they introduce a certain degree of damping of any movements made by the user.

Figure 13.

Raw triceps EMG traces for a sequence of six representative peg-in-hole insertions and removals for each subject. Real task traces are shown on the left and virtual task traces on the right.

Human motor control is achieved through a combination of feed-forward and feedback systems together with a central comparator. It has been suggested that noise in a physiological sensory system can enhance the flow of information by helping to aid discrimination (Schaefer et al., 2006). These preliminary results suggest that there may be an underlying physiological difference in the nature of the real and virtual world peg-in-hole tasks that warrants further investigation. The use of psychophysiological techniques and approaches similar to those described above will help better understand the nature of users' experiences in VEs and also provide insight into ways in which the environment may be augmented in order to provide more engaging experiences.

6.4. Effects of geometry

How much does a change in geometry affect assembly tasks when both visual and haptic information is available? It was decided to statistically investigate the cause-and-effect relationships between tasks by examining those trials where only one parameter was changed. Due to the sample size, a one-tailed Student's t-test (Pearson & Kendall, 1970) was chosen at the 1% level. Table 2 presents a comparison of TCT (less damping) for two geometries, namely CPCH and FPFH, under the effects of stereovision and collision detection feedback. Real world (RW) results for the corresponding virtual tasks are also shown.

Exp. ID | Geometry | World | Stereo | C/D | TCT (sec) | Std. Dev. TCT
1 | Chamfered Peg / Chamfered Hole (CPCH) | Virtual | 1 | 0 | 4.838 | 2.853
2 | CPCH | Virtual | 0 | 1 | 3.192 | 1.864
3 | CPCH | Virtual | 1 | 1 | 1.730 | 1.481
4 | Flat Peg / Flat Hole (FPFH) | Virtual | 1 | 0 | 4.846 | 2.531
5 | FPFH | Virtual | 0 | 1 | 4.012 | 2.291
6 | FPFH | Virtual | 1 | 1 | 2.821 | 1.797
RW 1 | CPCH | Real | 1 | 1 | 0.840 | 0.144
RW 4 | FPFH | Real | 1 | 1 | 1.072 | 0.251

Table 2.

Comparison of geometry, stereovision and collision detection TCT (less damping). The highlighted rows indicate modal correspondence of virtual/real-world tasks.

The null hypothesis (H0) applied was: “The performance between task pairs will be the same.” The alternative hypothesis (H1) was: “The factor changed in the experiment affected the outcome.”

This test is designed for data that potentially alter under stereovision and/or collision feedback for the CPCH or FPFH conditions. The aim is to compare the amount of variability due to the predicted differences in scores between the two conditions against the total variability in participants' scores. The predicted differences are calculated as the difference between the mean scores for the two groups. This difference between the means has to be compared against the total variance in all scores. If there were only random differences between the scores in the two conditions, the variance due to the predicted differences would be relatively small in relation to the total variability in scores. The t-test results are shown in Table 3; a sketch of the test as applied to each pair follows the table caption.

Table 3.

T-test results (less damping) for the comparison between virtual and real TCT. CPCH – chamfer on hole and peg. FPFH – no chamfers. Column 1 indicates the pair number; Column 2 indicates the experimental pairing; Column 3 indicates the environment; Columns 5 and 6 indicate whether stereovision and collision detection were in use – 1 (yes), 0 (no); Columns 7 and 8 show each pair's individual task completion times; Column 9 presents the t-test result for each pair.
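A sketch of the test applied to each pair is shown below. The grouping of TCT samples into two independent condition groups reflects the description above; the `alternative` argument requires SciPy 1.6 or later, and the function is illustrative rather than the original analysis script.

```python
from scipy import stats

def compare_pair(tct_a, tct_b, alpha=0.01):
    """One-tailed two-sample t-test on TCTs from two condition groups.

    Tests H1: condition A is slower than condition B, at the 1% level.
    """
    t, p = stats.ttest_ind(tct_a, tct_b, alternative="greater")
    return t, p, p < alpha   # True in the last slot means H0 is rejected
```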

Pair 3 compares how differing geometries affect assembly performance. A highly significant difference between the two populations (p<<0.01) indicates that chamfers do make a significant difference to TCT reduction. Further, it clearly shows the benefit of stereovision when coupled with collision detection. Comparison of the real world experiments (Table 3, Pair 7) indicates that behaviour in the real world was the same regardless of peg/hole type (p<<0.01). Considering the Table 3 results, even though the peg (and, similarly, the hole) chamfers are almost imperceptible, they have a significant influence on TCT. This further supports the work of Unger et al. (2001), who showed that the haptic senses can discriminate between very fine forces and positions and that real and virtual world placement strategies are essentially similar.


7. Assembly chronocyclegraphs – towards real world applications

Unlike the majority of reported work on assessing and generating assembly plans in a restricted manner, the pump assembly experiment was designed to be carried out with randomly placed components, rather than components whose final positions were already known. This free-form type of assembly exercise is much closer to real-world assembly applications and is novel in its application to assembly plan generation. Further, participants were not shown the actual assembly and had no prior knowledge of how each component fitted. Essentially, this test was about capturing a participant's perception and intent. The experiment was carried out in both the real and virtual environments to assess the haptic VR interface, with a total of six participants.

The virtual and real components of a hydraulic gear pump are shown in Figure 14. The pump comprises a pair of bushings, a housing and a set of cogs. Each component was loaded into the scene and placed randomly. Participants were then instructed to assemble the components in their own time. This experiment was not about task completion time; rather, the objective was to gather information and understand how a human deduces the sequence of assembly and how they arrange the parts to fulfil their intent, assisted by haptic feedback. Figure 15 presents the chronocyclegraph results and associated therblig units for one such participant. The experiment was conducted with haptic feedback but without stereovision.

Figure 14.

Pump assembly. Virtual models on the left, real components on the right.

The MTL and therbligs (white and green spheres) shown in Figure 15(a) depict how the participant navigated the workspace. Sparsely separated green spheres and the few patches of compact spheres indicate that the participant quickly identified the assembly sequence of the components. The blue spheres in Figure 15(b) confirm the selection process through inspection (i.e. touching the object). From the results, it appears that during manipulation and insertion participants were also preventing the object (the blue spheres directly above the highlighted cog in this example) from becoming misaligned as it was being positioned. Figure 15(c) shows the displacement of the components during assembly. From observation, the grasping and manipulation of the components consumed the most time. The vortices in the MTL clearly indicate that each component had to be reoriented for successful assembly.

Figure 15.

Chronocyclegraph analysis in HAMMS. The results indicate that this participant has good shape perception and probably some knowledge of the functionality of each component. The MTL and therbligs show: (a) decisive navigation; (b) selection of parts; and (c) that the majority of time was spent manipulating parts for assembly.

Figure 16.

Identifying, through haptic interaction, possible decision making from the MTL and therbligs.

Further insights into the process of selection can be observed in the MTL. For example, abrupt changes in direction during the search (green spheres) and selection (blue spheres) operations indicate that perhaps the initial approach was not suitable. When the participant pauses there is little positional and/or velocity change. This is reflected in the MTL as tight squiggles in the profile along with very tightly packed spheres. This evidence is particularly visible as the participant brings an object close to its assembly point (Figure 16). This form of output tantalisingly suggests that the approach can be used to detect manufacturing intent or confidence in decision making during the actual planning process; this will be researched further to see whether decision-making processes and intent can be formalised automatically.
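A simple way to flag such pauses automatically is sketched below: segments where the speed stays under a threshold for a minimum dwell time are reported as candidate decision points. The thresholds are illustrative and would need tuning to the HAMMS workspace scale.

```python
import math

def find_pauses(samples, speed_thresh=0.005, min_dwell=0.5):
    """Return (start, end) times where the user dwells (packed spheres).

    `samples` is a list of (time_s, position) tuples (illustrative).
    """
    pauses, start = [], None
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        speed = math.dist(p0, p1) / (t1 - t0) if t1 > t0 else 0.0
        if speed < speed_thresh:
            start = t0 if start is None else start
        else:
            if start is not None and t0 - start >= min_dwell:
                pauses.append((start, t0))    # candidate decision point
            start = None
    if start is not None and samples[-1][0] - start >= min_dwell:
        pauses.append((start, samples[-1][0]))
    return pauses
```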

7.1. Generating assembly instructions

The logged data can be parsed to extract assembly instructions. Table 4 presents the assembly sequence for the pump component layout shown in Figure 14(a). The prognosis of the MTL and its associated therbligs through visual analysis is liable to subjective interpretation. In order to ascertain its validity, the extracted information given in Table 4 can be used to crosscheck against the MTL (a parsing sketch follows the Table 4 caption).

Table 4.

Pump assembly plan automatically generated by extracting logged data. The total time for the virtual assembly operation is 89.1 seconds, while the real world assembly took 23.7 seconds. The positions and orientations shown correspond to the assembled unit.
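A sketch of how an ordered plan of this kind might be extracted from the per-object logs is given below. The field layout and the first-displacement-to-last-sample heuristic are assumptions for illustration, not the actual HAMMS parser.

```python
def assembly_plan(tracks):
    """Derive an ordered assembly sequence from per-object log data.

    `tracks` maps object name -> list of (time_s, position, orientation)
    tuples (illustrative layout). An object's operation window runs from
    its first displacement to its last logged movement.
    """
    ops = []
    for obj, track in tracks.items():
        moved = [t for t, pos, rot in track[1:] if pos != track[0][1]]
        if moved:   # object was displaced at least once
            ops.append((min(moved), max(moved), obj,
                        track[-1][1], track[-1][2]))
    ops.sort()      # order operations by the time each object first moved
    return [{"op_num": 10 * (i + 1), "object": obj, "start": t0,
             "end": t1, "final_pos": pos, "final_rot": rot}
            for i, (t0, t1, obj, pos, rot) in enumerate(ops)]
```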

Figure 17 shows an overlay of assembly operations deduced from the logged data. This validity check is necessary in order to identify any discrepancies in the initial subjective interpretation of the MTL data. In this example, the bush associated with assembly operation Op Num 50 does not seem to be in the right place. Compared to the bush's location in Figure 14, its position when Op Num 50 begins is much farther away. The reason is that, while the small cog was being manipulated (Op Num 40), a collision displaced the bush. Note that the position and orientation of each component in Table 4 correspond to the final assembled location.

Figure 17.

Assembly operation crosscheck.

As the experiment was designed without constraints or restrictions, participants were allowed to assemble the components in the manner they saw fit. Through observation and the collected data, 90% of the assembly operations were sequenced in an identical format to that described in Table 4. Only two participants assembled the small cog before the large cog. However, there was no change in timing trends with regard to aligning and inserting the cogs: the time required to fit the second cog once the first was installed was always greater (approximately 10 times), regardless of environment. The only notable difference arose when one participant assembled the bushings and cogs first before slipping the housing over them. While the times recorded were much lower for the cog/bush assembly, the participant spent the majority of the time (40 seconds real world; 65 seconds virtual world) locating and aligning the housing so that it could be slipped into position.


8. Discussion

The overall objective of this work was to investigate the impact of a haptic VR environment on the user, its effectiveness and productivity for real engineering applications. In this context, the following observations support several important conclusions.

The experiments conducted have demonstrated that small shape changes can affect assembly times in haptic VR environments; this is especially significant because the participants were unaware of any component shape changes. They have also shown that, in the case of chamfered and flat features, the same relative reduction in TCT was recorded as the virtual technology moved from stereo/no collision detection to stereo/full collision detection. In fact, with full stereo/haptics the two best computer-based performances were recorded, for both chamfered and flat features.

The effect of chamfers can clearly be seen when compared against the non-chamfered results presented in Table 2. Although the absolute assembly time in the stereo/haptic environment is significantly greater than that of the real world task, the relative difference between chamfered and flat peg insertion times, 61%, compares surprisingly well with published data (i.e. the 57% reported by Haeusler (1981)).

The benefits of stereovision in virtual assembly environments are highlighted in Table 3 (Pair 4). In contrast to the real world, scalability is not an issue in virtual environments, and subtle design alterations, even at the micro level, can be simulated when augmented with haptic feedback.

The timings in Table 4 offer an important and interesting observation, in that the virtual time gives the planning time when compared with actual planning experiments conducted in previous research (Sung et al., 2009).

The peg-in-hole tests have also highlighted several areas of the HAMMS system that need to be improved. One such area is the damping effect caused by integrating the various virtual engines. More efficient memory management and thread synchronisation will be necessary to provide users with a better experience.

This work has also successfully used a haptic free-form assembly environment to generate assembly plans, associated times, chronocyclegraphs and therblig information. It has also been shown that, by analysing the chronocyclegraphs and interpreting user movements and interactions, there is considerable potential for analysing manufacturing methods and formalising the associated decision-making processes. Understanding and extracting the cognitive aspects of particular tasks is not trivial. In the HAMMS environment, it requires dissecting the elements associated with human perception, both in terms of visual cues and kinaesthesia. It is envisaged that, by logging user motion in the manner shown and outputting an interaction pattern over a task, the derived chronocyclegraph can be used to pinpoint where and how decisions are made. HAMMS, as a test bed for investigating human factors, is still in its infancy, and it is accepted that some areas, such as the data collection methods and their visualisation, can be improved. However, this early work indicates that its potential is much wider than simply validating assembly processes. The provision of auditory cues could both further enhance a user's experience and provide clues as to how the human sensory system synchronises and processes sound inputs alongside tactile and visual signals.

The assembly planning and knowledge capture mechanism presented here is simple and easily embedded in specific engineering processes, especially those that routinely handle important technical task, risk and safety issues. It is important to acquire engineering knowledge as it occurs, while preserving its original format and intent. Collecting information in this manner is a more cost-effective and robust approach than trying to create new documentation, or to capture surviving documents years after key personnel have left the programme. The potential for this has been amply demonstrated in this work.


9. Conclusions

The subjective data on HAMMS system performance indicate that the intuitive nature of haptic VR for product interaction, combining more than one of the senses in an engineering experience, bodes well for the future development of virtual engineering systems. It can therefore be concluded that emerging haptic technologies are likely to result in natural and intuitive computer-based product engineering tools that allow a tactile experience through a combination of vision and touch.

The initiative to undertake a preliminary investigation assessing the physiological response during both real world and virtual reality versions of assembly tasks is novel and has until now not been researched.

While haptic-VR technologies are beginning to find their way into mainstream industrial applications (Dominjon et al., 2007), from a usability and engagement standpoint there are still a number of issues to be addressed. The concept of employing a game-based approach is therefore already being proposed as a way forward to enhance engineering applications (Louchart et al., 2009). Studies have shown that in a more relaxed, game-like environment, users' strong desire to accomplish something produces better results. The nature of game playing is defined by the users' actions to reach an explicit goal, where a failure can provide the basis for a new attempt and success provides acknowledgement and metrics of how well one has done. The goals, the feedback and the mixture of failure and achievement provide a state of "flow" which encourages the process of learning (Björk, 2009). In healthcare there are many game-based rehabilitation applications (Dreifaldt & Lövquist, 2006) as well as surgical simulation training systems (Chan et al., 2009) that make the related process more rewarding, engaging and fun. There is a range of possibilities offered by gaming technologies, and we believe that engineering application design can benefit from exploiting game-based approaches.

Haptics closes a gap in current computer interfaces and has the potential to open up new possibilities. For engineers, blending haptics with recent advances in gaming, robotics and computer numerically controlled machine tools allows intricate procedures to be trained virtually, with increasingly accurate sensory feedback.

References

  1. Adams, R. J., Klowden, D. & Hannaford, B. (2001). Virtual Training for a Manual Assembly Task. Haptics-e, 2(2), 1-7. (http://www.haptics-e.org)
  2. AGEIA PhysX (2008). Acquired by NVIDIA Corporation in 2008. Available: http://www.nvidia.com/object/physx_new.html
  3. Amirabdollahian, F., Gomes, G. T. & Johnson, G. R. (2005). The Peg-in-Hole: A VR-Based Haptic Assessment for Quantifying Upper Limb Performance and Skills. Proc. of the 9th IEEE Int'l Conf. on Rehabilitation Robotics, 422-425.
  4. Bashir, A. B., Bicker, R. & Taylor, P. M. (2004). An Investigation into Different Visual/Tactual Feedback Modes for a Virtual Object Manipulation Task. Proc. of the ACM SIGGRAPH Int'l Conf. on Virtual Reality Continuum and its Applications in Industry, 359-362.
  5. Bakker, N. H., Werkhoven, P. J. & Passenier, P. O. (1993). The effects of proprioception and visual feedback on geographical orientation in virtual environments. Presence: Teleoperators and Virtual Environments, 8, 36-53.
  6. Bayazit, O. B., Song, G. & Amato, N. M. (2000). Enhancing Randomised Motion Planners: Exploring with Haptic Hints. Proc. of the 2000 IEEE Int'l Conf. on Robotics & Automation, San Francisco, 529-536.
  7. Beal, A. C. & Loomis, J. M. (1995). Absolute motion parallax weakly determines visual scale in real and virtual environments. Proc. SPIE, Bellingham, WA, 2411, 288-297.
  8. Björk, S. (2009). Gameplay Design as Didactic Design. 40th Annual Conference of the International Simulation and Gaming Association, Singapore, 2009.
  9. Boothroyd, G., Dewhurst, P. & Knight, W. (2002). Product Design for Manufacture and Assembly, 2nd Edition. ISBN 0-8247-0584-X.
  10. Bresciani, J. P., Drewing, P. & Ernst, M. O. (2008). Human Haptic Perception and the Design of Haptic-Enhanced Virtual Environments. Springer Tracts in Advanced Robotics, vol. 45, 61-106.
  11. Brooks, F. P. Jr. (1992). Walkthrough project: Final technical report to National Science Foundation Computer and Information Science and Engineering. Dept. of Computer Science, Univ. of North Carolina at Chapel Hill, TR92-026.
  12. Burdea, G. C. (1996). Force and Touch Feedback for Virtual Reality. Wiley Interscience, New York. ISBN 0-471-02141-5.
  13. Chan, W. Y., Ni, D., Pang, W. M., Qin, J., Chui, Y. P., Yu, S. C. H. & Heng, P. A. (2009). Make It Fun: an Educational Game for Ultrasound Guided Needle Insertion Training. 40th Annual Conference of the International Simulation and Gaming Association, Singapore, 2009.
  14. Coutee, A. S., McDermott, S. D. & Bras, B. (2001). A Haptic Assembly and Disassembly Simulation Environment and Associated Computational Load Optimization Techniques. Journal of Computing and Information Science and Engineering, 1, 113-122.
  15. Derrington, A. M., Allen, H. A. & Delicato, L. S. (2004). Visual mechanisms of motion analysis and motion perception. Annu. Rev. Psychol., 55, 181-205.
  16. Dreifaldt, U. & Lövquist, E. (2006). The design of a haptic exercise for post-stroke arm rehabilitation. Proc. of the 6th Int'l Conf. on Disability, Virtual Reality & Assoc. Tech., Esbjerg, Denmark, 2006.
  17. Dominjon, L., Perret, J. & Lecuyer, A. (2007). Novel devices and interaction techniques for human-scale haptics. Springer-Verlag, 257-266.
  18. Ferreira, A. & Mavroidis, C. (2006). Virtual Reality and Haptics for Nano Robotics: A Review Study. IEEE Robotics and Automation Magazine, 13(2), 78-92.
  19. Fitts, P. M. (1954). The information capacity of the human motor system in controlling the amplitude of movement. Journal of Experimental Psychology, 47, 381-391.
  20. Fritschi, M., Esen, H., Buss, M. & Ernst, M. (2008). Multi-modal VR Systems. Springer Tracts in Advanced Robotics, vol. 45, 179-206.
  21. Gerovichev, O., Marayong, P. & Okamura, A. M. (2002). The Effect of Visual and Haptic Feedback on Manual and Teleoperated Needle Insertion. Proc. of the 5th Int'l Conf. on Medical Image Computing and Computer-Assisted Intervention, Part I, LNCS 2488, 147-154.
  22. Gupta, R., Whitney, D. & Zeltzer, D. (1997). Prototyping and Design for Assembly Analysis using Multimodal Virtual Environments. Computer-Aided Design, 29(8), 585-597.
  23. Haeusler, J. (1981). Design for Assembly - State-of-the-art. Proc. of the 2nd Int'l Conf. on Assembly Automation, Brighton, 109-128. ISBN 0-90360-816-2.
  24. Ho, C. & Boothroyd, G. (1979). Design of chamfers for ease of assembly. Manufacturing Engineering Transactions, Proc. of the 7th North American Metalworking Research Conf., 345-354.
  25. Iglesias, R., Casado, S., Gutierrez, T., Garcia-Alonso, A., Yap, K. M., Yu, W. & Marshall, A. (2006). A Peer-to-peer Architecture for Collaborative Haptic Assembly. Proc. of the 10th IEEE Int'l Sym. on Distributed Simulation and Real-Time Applications (DS-RT'06), 25-34.
  26. Immersion Corporation (2008). 801 Fox Lane, San Jose, California 95131, USA. (http://www.immersion.com/)
  27. Johnson, S. & Ogilvie, G. (1972). Work Analysis. The Butterworth Group, London.
  28. Kocherry, J., Srimathveeravalli, G., Chowriappa, A. J., Kesavadas, T. & Shin, G. (2009). Improving Haptic Experience through Biomechanical Measurements. 3rd Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, USA, March 2009, 362-367.
  29. Lim, T., Dewar, R., Calis, M., Ritchie, J. M., Corney, J. R. & Desmulliez, M. (2006). A Structural Assessment of Haptic-based Assembly Processes. 1st International Virtual Manufacturing Workshop (VirMan'06), 26th March, Virginia, USA, 29.
  30. Linden Lab (1999). Second Life® virtual world. (http://secondlife.com/?v=1)
  31. Louchart, S., Lim, T. & Al-Sulaiman, H. M. (2009). Why are video-games relevant test-beds for studying interactivity for Engineers? 40th Annual Conference of the International Simulation and Gaming Association, Singapore, 2009.
  32. MacNaughton, Inc. (2008). 1815 NW 169th Place, Suite 3060, Beaverton, OR 97006, USA. http://www.nuvision3d.com
  33. Massie, T. & Salisbury, K. (1994). The PHANToM Haptic Interface: A Device for Probing Virtual Objects. ASME Winter Annual Meeting, DSC-Vol. 55-1, 295-300.
  34. Pearson, E. S. & Kendall, M. G. (1970). Gosset, William Sealy 1876-1937. Studies in the History of Statistics and Probability, Volume I, Charles Griffin and Co., 355-404.
  35. Price, B. (1990). Frank and Lillian Gilbreth and the Motion Study Controversy, 1907-1930. In: Nelson, D. (ed.), A Mental Revolution: Scientific Management since Taylor, The Ohio State University Press.
  36. Ritchie, J. M., Dewar, R. G., Robinson, G., Simmons, J. E. L. & Ng, F. M. (2006). The Role of Non-intrusive Operator Logging to Support the Analysis and Generation of Product Data using Immersive VR. Journal of Virtual and Physical Prototyping, 1(2), 117-134.
  37. Robinson, G., Ritchie, J. M., Day, P. N. & Dewar, R. G. (2007). System design and user evaluation of CoStar: an immersive stereoscopic system for cable harness design. Computer-Aided Design, 39, 245-257.
  38. Rosenberg, L. B. (1994). Virtual haptic overlays enhance performance in telepresence tasks. Proc. SPIE Telemanipulator and Telepresence Technologies Symposium, Boston, October 31, 99-108.
  39. Salisbury, K., Brock, D., Massie, T., Swarup, N. & Zilles, C. (1995). Haptic rendering: programming touch interaction with virtual objects. Proc. of the 1995 Symposium on Interactive 3D Graphics, Monterey, California, United States, April 9-12.
  40. Schaefer, A. T., Angelo, K., Spors, H. & Margrie, T. W. (2006). Neuronal oscillations enhance stimulus discrimination by ensuring action potential precision. PLoS Biology, 4(6), e163, June 2006.
  41. SensAble Technologies, Inc. (1993). 15 Constitution Way, Woburn, MA 01801, USA. (http://www.sensable.com/)
  42. Seth, A., Su, H.-J. & Vance, J. (2005). A desktop networked haptic VR interface for mechanical assembly. Proc. of IMECE'05, ASME Int'l Mechanical Engineering Congress and Exposition, Orlando, Florida, Nov. 5-11, 1-8.
  43. Sung, R. C. W., Ritchie, J. M., Lim, T. & Medellin, H. (2009). World Conference on Innovative Virtual Reality (WINVR09), February 25-26, 2009, Chalon-sur-Saone, France, Paper 713. ISBN 978-0-7918-3841-9.
  44. Thin, A. G., Hansen, L. & McEachen, D. Flow Experience and Mood States whilst Playing Body-Movement Controlled Video Games. Manuscript under review.
  45. Ueberle, M., Mock, N. & Buss, M. (2004). VISHARD10, a Novel Hyper-Redundant Haptic Interface. Proc. of the 12th Int'l Sym. on Haptic Interfaces for Virtual Environment and Teleoperator Systems (HAPTICS'04), 27-28 March 2004, 58-65.
  46. Unger, B. J., Nicoladis, A., Berkelman, P. J., Thompson, A., Klatzky, R. L. & Hollis, R. L. (2001). Comparison of 3-D Haptic Peg-in-Hole Tasks in Real and Virtual Environments. Proc. of the IEEE/RSJ Int'l Conf. on Intelligent Robots and Systems, 1751-1756.
  47. VTK, The Visualization ToolKit (1998). Kitware, Inc., 28 Corporate Drive, Suite 204, Clifton Park, New York 12065, USA. Available: http://www.kitware.com
  48. Yoshikawa, T., Kawai, M. & Yoshimoto, K. (2003). Toward Observation of Human Assembly Skill Using Virtual Task Space. Experimental Robotics VIII, STAR 5, 540-549.
