
Virtual and Augmented Reality: A New Approach to Aid Users of Myoelectric Prostheses

Written By

Alcimar Barbosa Soares, Edgard Afonso Lamounier Júnior, Adriano de Oliveira Andrade and Alexandre Cardoso

Submitted: 16 February 2012 Published: 17 October 2012

DOI: 10.5772/50600


1. Introduction

During the past decades, great effort has been devoted to devising new strategies for the control of artificial limbs fitted to patients with congenital defects or who have lost their limbs in accidents or surgery [1-6]. Most of that work was dedicated to minimizing the great mental effort needed to control the prosthetic limb, especially during the first stages of training. When working with myoelectric prostheses, that effort increases dramatically. These devices use EMG signals (the electrical manifestation of the neuromuscular activation associated with a contracting muscle), collected from remnant muscles, to generate control inputs for the artificial limb. Since these devices use a biological signal to control their movements, one might expect them to be much easier to control. In practice, however, the control is very unnatural and demands great mental effort, especially during the first months after fitting [2, 7, 8]. As a result, a number of patients abandon their devices very soon. To overcome these problems, different techniques have been tried in an attempt to devise better strategies for myoelectric control.

This chapter describes the use of Virtual Reality (VR) systems to create training environments dedicated to users of prosthetic devices. Such VR systems generally simulate a prosthesis that reacts to commands issued by the user. A more sophisticated system proposed by the authors is also described. Known as “The Virtual Myoelectric Prosthesis”, it uses EMG signals to control a virtual prosthesis in an Augmented Reality (AR) environment, in real time, providing the user with a more natural and intuitive training environment. The overall aim is to reproduce the operation of a real prosthesis, in an immersive AR environment, using a virtual device that operates in a similar fashion to the real one. Moreover, since real upper limb prostheses are relatively heavy and can become uncomfortable and cumbersome, especially during the first stages of fitting, the research team believes that the use of a virtually weightless and fully controllable device can help reduce the great physical and mental effort usually required, especially in the first trials.


2. Myoelectric prostheses

The human body has been considered a perfect machine, in which all parts work in harmony with one another. Most of us can control this “machine” without much effort, until some disturbance caused by disease or injury results in the loss of some of its functionality.

The absence of limbs caused by trauma or congenital disorders can affect our lives profoundly. Simple tasks such as walking or dressing can become extremely difficult or even impossible to execute. There is no doubt that the best solution for the loss of a limb would be some kind of genetic manipulation that stimulates the regeneration of tissue. However, while this is not possible, the best we can do is to restore some of the lost functionality by means of artificial limbs.

For centuries, mankind has sought ways to replace lost limbs with mechanical devices. Several ancient designs can be found in museums and libraries. However, the first artifact to be formally named an artificial limb was a Roman prosthesis, made of wood and bronze, which appeared around 300 BC [9]. During the Middle Ages, while the poor wore “wooden legs”, which were simple, inexpensive and stable, rich nobles used devices made of iron, which were more decorative than functional. In 1818, Peter Ballif designed the first prosthesis actuated by movements of healthy parts of the body; before this, upper limb prostheses were heavy and depended on an able hand for operation [10]. Thereafter, a number of experiments were carried out seeking the “perfect prosthesis”, a device similar to what Wolfe envisioned in his 1952 novel Limbo (New York: Random House) [11]:

“They had perfected an artificial limb superior in many ways to the real thing, integrated into the nerves and muscles of the stump, powered by a built-in atomic energy plant, equipped with sensory as well as motor functions”

As we know, this prediction has yet to be completely materialized, but much has been done since then. Due to the great number of casualties of World War II, the government of the United States created, in 1945, a research and development program in which scientists and engineers were deeply involved in projects aimed at the development of artificial limbs for amputees [10].

Another fact that led to the acceleration of research in the area was the large number of congenital defects caused by the drug Thalidomide. As described by Soares [10], it was synthesized by the German laboratory Chemie Grünenthal in 1957 and marketed worldwide between 1958 and 1962. The drug was prescribed to minimize sickness during pregnancy, but consumers were not warned that it could cross the placenta and affect the fetus. This oversight had a catastrophic effect: use of the drug, especially during the first trimester of pregnancy, killed thousands of babies. Those who survived experienced birth defects such as deafness, blindness, disfigurement and, especially, the shortening or absence of limbs. Responding to this tragedy, several research centers intensified their efforts towards the design of artificial limbs in an attempt to improve the lives of those children.

Also during that period, Russian scientists introduced a prosthetic hand controlled by a signal generated by the activity of muscles remaining in amputated limbs [8]. That type of control has been described as “myoelectric control” and the prosthesis, by extension, as a “myoelectric prosthesis”.

2.1. Control strategies

The control of prostheses can be considered one of the most interesting challenges related to prosthetics. Ideally, a prosthetic limb should be controlled without any effort from the user, similar to the subconscious control of a natural limb.

Currently, there are two main strategies for controlling artificial limbs: biomechanical and bioelectrical. In the former, the motion of parts of the body results in the activation of the limb, whereas, in the latter, biosignals generated from the electrical activity of muscles are detected and interpreted in order to generate commands for controlling the prosthesis. Nevertheless, there is ongoing research seeking other forms of control based on more natural strategies, such as those that employ brain or neuronal activity together with sensory feedback [5-7, 12-14].

As described earlier, the first prostheses were generally passive devices that relied on intact parts of the body for positioning and control. This extremely successful design allows the user to control the device so that the movement of a part of the body is reflected in movements of parts of the prosthesis. Despite some modifications, the design remains basically the same nowadays, and it is still the most popular control mechanism among users [10]. The reasons for such success are not well established, but according to Doeringer and Hogan [15] some of the key factors are: it results in a relatively inexpensive prosthesis; the final prosthesis is not too heavy; and, after training, the user begins to use the prosthesis as a natural extension of his body, having, for example, a notion of the weight and size of the prosthetic limb. Kruit and Cool [16] described the main drawbacks: the mechanism of harnesses used to propagate the movements of the body is usually uncomfortable; the movement of the prosthesis requires significantly large forces; and the number of control inputs is limited, and thus the number of degrees of freedom of the prosthesis is also limited.

An alternative to body-powered control is myoelectric control, which uses the electrical activity of muscle contraction (the electromyographic signal) as the primary source of control. Prostheses that use this type of control typically do not require cables and, in some situations, there is no need for suspension straps. The operation of a myoelectric device can be summarized as follows: the brain sends commands, i.e., neuronal impulses that travel through nerves and reach the endplate of a given muscle, which, in turn, causes the muscle to contract; the electrical muscle activity is then captured by electrodes (normally housed in a socket attached to the stump), interpreted by customized programs in a microcontroller, and used to activate the actuator of the prosthesis.
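To make this chain concrete, the sketch below shows the skeleton of a single-channel myoelectric control loop in Python. It is a minimal illustration only, not the firmware of any particular prosthesis: `read_emg_sample` and `set_motor` are hypothetical stand-ins for the hardware interfaces, and the window length and threshold are arbitrary placeholders.

```python
from collections import deque

WINDOW = 200       # samples per analysis window (e.g. 200 ms at 1 kHz)
THRESHOLD = 0.05   # activation threshold on the envelope (arbitrary units)

def envelope(samples):
    """Mean absolute value: a simple estimate of EMG amplitude."""
    return sum(abs(s) for s in samples) / len(samples)

def control_loop(read_emg_sample, set_motor):
    """Poll the EMG channel and drive the actuator (hypothetical interfaces)."""
    buffer = deque(maxlen=WINDOW)
    while True:
        buffer.append(read_emg_sample())       # one new conditioned sample
        if len(buffer) == WINDOW and envelope(buffer) > THRESHOLD:
            set_motor(1.0)                     # muscle judged active: e.g. close hand
        else:
            set_motor(0.0)                     # no activity: stop the actuator
```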

Many myoelectric prostheses employ a type of control called “two-site two-state”, in which electrode pairs are placed over two distinct muscles. The contraction of one of these muscles produces the opening of the hand; the antagonist muscle is used in the same way to control the closing of the hand. As pointed out by Scott and Parker [8], this approach works in a manner analogous to the human body, i.e., two antagonistic muscles (or groups of muscles) control the movement of a joint. However, as patients must learn to generate independent contractions of the muscles, which requires a high degree of concentration, the training can be lengthy and demand a lot of mental effort. There are also situations in which it is not possible to find two available groups of muscles, or in which there is more than one joint to be controlled. For these situations, other control approaches have been developed [17]; for instance, the “one-site three-state” scheme, in which a weak contraction of a muscle closes the hand, a strong contraction opens it, and the absence of muscle activity stops it (both schemes are sketched in code after Figure 1). Figure 1 shows an example of a hand prosthesis controlled by electromyographic signals captured by four pairs of dome electrodes distributed around the residual limb [18].

Figure 1.

An experimental setup for a myoelectric prosthesis, developed at the University of New Brunswick, Canada (extracted from [18]).
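The two threshold schemes just described can be captured in a few lines. The sketch below is illustrative only: it assumes amplitude estimates have already been computed for each electrode site, and the threshold values are placeholders that, in practice, would be tuned for each user.

```python
def two_site_two_state(open_amp, close_amp, threshold=0.05):
    """Two antagonist sites: one opens the hand, the other closes it."""
    if open_amp > threshold and open_amp >= close_amp:
        return "OPEN"
    if close_amp > threshold:
        return "CLOSE"
    return "STOP"

def one_site_three_state(amp, low=0.05, high=0.2):
    """One site: weak contraction closes, strong opens, no activity stops."""
    if amp >= high:
        return "OPEN"
    if amp >= low:
        return "CLOSE"
    return "STOP"
```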

Currently, there are a number of methods using proportional control, based on the electrical activity of muscles, to control the speed, torque and position of prosthetic joints. However, due to the nature of the myoelectric signal, errors and inaccuracies may occur [19]. Myoelectric signals can be detected using basically two types of electrodes: surface electrodes, positioned on the skin, and needle electrodes, inserted at relevant positions in the muscle. In both cases, the electrodes produce a potential difference relative to a reference (typically another electrode located elsewhere on the body). This voltage is the result of the asynchronous activation of hundreds of muscle fibers. The signal resembles random noise whose amplitude is modulated by a voluntary input. Its shape depends on variables such as the strength and speed of activation, the positioning and types of electrodes used in its detection, and the electronic circuits used for acquisition, amplification and processing [20]. These factors make the translation of myoelectric activity into commands for a prosthetic limb a challenge. Moreover, the generation of myoelectric patterns must be learned by the user, a task that requires concentration, regular training and a great amount of physical effort.
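For proportional control, a common amplitude estimate is the root-mean-square (RMS) value of the signal over short windows. The sketch below shows one possible realization; the linear mapping from RMS amplitude to joint speed, and the rest and maximum levels, are illustrative assumptions rather than any particular device's calibration.

```python
import numpy as np

def rms_envelope(emg, window=200):
    """RMS amplitude of the EMG over non-overlapping windows."""
    emg = np.asarray(emg, dtype=float)
    n = len(emg) // window
    frames = emg[:n * window].reshape(n, window)
    return np.sqrt(np.mean(frames ** 2, axis=1))

def proportional_speed(rms, rest_level=0.01, max_level=0.5):
    """Map RMS amplitude linearly onto a 0..1 joint-speed command."""
    return np.clip((rms - rest_level) / (max_level - rest_level), 0.0, 1.0)
```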

A common way of training an individual to generate myoelectric patterns is to use feedback software, which provides the user with visual feedback on the relation between the force of the contraction and the amplitude of the generated myoelectric signal. The main drawback of this strategy is that it gives the user neither physiological feedback nor a sense of proprioception. Recent studies have suggested that virtual and augmented reality [21] may be relevant tools to address the limitations of conventional training techniques. Their main advantage is the ability to simulate the physical presence of the artificial limb in the real world, as well as in imaginary worlds. Moreover, some simulations may include additional sensory information, such as sound through speakers or headphones. It is also possible to include tactile information, generally known as force feedback.


3. Virtual and augmented reality

Virtual Reality (VR) can be defined as an advanced computer interface in which the user can, in real time, navigate within a three-dimensional environment and interact with its virtual objects. Such interaction is achieved in a very intuitive and natural way, supported by multisensory devices [22].

To illustrate this concept, consider Figure 2, in which a user is shown standing inside a research laboratory. Since she is equipped with multisensory devices (a Head Mounted Display (HMD) and hand (glove) sensors), a computer-based system provides her with the feeling of being immersed in a different environment.

The system, known as BioSimMER© (from Sandia National Laboratories) [23], is used to train rescue personnel to respond to terrorist attacks. The screen at the top shows the working environment as displayed to the eyes of the health professional, in which the virtual patient exhibits realistic symptoms. Such facilities are not supported by traditional computer interfaces.

Therefore, to achieve the high level of natural interaction required by VR systems, it is important to provide the user with the feeling of immersion and the ability to interact. To meet these requirements, VR developers must guarantee: 1) life-like 3D images of objects, rendered from the user's perspective; and 2) the ability to track the user's motions, particularly head and eye movements, and correspondingly adjust the images on the output device to reflect the change in perspective.

Figure 2.

Experiment with virtual reality techniques and devices (extracted from [23]).

Augmented Reality (AR) is a technique that allows the integration of virtual objects within a real physical environment. Interaction is again supported by multisensory equipment. Essentially, a real scene, captured by a digital camera, is “augmented” with the insertion of virtual objects [24]. Figure 3 illustrates this concept.

Figure 3.

An example of Augmented Reality, extracted from [25]: (a) positioning a fiduciary marker on a mechanical part; (b) visualization of the heat distribution through the user's glasses.

In Figure 3(a), an engineer uses a square fiduciary marker to identify the heat distribution throughout a mechanical part, as shown in Figure 3(b). This “virtual” information, however, can only be seen by him, through the equipment and glasses he carries.

A very well-known framework to support AR is the ARToolKit [26]. It provides Computer Vision techniques to calculate the position and orientation of a digital camera in relation to fiduciary markers. The augmentation is produced after a series of transformations, as shown in Figure 4. First, the real video image is converted into a binary one. This image is then processed to find square black regions (fiduciary markers: regions whose outline can be fitted by four line segments) containing an image pattern that is compared against patterns stored in a database. Next, the algorithm uses the known size of the squares and the pattern orientation as the basis for the coordinate frame, and calculates the real position of the digital camera in relation to the physical marker. Finally, the 3D objects are drawn over the fiduciary markers and the resulting image is sent to the display (a code sketch of this pipeline follows Figure 4).

Figure 4.

ARToolKit workflow.
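The sketch below mimics this workflow using OpenCV as a stand-in for ARToolKit's internal routines (ARToolKit itself is a C library). Pattern matching against the stored database and corner ordering are omitted; `camera_matrix` and `dist_coeffs` are assumed to come from a prior camera calibration, and the marker size is an arbitrary example value.

```python
import cv2
import numpy as np

MARKER_SIZE = 0.08  # marker edge length in metres (assumed value)

def detect_marker_pose(frame, camera_matrix, dist_coeffs):
    """Binarize the frame, find square black regions, estimate camera pose."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 100, 255, cv2.THRESH_BINARY_INV)

    # OpenCV 4 signature: returns (contours, hierarchy)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for cnt in contours:
        # Keep only outlines that can be fitted by four line segments
        quad = cv2.approxPolyDP(cnt, 0.03 * cv2.arcLength(cnt, True), True)
        if len(quad) != 4 or cv2.contourArea(quad) < 1000:
            continue

        # Known square in marker coordinates -> camera pose via PnP
        # (pattern identification and corner ordering are omitted here)
        half = MARKER_SIZE / 2
        object_pts = np.array([[-half,  half, 0], [ half,  half, 0],
                               [ half, -half, 0], [-half, -half, 0]],
                              dtype=np.float32)
        image_pts = quad.reshape(4, 2).astype(np.float32)
        ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts,
                                      camera_matrix, dist_coeffs)
        if ok:
            return rvec, tvec  # pose used to draw the 3D object over the marker
    return None
```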

Nowadays, VR and AR systems are intensively used for training and simulation. According to [27] and [28], the main reasons for that are:

  • It provides “learning by doing”: according to pedagogical studies, the learning curve and the amount of knowledge acquired are improved when the apprentice plays an active role in the process;

  • It supports virtual, interactive experiments and simulations, replacing physical counterparts that could pose health hazards or be too expensive in real life;

  • It allows training to be executed outside classes and clinics; and

  • It inspires creativity.

The strategies adopted by systems like the ARToolKit are promising and relatively simple to incorporate into final applications. However, the need for physical markers can limit their application in many areas. Hence, interfaces able to represent the real environment, capture movements and sounds, and transform them into actions for interacting with virtual objects have been the focus of much recent research seeking a human-computer dialogue closer to the natural one. This is why the expression “natural user interface” has emerged as a new computer interaction paradigm. It focuses on human abilities such as touch, vision and voice, and on higher cognitive functions such as expression, perception and recall [29]. The main objective is to give physical meaning to digital information, supporting data manipulation with bare hands, gestures, voice commands and pattern recognition.

Recently, Santos et al. [30] presented an application that uses gestures to interact with virtual objects in Augmented Reality, based on the Kinect© (a motion-sensing input device developed by Microsoft Inc.). The application is not limited by the environment, lighting or skin color, and does not require fiduciary markers. The system allows the user to operate menus and interact with virtual objects solely by hand gestures (Figure 5).

Figure 5.

Natural User Interface with Augmented Reality: (a) User selecting menu options; and (b) User manipulating a virtual object (extracted from [30]).

In recent years, a steady growth in the use of Virtual and Augmented Reality in health care has been observed [31, 32]. There are a number of examples in the literature; to illustrate the technique, let us consider two of them.

Payandeh and Shi [33] presented a mechanics-based, interactive, multi-modal environment designed to teach basic suturing and knotting techniques for simple skin or soft-tissue wound closure. Two haptic devices are used for force feedback, simulating the experience of suturing real tissue (Figure 6).

That realistic feeling was provided by a number of computer-based techniques: a mass-spring system to simulate the deformable tissue (skin), mechanics-based techniques to simulate the deformations of a linear object (the suturing material), and collision detection for the interactions between the soft tissue and the needle. Figure 7 shows a pre-wound scene (a) and the result after the virtual suture (b).
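To give a flavor of the mass-spring technique, the sketch below advances a network of point masses connected by springs through one explicit Euler integration step. It is a generic illustration of the method, not the authors' implementation; all parameter values are arbitrary.

```python
import numpy as np

def mass_spring_step(positions, velocities, edges, rest_len,
                     k=50.0, damping=0.98, mass=0.01, dt=1e-3):
    """Advance a mass-spring mesh by one explicit Euler step.

    positions, velocities: (n, 3) arrays; edges: list of (i, j) index
    pairs; rest_len: natural length of each spring.
    """
    forces = np.zeros_like(positions)
    for (i, j), L0 in zip(edges, rest_len):
        d = positions[j] - positions[i]
        length = np.linalg.norm(d)
        if length == 0.0:
            continue
        f = k * (length - L0) * d / length   # Hooke's law along the edge
        forces[i] += f                        # pulls i towards j when stretched
        forces[j] -= f
    velocities = damping * (velocities + dt * forces / mass)
    return positions + dt * velocities, velocities
```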

Virtual and Augmented Reality have also been studied as tools for phobia treatment [34]. As an example, consider the system described in [35]. The project, conducted with the support of a psychologist, aims to design a system that gradually confronts the patient with the object of his phobia. Clinical studies have shown that some patients cannot cope, or simply do not progress in the treatment, if exposed to a real arachnid in the initial sessions. Thus, AR is used to present the patient with a virtual object reminiscent of a spider (like a cartoon), which gradually becomes a very ‘realistic’ virtual spider (built with photorealistic modeling techniques). Figure 8(a) shows a potential user wearing the system apparatus and Figure 8(b) shows the image seen by the user.

Figure 6.

VR multi-modal experimental setup for simulating surgical sutures (extracted from [33]).

Figure 7.

(a) Virtual pre-wound tissue for suturing; (b) the virtual wound closed (extracted from [33]).

Figure 8.

(a) User wearing an HMD for arachnophobia treatment; (b) the AR image, as seen by the user.

Based on the discussions above, we can infer that VR and AR incorporate a number of features with great potential to overcome some of the difficulties associated with the training of prosthetic users.


4. Virtual prostheses

As described earlier, great effort has been devoted to devising new strategies for the control of artificial limbs fitted to patients with congenital defects or who have lost their limbs in accidents or surgery. In general, these devices do not properly resemble their real counterparts, do not react in the same manner, do not provide proper feedback, and cannot be controlled through “natural” interfaces, i.e., signals emanating from the central nervous system. Therefore, a number of difficulties arise when a new user tries to control an artificial limb, since he/she has to devise a completely new strategy to generate input signals for the prosthesis so that it will act according to his/her wishes. This leads to a lengthy, tiring and sometimes frustrating training period, and it holds for the great majority of the strategies for prosthesis control designed to date.

Recently, a number of research groups turned their attention to VR and AR, in an attempt to overcome some of those problems. Although many works can be found in the literature, we have chosen just a few to illustrate the concept.

Pons et al. [36] describe the use of VR to support the training process for a multifunctional myoelectric hand prosthesis (MANUS) capable of up to four grasping modes (cylindrical, precision, lateral and hook grasps), in addition to wrist pronation-supination.

Figure 9.

One of the MANUS users performing a combined cylindrical grasp and wrist rotation (extracted from [36]).

As expected, multifunctional prostheses pose an additional problem for users: the more dexterous the device, the higher the number of command channels required to control it. As a result, a large number of different EMG commands, generally obtained from extra EMG channels, is required for successful operation of the prosthesis. To minimize the number of channels, the authors proposed a three-bit ternary EMG command strategy. The users were asked to produce EMG bursts (by sudden contraction of a single muscle) and, provided proper EMG thresholds could be defined, each burst was classified into one of three levels, given the digital values “0”, “1” or “2” (no signal, low, high), each corresponding to one ternary bit. Thus, by generating three bursts, the user could produce up to 27 different combinations (commands) from a single muscle. However, since commands starting with “0” (i.e., “0XY”) were not valid, the three-bit ternary strategy allowed the generation of 18 effective commands. This means that, from a single muscle, the user could control up to 18 different functions/actions of the prosthesis (a sketch of the decoding logic is given below). This is, however, no easy language to learn. Hence, a special training device, based on a VR simulation of the multifunctional prosthesis, was created to support the learning of this “EMG command language”. Only after the training process was finished was the prosthesis fitted and real manipulative operation started. The authors report that all of the volunteers were able to successfully perform basic commands after about 45 minutes.
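The decoding logic of this command language is simple enough to express directly. The sketch below is an illustrative reconstruction based on the description above, not the MANUS implementation; the burst-amplitude thresholds are placeholders.

```python
from itertools import product

def classify_burst(amplitude, low=0.05, high=0.2):
    """Map a burst amplitude onto one ternary digit: 0 (none), 1 (low), 2 (high)."""
    if amplitude >= high:
        return 2
    if amplitude >= low:
        return 1
    return 0

def decode_command(burst_amplitudes):
    """Three successive bursts -> command index, or None for an invalid '0XY'."""
    d = [classify_burst(a) for a in burst_amplitudes]
    if d[0] == 0:
        return None
    return d[0] * 9 + d[1] * 3 + d[2]   # base-3 value of the three digits

# Sanity check: 27 raw combinations, 18 of them valid commands.
assert len([c for c in product(range(3), repeat=3) if c[0] != 0]) == 18
```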

In similar fashion, Resnik et al. [37] describe the use of VR as an aiding tool for training users of advanced upper-limb prostheses. The device, known as the DEKA Arm (DEKA Research & Development Corporation), offers users 10 powered degrees of movement (Figure 10a). A VR environment (Figure 10b) was created to allow users to practice controlling an avatar using the same controls designed to operate the DEKA Arm in the real world.

Figure 10.

(a) DEKA Arm displayed on a manikin; (b) the VR avatar (extracted from [37]).

The authors report that the VR environment allows gradual acclimatization to the arm, as experience with the arm-control scheme prior to use of the physical arm allows a staged introduction of the new elements of the system. However, the system did not allow interaction with virtual objects; it was not possible, for instance, to manipulate an object with the virtual hand. Nevertheless, the system proved to be an important asset for upper-limb prosthesis users who must master a large number of controls, and for those who, due to cognitive deficits, need a structured learning environment.

4.1. The virtual myoelectric prosthesis

Although VR has been extensively used as an aiding tool for users of prosthetic devices, the interaction with the virtual world still needs to be improved in order to provide a truly immersive training environment. To this end, the research group headed by the authors has developed new techniques for VR interaction and for the detection and processing of EMG signals, in order to extract the commands issued by the user, which, in turn, can be used to control the movements of a device in a VR environment [21, 28, 30, 38]. Although a purely non-immersive VR environment showed good results, it was thought that an Augmented Reality (AR) environment would provide a more realistic experience. Hence, an AR environment was designed in which images of the virtual device (the prosthesis) are combined with images of the real world, providing a realistic environment for training upper limb prosthetic users [39]. A simplified block diagram of the system is shown in Figure 11.

Figure 11.

The authors’ approach for a Virtual Myoelectric Prosthesis (extracted from [39]).

In the system, the user is fitted with a head-mounted device that includes a camera, for capturing the real images from the user's viewpoint, and a display to show the mixed (augmented: real and virtual) images. The EMG signals are collected and processed to generate inputs to the VR unit [21]. A processing center decides when to update both the virtual arm (Figure 12) and the augmented reality image, and then sends them to the graphical user interface (the head-mounted display).

Figure 12.

A Virtual Prosthesis designed by the authors.

During operation, the camera captures the image and locates a marker on the user's shoulder. The algorithm then searches for the virtual object that corresponds to that marker (Figure 13) and inserts it into the captured image of the real world.

Figure 13.

Image of a virtual object combined with the real world scene – an outsider point of view (extracted from [39]).

As described in [39], the control inputs for the AR environment are generated from EMG signals collected from remnant muscles. The raw EMG signal, detected by surface electrodes, is conditioned and processed to determine which movement the user wants to perform. To do so, the areas of activity in the EMG data are detected (windowing) and the resulting signal is processed to generate a set of features for an artificial neural network classifier. Basically, each EMG contraction is represented by a set of auto-regressive (AR) coefficients calculated according to a modified algorithm described in [38]. According to the authors, a neural network was chosen as the classifier because of its ability to learn and later recognize, in real time, signals as belonging to the same class of movement. Depending on the level of amputation, different users may generate different levels of contraction of the remaining part of the limb for the same class of movement; moreover, even if a single user performs only isometric or isotonic contractions, no two contractions for the same movement will be identical. The neural network was trained with four classes of movements (elbow flexion, elbow extension, wrist pronation and wrist supination), and the results showed near-perfect performance of the classifier (95% to 100% success rate). The output of the neural network is then used as the control input (position and motion) to the virtual device, which is rendered and mixed with the real-world scene, as shown in Figure 14 (a sketch of this processing chain is given below).
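The sketch below illustrates this processing chain with off-the-shelf substitutes: a plain least-squares AR fit in place of the modified algorithm of [38], and scikit-learn's MLPClassifier in place of the authors' custom network. The class names follow the four movements listed above; everything else is an assumption for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

CLASSES = ["elbow_flexion", "elbow_extension",
           "wrist_pronation", "wrist_supination"]

def ar_coefficients(x, order=4):
    """Fit x[n] ~ sum_k a[k]*x[n-k] by least squares; return the a[k]."""
    x = np.asarray(x, dtype=float)
    X = np.column_stack([x[order - k: len(x) - k] for k in range(1, order + 1)])
    a, *_ = np.linalg.lstsq(X, x[order:], rcond=None)
    return a

def feature_matrix(windows, order=4):
    """One AR-coefficient vector per detected EMG activity window."""
    return np.array([ar_coefficients(w, order) for w in windows])

# Training and use (train_windows/train_labels: labelled contractions):
#   clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)
#   clf.fit(feature_matrix(train_windows), train_labels)
#   command = clf.predict(feature_matrix([new_window]))[0]  # drives the virtual arm
```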

Figure 14.

User´s point of view within the AR environment (extracted from [39]).

Note that this system allows the user to interact within the virtual environment: the virtual myoelectric prosthesis can touch and grab other virtual objects embedded in the real scene (such as the cube and the kettle in Figure 14). Also, strong cognitive feedback is provided by this real-time mixture of virtual objects with the real environment, giving the feeling that it is almost possible to touch the virtual arm with the real one, and vice-versa.


5. Conclusion

This chapter presented an overview of the human search for artificial devices capable of restoring, if not all, at least part of the functionality lost to disease, congenital disorder or trauma resulting in the loss of a limb. Focusing on upper limb prostheses, a series of sophisticated technical solutions has been proposed over the past decades to design devices whose behavior and control approach those of their healthy natural counterparts. However, as described throughout this chapter, operating a highly complex artificial limb is not a simple task. This is especially true for myoelectric multifunctional prostheses with many degrees of freedom. Since the necessary control commands can, in most instances, be very different from the “natural” commands, learning how to produce them is extremely difficult and time consuming. With the advent of Virtual and Augmented Reality, those technologies have been proposed as relevant tools to address some of the limitations of conventional training techniques. It is possible to design a virtual device very similar, in shape and behavior, to a real one. It is even possible to collect commands from the real world (EMG signals generated by remnant muscles) and use them as inputs to control the actions of a virtual prosthesis in an Augmented Reality environment, according to the training stage of the user or any other setup defined by the therapist. In so doing, these techniques allow for a considerable reduction of the physical and mental effort usually necessary to master the control of a prosthetic device.


6. Future directions

Despite the progress achieved so far, the authors believe that, as technology advances, the use of virtual and augmented reality to aid the control of myoelectric prostheses should also undergo continuous improvement. Future developments should focus on issues such as: (i) improving the modeling of the virtual devices, in order to increase the sense of realism when compared to actual prostheses; (ii) new adaptive protocols for controlling the virtual prosthesis, so that it can emulate different strategies and different joint actuators; and (iii) the design of new devices that provide physiological feedback, allowing the user to "feel" what the virtual prosthesis is actually doing, thus increasing the feeling of a complete blend between the real and virtual worlds.


Acknowledgement

The authors would like to express their gratitude to “Coordenação de Aperfeiçoamento de Pessoal de Nível Superior” (CAPES - Brazil), “Conselho Nacional de Desenvolvimento Científico e Tecnológico” (CNPq – Brazil) and “Fundação de Amparo à Pesquisa do Estado de Minas Gerais” (FAPEMIG – MG – Brazil) for the financial support.

References

  1. Davoodi R, Loeb GE. Real-Time Animation Software for Customized Training to Use Motor Prosthetic Systems. IEEE Transactions on Neural Systems and Rehabilitation Engineering 2012;20(2):134-142.
  2. Scheme E, Englehart K. Electromyogram Pattern Recognition for Control of Powered Upper-Limb Prostheses: State of the Art and Challenges for Clinical Use. Journal of Rehabilitation Research & Development 2011;48(6):643-660.
  3. Simon AM, Hargrove LJ, Lock BA, Kuiken TA. Target Achievement Control Test: Evaluating Real-time Myoelectric Pattern-recognition Control of Multifunctional Upper-limb Prostheses. Journal of Rehabilitation Research & Development 2011;48(6):619-628.
  4. Su Y, Fisher MH, Wolczowski A, Bell GD, Burn DJ, Gao RX. Towards an EMG-Controlled Prosthetic Hand Using a 3-D Electromagnetic Positioning System. IEEE Transactions on Instrumentation and Measurement 2007;56(1):178-186.
  5. Geake TH. An Advanced Feedback System for Myoelectrically Controlled Prostheses. MPhil Thesis. School of Computer Science and Electronic Systems, Kingston University, UK; 1994.
  6. Patterson PE, Katz JA. Design and Evaluation of a Sensory Feedback System that Provides Grasping Pressure in a Myoelectric Hand. Journal of Rehabilitation Research and Development 1992.
  7. Paciga JE, Richard PD, Scott RN. Error Rate in Five-state Myoelectric Control Systems. Medical and Biological Engineering and Computing 1980;18(3):287-290.
  8. Scott RN, Parker PA. Myoelectric Prostheses: State of the Art. Journal of Medical Engineering & Technology 1988;12:143.
  9. Lamb DH, Law H. Upper-Limb Deficiencies in Children: Prosthetic, Orthotic and Surgical Management. Boston, USA: Little, Brown and Company; 1987.
  10. Soares AB. Shape Memory Alloy Actuators for Upper Limb Prostheses. PhD Thesis. University of Edinburgh; 1997.
  11. Childress DS. Powered Limb Prostheses: Their Clinical Significance. IEEE Transactions on Biomedical Engineering 1973;BME-20(May):200-207.
  12. Cram JR. The History of Surface Electromyography. Applied Psychophysiology and Biofeedback 2003;28:81.
  13. Horowitz S. Biofeedback Applications: A Survey of Clinical Research. Alternative and Complementary Therapies 2006;12(6):275-281.
  14. Oskoei MA, Hu H. Support Vector Machine-based Classification Scheme for Myoelectric Control Applied to Upper Limb. IEEE Transactions on Biomedical Engineering 2008(Aug):1956-1965.
  15. Doeringer JA, Hogan N. Performance of Above Elbow Body-Powered Prostheses in Visually Guided Unconstrained Motion Tasks. IEEE Transactions on Biomedical Engineering 1995;42(6):621-631.
  16. Kruit J, Cool JC. Body-powered Hand Prosthesis with Low Operating Power for Children. Journal of Medical Engineering & Technology 1989;13:129.
  17. Andrade AO. Methodology for EMG Signal Classification for the Control of Artificial Members. MSc Dissertation. Federal University of Uberlândia, Brazil; 2000.
  18. Kyberd PJ, Lemaire ED, Scheme E, MacPhail C, Goudreau L, Bush G, Brookeshaw M. Two-degree-of-freedom Powered Prosthetic Wrist. Journal of Rehabilitation Research & Development 2011;48(6):609-618.
  19. Hoff LA. Errors in Frequency Parameters of EMG Power Spectra. IEEE Transactions on Biomedical Engineering 1991;38:1077.
  20. O'Neill PA, Morin EL, Scott RN. Myoelectric Signal Characteristics from Muscles in Residual Upper Limbs. IEEE Transactions on Rehabilitation Engineering 1994;2:266.
  21. Soares AB, Andrade AO, Lamounier Jr EA, Carrijo R. The Development of a Virtual Myoelectric Prosthesis Controlled by an EMG Pattern Recognition System Based on Neural Networks. Journal of Intelligent Information Systems 2003;21(2):127-141.
  22. Cardoso A, Kirner C, Lamounier Jr EA, Kelner J. Technologies for the Development of Virtual and Augmented Reality Systems (Portuguese version). Editora Universitária da UFPE; 2007.
  23. Sandia National Laboratories. BioSimMER: Virtual Life-saver. http://www.sandia.gov/media/NewsRel/NR1999/biosim.htm (accessed 25 April 2012).
  24. Kirner C, Kirner TG. Virtual Reality and Augmented Reality Applied to Simulation Visualization. In: El Sheikh AAR, Al Ajeeli A, Abu-Taieh EMO (eds.) Simulation and Modelling: Current Technologies and Applications. 1st ed. Hershey, NY: IGI Publishing; 2007.
  25. Weidlich D, Scherer S, Wabner M. Analyses Using VR/AR Visualization. IEEE Computer Graphics and Applications 2008(Sept/Oct):84-86.
  26. GNU General Public License. http://www.gnu.org/licenses/gpl.html (accessed 5 February 2007).
  27. Azuma R, Baillot Y, Behringer R, Feiner S, Julier S, MacIntyre B. Recent Advances in Augmented Reality. IEEE Computer Graphics and Applications 2001;21(6):34-47.
  28. Soares AB, Lamounier Jr EA, Lopes K, Andrade AO. Augmented Reality: A Tool for Myoelectric Prostheses. In: ISEK 2008: XVIIth Congress of the International Society of Electrophysiology and Kinesiology, 18-21 June 2008, Niagara Falls, Canada.
  29. Liu W. Natural User Interface: Next Mainstream Product User Interface. In: 2010 IEEE 11th International Conference on Computer-Aided Industrial Design & Conceptual Design (CAIDCD), 17-19 November 2010; vol. 1, 203-205.
  30. Santos E, Lamounier Jr EA, Cardoso A. Augmented Reality Interaction with Kinect Device. In: Proceedings of the XIII Brazilian Symposium on Virtual and Augmented Reality (SVR 2011), 23-26 May 2011.
  31. Riva G. Virtual Reality for Health Care: The Status of Research. CyberPsychology & Behavior 2002;5(3):219-225.
  32. Marescaux J, Rubino F, Arenas M, Mutter D, Soler L. Augmented-Reality-Assisted Laparoscopic Adrenalectomy. The Journal of the American Medical Association 2004;292(18):2214-2215.
  33. Payandeh S, Shi F. Interactive Multi-Modal Suturing. Virtual Reality 2010;14(4):241-253.
  34. Juan MC, Alcaniz M, Monserrat C, Botella C, Banos R, Guerrero B. Using Augmented Reality to Treat Phobias. IEEE Computer Graphics and Applications 2005;25(6):31-37.
  35. Lima L, Cardoso A, Nakamoto P, Lopes E, Lamounier Jr EA. Development of a Computer Tool to Aid Arachnophobia Treatment with Augmented Reality. In: Proceedings of the XIII Brazilian Symposium on Virtual and Augmented Reality (SVR 2011), 23-26 May 2011.
  36. Pons JL, Ceres R, Rocon E, Levin S, Markovitz I, Saro B, Reynaerts D, Van Moorleghem W, Bueno L. Virtual Reality Training and EMG Control of the MANUS Hand Prosthesis. Robotica 2005;23(3):311. doi:10.1017/S026357470400133X.
  37. Resnik L, Etter K, Klinger SL, Kambe C. Using Virtual Reality Environment to Facilitate Training with Advanced Upper-Limb Prosthesis. Journal of Rehabilitation Research & Development 2011;48(6):707-718.
  38. Soares AB, Veiga ACP, Andrade AO, Pereira AC, Barbar JS. Functional Languages in Signal Processing Applied to Prosthetic Limb Control. Systems Analysis Modelling Simulation 2002;42:1377.
  39. Lamounier Jr EA, Lopes K, Cardoso A, Soares AB. Using Augmented Reality Techniques to Simulate Myoelectric Upper Limb Prostheses. Journal of Bioengineering & Biomedical Science 2012;S1:010. doi:10.4172/2155-9538.S1-010.
