
Recognition of Finger Motions for Myoelectric Prosthetic Hand via Surface EMG

Written By

Chiharu Ishii

Submitted: 16 November 2010 Published: 29 August 2011

DOI: 10.5772/21812

From the Edited Volume

Advances in Mechatronics

Edited by Horacio Martinez-Alfaro


1. Introduction

Recently, myoelectric prosthetic arms/hands, in which the arm/hand gesture is distinguished by identification of the surface electromyogram (SEMG) and the artificial arm/hand is controlled based on the result of that identification, have been studied (Weir, 2003). The SEMG has attracted the attention of researchers as an interface signal for electrically actuated arms for many years, and many studies on the identification of the SEMG signal have been carried out. Nowadays, the SEMG can be regarded as the most powerful source of control signals for developing myoelectric prosthetic arms/hands.

From the 1970s to the 1980s, elementary pattern recognition techniques, such as linear discriminant analysis, were used for the identification of SEMG signals in (Graupe et al., 1978) and (Lee et al., 1984). In the 1990s, learning of a nonlinear map between the SEMG pattern and the arm/hand gesture using a neural network was investigated in (Hudgins et al., 1993). Four kinds of forearm motions were distinguished by combining a Hopfield-type neural network and a back propagation neural network in (Kelly et al., 1990).

The amplitude and the frequency band are typical pieces of information extracted from the SEMG signal that can be used for the identification of arm/hand gesture. (Ito et al., 1992) estimated muscle tension from the EMG signal and attempted to control a forearm-type myoelectric prosthetic arm driven by ultrasonic motors. (Farry et al., 1996) proposed a technique for teleoperating a robot hand through identification of the frequency spectrum pattern of the SEMG signal.

At present, however, most myoelectric prosthetic arms/hands can only realize a few limited motions, such as palmar grasp, flexion-extension of the wrist, and inward-outward rotation of the wrist. To the best of our knowledge, myoelectric prosthetic hands which can distinguish the motions of plural fingers and can actuate each finger independently have not been developed yet, since recognition of independent motions of plural fingers through the SEMG is fairly difficult.

A present cutting-edge practical myoelectric prosthetic hand is probably the "i-LIMB Hand" produced by Touch Bionics Inc. However, myoelectric prosthetic hands which imitate the human hand, such as the "i-LIMB Hand", are quite expensive, since they require accurate measurement of the SEMG signal and use many actuators to drive the finger joints. Therefore, improvement of the operability of myoelectric prosthetic arms/hands and simplification of the structure of the artificial arms/hands to lower their price are in demand.

The purpose of this study is to develop a myoelectric prosthetic hand which can actuate each finger independently and can realize fundamental motions, such as holding and grasping, required in daily life. In order to keep the price low, an underactuated robotic hand structure which realizes flexion and extension of the fingers by a tendon mechanism is introduced. In addition, a "fit grasp mechanism", in which the fingers fit the shape of an object when grasping it, is proposed. The "fit grasp mechanism" makes it possible for the robotic hand to grasp a small object, a cylindrical object, a distorted object, etc. In this study, a robotic hand with a thumb and an index finger was designed and built as a prototype.

As for the identification of the independent motion of each finger, an identifier which distinguishes four finger motions, namely flexion and extension of the thumb and the index finger in the respective metacarpophalangeal (MP) joints, is constructed using neural networks. Four patterns of neural-network-based identifiers are proposed, and the recognition rates of the identifiers are compared through simulations and experiments. The online control experiment of the built robot hand was conducted using the identifier which showed the best recognition rate.


2. Robot hand

In this section, the details of the robot hand for the myoelectric prosthetic hand are explained.

An overview of the built underactuated robot hand with two fingers, namely the thumb and the index finger, is shown in Fig.1.

Figure 1.

Overview of robot hand.

2.1. Specifications

The primary specifications of the robot hand are shown as follows.

  1. Entire hand: 500 mm total length, 50 mm thickness

  2. Palm: 100 mm length, 110 mm width, 20 mm thickness

  3. Finger: 100 mm length, 15 mm width, 10 mm thickness

  4. Pinching force when MP joint is driven: 3 N

2.2. Mechanism of finger

As shown in Fig.2, imitating the human skeletal structure, the robot hand has a finger mechanism which consists of three joints, namely the distal interphalangeal joint (DIP: the first joint), the proximal interphalangeal joint (PIP: the second joint), and the metacarpophalangeal joint (MP: the third joint). The fingers are driven by a wire actuation system resembling the human tendon mechanism. When the wire connected with each joint is pulled by the driving force of the actuator, the finger bends. Conversely, when the tension of the wire is loosened, the finger extends due to the elastic force of the rubber. This makes it possible to omit the actuators that would otherwise be needed to extend the fingers. The built robot hand can realize fundamental operations required in daily life, such as holding and grasping.

Figure 2.

Mechanism of finger.

2.3. Fit grasp mechanism

In general, when a human holds an object, the fingers flexibly fit the shape of the object so that the object is wrapped in the hand. We call this motion the "fit grasp motion". As shown in Fig.3, each finger of the robot hand has two kinds of wires, which produce the interlocked motion of the DIP and PIP joints and the motion of the MP joint, respectively. Therefore, the interlocked bending of the DIP and PIP joints and the bending of the MP joint can be performed independently.

Figure 3.

Arrangement of wires.

In addition, as shown in Fig.3, a ring is attached to the wire between the DIP joint and the PIP joint, and the interlocked motion of the DIP and PIP joints is achieved by pulling the ring with another wire connected to it. This mechanism makes it possible to realize the "fit grasp motion". We call this mechanism the "fit grasp mechanism". Details of the "fit grasp motion" are illustrated in Fig.4.

Figure 4.

Bending motion by fit grasp mechanism.

In the case where there is no object to hold, when the wire is pulled by the actuator, the DIP and PIP joints bend at almost the same angle (Fig.4, upper). On the other hand, in the case where there is an object to hold, when the object contacts the finger, only one side of the wire is pulled, since the wire between the DIP joint and the PIP joint can slide inside the ring. As a result, the DIP joint can bend in accordance with the shape of the object (Fig.4, lower). Thus, the "fit grasp motion" is achieved. The "fit grasp mechanism" makes it possible for the robotic hand to grasp a small object, a cylindrical object, a distorted object, etc.


3. Measurement and signal processing of SEMG

In this section, measurement and signal processing of the SEMG are described.

3.1. Measurement positions of SEMG

The built robot hand for the myoelectric prosthetic hand has a thumb and an index finger, which are operated independently. Various motions of each finger can be considered; however, in this study, flexion and extension of the thumb and the index finger in the MP joint are focused on. Namely, flexion and extension of the interlocked DIP and PIP joints are not considered here. Inward rotation and outward rotation of each finger are also not taken into consideration.

The measurement positions of the SEMG are shown in Fig.5. They are the following three positions: the vicinity of the flexor carpi radialis / flexor digitorum superficialis (ch1), the vicinity of the flexor digitorum profundus (ch2), and the vicinity of the extensor digitorum (ch3). The former two muscles are used for flexion of each finger, and the latter muscle is used for extension of each finger.

Figure 5.

Measurement positions of SEMG.

3.2. Signal processing

One finger motion is performed in approximately 0.5 seconds, and the SEMG signal is measured at a sampling frequency of 1 kHz. A Fast Fourier Transform (FFT) is applied to the measured SEMG signals, and spectral analysis is conducted. The number of samples for the FFT was set to 256. When performing the FFT, the Hamming window function was applied to the signals.

However, the influence of the alternating current (AC) power source, which is regarded as an external noise, appears in the amplitude spectrum of the FFT-processed SEMG. This AC power source noise appears at odd multiples of the fundamental frequency. Since the area where this experiment was conducted is East Japan, where the mains frequency is 50 Hz, the influence of the AC power source noise appears at 50 Hz and 150 Hz, as shown in Fig.6.

Figure 6.

Spectrum of SEMG signal.

Since the influence of the AC power source noise is considered to be smallest at 100 Hz, the amplitude value at 100 Hz is used for recognition of the finger motions.
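
As a minimal illustration of this processing chain, the following Python sketch (not the chapter's implementation; the function names and the use of NumPy are assumptions) computes the 100 Hz amplitude feature for the three electrodes:

```python
import numpy as np

FS = 1000          # sampling frequency [Hz] (Section 3.2)
N_FFT = 256        # number of samples per FFT
TARGET_HZ = 100.0  # frequency least affected by the 50/150 Hz mains noise

def amplitude_near_100hz(window):
    """Return the FFT amplitude of the bin nearest 100 Hz for one window.

    `window` is a 1-D array of N_FFT raw SEMG samples from one electrode.
    """
    w = window * np.hamming(N_FFT)              # Hamming window before FFT
    spectrum = np.abs(np.fft.rfft(w))           # one-sided amplitude spectrum
    freqs = np.fft.rfftfreq(N_FFT, d=1.0 / FS)  # bin frequencies in Hz
    return spectrum[np.argmin(np.abs(freqs - TARGET_HZ))]

def feature_vector(ch1, ch2, ch3):
    """Three-element feature: the 100 Hz amplitude of each electrode."""
    return np.array([amplitude_near_100hz(ch) for ch in (ch1, ch2, ch3)])
```

Note that with a 256-point FFT at 1 kHz the bin spacing is about 3.9 Hz, so the bin nearest 100 Hz is used rather than exactly 100 Hz.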

A three-dimensional graph of the amplitude values at 100 Hz for each motion in the MP joint is shown in Fig.7, in which each measurement position, namely each electrode, is taken as a coordinate axis. In addition, the distribution in Fig.7 was divided into the distributions for the thumb and the index finger, which are shown in Fig.8 and Fig.9, respectively.

Figure 7.

Distribution of amplitude values at 100 Hz.

Figure 8.

Distribution of amplitude values at 100 Hz (thumb).

Figure 9.

Distribution of amplitude values at 100 Hz (index finger).

The amplitude values for flexion of the index finger are distributed over the whole space. Hence, it is anticipated that it is hard to distinguish flexion of the index finger from extension of the index finger. In the case of the thumb, however, the distributions for flexion and extension appear separable. Therefore, these values are used for recognition of the finger motions.


4. Recognition of finger motions

In this section, the identification methods for the finger motions using neural networks are explained.

4.1. Recognition by one neural network

The recognition of the finger motions is performed via the SEMG signals using neural networks. First of all, an identifier which distinguishes the four finger motions, namely 1) flexion of the thumb in the MP joint, 2) extension of the thumb in the MP joint, 3) flexion of the index finger in the MP joint, and 4) extension of the index finger in the MP joint, by only one neural network is constructed.

The input signals to the neural network are the set of amplitude values at 100 Hz obtained, through the signal processing explained in Section 3.2, from the SEMG signal measured at each electrode. The numerical values 1 to 4, i.e. 1 for flexion of the thumb, 2 for extension of the thumb, 3 for flexion of the index finger and 4 for extension of the index finger, are assigned as the teacher signals for the respective motions.

As for the structure of the neural network, a feedforward neural network was adopted. The network has one input layer, one output layer, and two hidden layers, each hidden layer consisting of three neurons. Twenty sets of pre-measured input signals for each finger motion were used for training the neural network. The error back propagation algorithm was used as the learning method. The training of the feedforward neural network was executed using the Neural Network Toolbox in MATLAB. As the termination condition, training was ended after 10,000 iterations. Hereafter, this identification method, namely recognition of the four finger motions by one neural network, is referred to as identifier (x).
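
The chapter's training was done with the MATLAB Neural Network Toolbox; as a rough, non-authoritative stand-in, the following Python sketch uses scikit-learn (the file names, activation, solver, and learning rate are assumptions for illustration only):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# 20 pre-measured feature sets per motion (80 rows of three-channel
# 100 Hz amplitudes); labels 1-4 are the teacher signals described above.
X_train = np.load("features_train.npy")   # hypothetical file names
y_train = np.load("labels_train.npy")

# Feedforward network with two hidden layers of three neurons each,
# trained by gradient descent for at most 10,000 iterations, standing in
# for the back propagation training done in the MATLAB toolbox.
net = MLPRegressor(hidden_layer_sizes=(3, 3), activation="tanh",
                   solver="sgd", learning_rate_init=0.01, max_iter=10000)
net.fit(X_train, y_train)

# Recognition: round the scalar output to the nearest motion code 1-4.
codes = np.clip(np.rint(net.predict(X_train)), 1, 4).astype(int)
```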

After training the neural network, a simulation was carried out using 30 sets of pre-measured input signals for each finger motion, which differ from the input signals used for training, and the recognition rate was examined. The results are shown in Table 1.

Motion               | Flexion of thumb | Extension of thumb | Flexion of index finger | Extension of index finger | Average
Recognition rate [%] | 43.3             | 0.0                | 36.6                    | 3.3                       | 20.8

Table 1.

Simulation results with identifier (x).

From Table 1, the successful recognition rate was only 20.8% on average when using identifier (x).

4.2. Improvement of identification method

An improvement of the recognition rate can be expected by modifying the identification method. By reducing the number of choices each neural network must distinguish among, and by combining two or more neural networks in series or in parallel, a much higher recognition rate can be obtained for each finger motion. In each neural network the choice is binary, namely the numerical values 0 and 1 are given as the teacher signals. The following three patterns of identifier, shown in Fig.10, are considered.

Figure 10.

Improved identification methods.

In identifier (a), the finger motion is distinguished by recognizing one motion at a time with one neural network each, in descending order of the recognition rate for each finger motion; a sketch of this priority chain is given below. The order of recognition was determined from simulation results as flexion of the index finger, extension of the thumb, and extension of the index finger. First, N.N.-1 identifies whether the finger motion is flexion of the index finger. N.N.-1 is trained to output 1 for flexion of the index finger and 0 for the other finger motions. Therefore, N.N.-1 outputs 1 in the case where the finger motion is identified as flexion of the index finger. If the output of N.N.-1 is 0, N.N.-2 likewise identifies whether the finger motion is extension of the thumb. Again, if the output of N.N.-2 is 0, N.N.-3 identifies whether the finger motion is extension of the index finger. Finally, in the case where the finger motion was not identified as any of these three motions, it is recognized as flexion of the thumb. This identification method has the drawback that incorrectly identified finger motions are inevitably classified as flexion of the thumb.
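
A minimal sketch of identifier (a)'s priority chain might look as follows, assuming trained binary networks with a scikit-learn-style predict method and a 0.5 output threshold (the threshold is only stated later, in Section 5.2):

```python
def recognize_identifier_a(x, nn1, nn2, nn3):
    """Priority chain of identifier (a) (illustrative sketch).

    Each one-vs-rest network outputs ~1 for its target motion; any
    motion rejected by all three falls through to flexion of the thumb.
    """
    if nn1.predict([x])[0] > 0.5:
        return "flexion of index finger"
    if nn2.predict([x])[0] > 0.5:
        return "extension of thumb"
    if nn3.predict([x])[0] > 0.5:
        return "extension of index finger"
    return "flexion of thumb"   # default class (the drawback noted above)
```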

In identifier (b), N.N.-1 is trained to output 1 for flexion of the thumb or the index finger and 0 for extension of the thumb or the index finger, while N.N.-2 and N.N.-3 are trained to output 1 for motion of the thumb and 0 for motion of the index finger. First, flexion or extension is distinguished; then, motion of the thumb or motion of the index finger is distinguished. Thus, the motion is finally assigned to one of the four finger motions.

In identifier (c), N.N.-1 is trained to output 1 for motion of the thumb and 0 for motion of the index finger, while N.N.-2 and N.N.-3 are trained to output 1 for flexion and 0 for extension. First, motion of the thumb or motion of the index finger is distinguished; then, flexion or extension is distinguished. Thus, the motion is finally assigned to one of the four finger motions.

In identifiers (b) and (c), the first distinction is important, since if the first distinction is incorrect, the subsequent distinction becomes meaningless. Therefore, a high recognition rate at the first distinction is required.

The structure of each neural network and the learning method are the same as those of identifier (x). Namely, in each identifier, the error back propagation algorithm was used as the learning method for the respective feedforward neural networks.
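
Identifier (b)'s two-stage cascade can be sketched in the same illustrative style (again an assumption-laden sketch, not the chapter's code):

```python
def recognize_identifier_b(x, nn1, nn2, nn3):
    """Decision cascade of identifier (b) (illustrative sketch).

    `x` is one three-channel 100 Hz feature vector. The trained binary
    networks' outputs in [0, 1] are thresholded at 0.5 (Section 5.2).
    """
    if nn1.predict([x])[0] > 0.5:            # 1st distinction: flexion?
        if nn2.predict([x])[0] > 0.5:        # 2nd distinction: thumb?
            return "flexion of thumb"
        return "flexion of index finger"
    if nn3.predict([x])[0] > 0.5:            # extension side: thumb?
        return "extension of thumb"
    return "extension of index finger"
```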

4.3. Simulation with improved identification method

Simulations for distinction of the finger motions were carried out for each identifier, and the recognition rates were examined. For each identifier, the simulation used 30 sets of pre-measured input signals for each finger motion, which differ from the input signals used for training. The result using identifier (a) is shown in Table 2.

Motion               | Flexion of thumb | Extension of thumb | Flexion of index finger | Extension of index finger | Average
Recognition rate [%] | 66.7             | 30.0               | 90.0                    | 43.3                      | 57.5

Table 2.

Simulation results with identifier (a).

Table 2 shows that the recognition rate was improved compared with identifier (x). However, since most of the incorrectly identified motions are classified as flexion of the thumb, further improvement is required.

Before simulating the entire recognition rate of identifier (b), the recognition rate of each neural network in identifier (b) was examined. In the simulation, the 30 sets of input signals for each finger motion were used for N.N.-1. Thirty sets of input signals each for flexion of the thumb and flexion of the index finger were used for N.N.-2, and thirty sets each for extension of the thumb and extension of the index finger were used for N.N.-3. The results are shown in Table 3.

N.N.-1               | Flexion            | Extension                 | Average
Recognition rate [%] | 83.3               | 76.7                      | 80.0
N.N.-2               | Flexion of thumb   | Flexion of index finger   | Average
Recognition rate [%] | 93.3               | 86.7                      | 90.0
N.N.-3               | Extension of thumb | Extension of index finger | Average
Recognition rate [%] | 53.3               | 73.3                      | 63.3

Table 3.

Partial recognition rate for each neural network in identifier (b).

From Table 3, a recognition rate of 80% on average was obtained at the first distinction. The entire recognition rate using identifier (b) is shown in Table 4.

Motion               | Flexion of thumb | Extension of thumb | Flexion of index finger | Extension of index finger | Average
Recognition rate [%] | 66.7             | 33.3               | 83.3                    | 63.3                      | 61.65

Table 4.

Simulation results with identifier (b).

Table 4 shows that the entire recognition rate was improved, as with identifier (a).

Likewise, before simulating the entire recognition rate of identifier (c), the recognition rate of each neural network in identifier (c) was examined. In the simulation, the 30 sets of input signals for each finger motion were used for N.N.-1. Thirty sets of input signals each for flexion and extension of the thumb were used for N.N.-2, and thirty sets each for flexion and extension of the index finger were used for N.N.-3. The results are shown in Table 5.

N.N.-1               | Thumb                   | Index finger              | Average
Recognition rate [%] | 80.0                    | 55.0                      | 67.5
N.N.-2               | Flexion of thumb        | Extension of thumb        | Average
Recognition rate [%] | 80.0                    | 100                       | 90.0
N.N.-3               | Flexion of index finger | Extension of index finger | Average
Recognition rate [%] | 83.3                    | 90.0                      | 86.7

Table 5.

Partial recognition rate for each neural network in identifier (c).

From Table 5, the recognition rate at the first distinction was only 67.5% on average, which is inferior to the case of identifier (b). The entire recognition rate using identifier (c) is shown in Table 6.

Motion               | Flexion of thumb | Extension of thumb | Flexion of index finger | Extension of index finger | Average
Recognition rate [%] | 70.0             | 70.0               | 76.7                    | 13.3                      | 57.5

Table 6.

Simulation results with identifier (c).

From Table 6, an entire recognition rate of 57.5% on average was obtained, which is almost the same level as identifier (a).

From the above results, it turned out that the recognition rate was improved by all of identifiers (a), (b) and (c) compared with identifier (x). In addition, from the recognition rates of N.N.-1 in Table 3 and of N.N.-2 and N.N.-3 in Table 5, it can be said that the distinction between flexion and extension is comparatively easy in comparison with the distinction between the thumb and the index finger.


5. Experiments

In this section, experimental results for online recognition and for online finger operation of the robot hand described in Section 2 are shown.

5.1. Experiment for online recognition

The experiment for online recognition of the finger motions was carried out using identifier (b), which showed the highest average recognition rate, and the recognition rate was examined. Each finger motion was performed for 1 second at intervals of about 10 seconds. The recognition rate over 60 trials of each finger motion is shown in Table 7.

Motion               | Flexion of thumb | Extension of thumb | Flexion of index finger | Extension of index finger | Average
Recognition rate [%] | 68.3             | 33.3               | 81.8                    | 60.0                      | 60.8

Table 7.

Experimental results for online recognition with identifier (b).

These results are quite similar to the simulation results shown in Table 4.

5.2. Experiment for online control of robot hand

The experiment for online finger operation of the robot hand was executed. In the experiment, the SEMG at each electrode is measured online. The start time of a finger motion is then detected as follows. Since the SEMG of ch1 has the least noise and the best response among ch1, ch2 and ch3, the SEMG of ch1 is rectified, and when the magnitude of the rectified SEMG exceeds a specified threshold, the finger motion is regarded as having begun. Synchronizing with the start of the finger motion, online recognition of the finger motion using identifier (b) is carried out. In the experiment, each finger motion is performed for 1 second at intervals of about 5 seconds, in the order of flexion of the thumb, extension of the thumb, flexion of the index finger, and extension of the index finger. The fingers of the robot hand are controlled based on the recognition result of the identifier.
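
A minimal sketch of this onset detection, assuming the raw ch1 samples are available as a NumPy array in volts (the function name and block-based processing are assumptions), might look as follows:

```python
import numpy as np

THRESHOLD = 9e-3   # 9 mV threshold on the rectified ch1 SEMG (Section 5.2)

def motion_started(ch1_block):
    """Judge the start of a finger motion from a block of raw ch1 samples.

    The ch1 signal is full-wave rectified and compared with the threshold;
    the motion is regarded as having begun if any sample exceeds it.
    """
    return bool(np.any(np.abs(ch1_block) > THRESHOLD))
```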

The SEMG measured at each electrode while a series of finger motions was performed is shown in Fig.11.

Figure 11.

SEMG in each electrode.

Figure 12.

Judgment of start of finger motion.

The SEMG of ch1 in Fig.11 was rectified, and the time when the magnitude of the rectified SEMG exceeded the threshold, set to 9 mV, was regarded as the start time of the finger motion. The rectified SEMG of ch1 is shown at the top of Fig.12, and the transition of the finger motion based on the judgment of its start is shown at the bottom of Fig.12, in which "output 1" indicates flexion and "output 0" indicates extension.

Synchronizing with the start time of the finger motion shown in Fig.12, the input signals to the neural networks in the identifier are updated. The recognition result of each neural network in the identifier is shown in Fig.13.

N.N.-1 distinguishes flexion from extension: an output larger than 0.5 is regarded as 1 and judged to be flexion, and an output smaller than 0.5 is regarded as 0 and judged to be extension. N.N.-2 and N.N.-3 distinguish motion of the thumb from motion of the index finger: an output larger than 0.5 is regarded as 1 and judged to be motion of the thumb, and an output smaller than 0.5 is regarded as 0 and judged to be motion of the index finger.

In Fig.13, the blue dashed line shows the output from the neural network, and the red solid line shows the recognition result, in which "Output 1" indicates flexion and "Output 0" indicates extension for N.N.-1, while "Output 1" indicates motion of the thumb and "Output 0" indicates motion of the index finger for N.N.-2 and N.N.-3.

Figure 13.

Recognition result for each neural network.

When Fig.12 and Fig.13 are compared, a slight time delay is seen between the judgment of the start of the finger motion and the recognition result from the neural network, since it takes some time to calculate the input signals to the neural network due to the FFT processing and other computations.

The entire recognition result of the finger motion, obtained by combining the recognition results of the three neural networks in Fig.13, is shown in Fig.14, in which "Output 4" indicates flexion of the thumb, "Output 3" extension of the thumb, "Output 2" flexion of the index finger, and "Output 1" extension of the index finger. The initial period from 0 to 5 seconds, during which no motion was performed, is judged as extension of the thumb owing to the training of the neural networks.

Figure 14.

Recognition result of finger motion.

Figure 15.

Finger operation of robot hand.

As shown in Fig.14, the recognition of the finger motions was performed correctly.

The operation of the robot hand is determined as follows. For both the thumb and the index finger, extension is the state in which the finger is straightened, and flexion is the state in which the finger is bent at 90 degrees. In the control of the thumb, the reference value for flexion was set to 0.95 rad, which is the rotation angle of the pulley required to bend the thumb at 90 degrees, while the reference value for extension was set to 0 rad in order to return the thumb to its original state. Likewise, in the control of the index finger, the reference value for flexion was set to 2.86 rad, which is the rotation angle of the pulley required to bend the index finger at 90 degrees, while the reference value for extension was set to 0 rad in order to return the index finger to its original state. PI controllers were adopted in the construction of the servo systems, and the controller gains were adjusted by trial and error through repeated experiments. The result of the operation of the robot hand is shown in Fig.15, in which the blue dashed line shows the reference value of the rotation angle of the pulley, and the red solid line shows the actual rotation angle of the pulley driven by the motor.

The result shown in Fig.15 is the case where the recognition results of the finger motion for both the thumb and the index finger were successful, and both the thumb and the index finger of the robot hand were able to be operated as intended.
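
Since the chapter does not give the controller gains, the following sketch only illustrates the form of such a discrete PI servo loop; the gains, the control period, and the function names are assumed for illustration:

```python
KP, KI, DT = 1.0, 0.5, 0.001   # assumed gains and a 1 ms control period

def make_pi_controller(kp=KP, ki=KI, dt=DT):
    """Return a discrete PI update function for one pulley-angle servo."""
    integral = 0.0
    def update(reference, measured):
        nonlocal integral
        error = reference - measured        # e.g. reference = 0.95 rad
        integral += error * dt              # accumulate the integral term
        return kp * error + ki * integral   # motor command
    return update

thumb_pi = make_pi_controller()
command = thumb_pi(0.95, 0.0)               # one control step toward flexion
```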


6. Conclusion

In this chapter, a robot hand for a myoelectric prosthetic hand, which has a thumb and an index finger and can realize the "fit grasp motion", was proposed and built. In order to control each finger of the developed myoelectric prosthetic hand independently, recognition of four finger motions, namely flexion and extension of the thumb and of the index finger in the MP joint, was performed based on the SEMG using neural networks. First, recognition of these four finger motions by one neural network was executed; the successful recognition rate was only 20.8% on average. In order to improve the recognition rate, three improved identification methods were proposed. Simulation results showed a successful recognition rate of at least 57.5% on average for each improved identification method. The experiment for online recognition of the finger motions was carried out using the identification method with the highest recognition rate, and recognition results similar to those of the simulation were obtained. The experiment for online finger operation of the robot hand was also executed. In this experiment, the fingers of the robot hand were controlled online based on the recognition result of the identifier via the SEMG, and both the thumb and the index finger of the robot hand were able to be operated as intended.


Acknowledgments

The author thanks Dr. T. Nakakuki for his valuable advice on this research, and Mr. A. Harada for his assistance in the experimental work.

References

  1. Farry, K. A., Walker, I. D. & Baraniuk, R. G. (1996). Myoelectric Teleoperation of a Complex Robotic Hand, IEEE Transactions on Robotics and Automation, Vol. 12, No. 5, pp. 775-787.
  2. Graupe, D., Magnussen, J. & Beex, A. A. M. (1978). A Microprocessor System for Multifunctional Control of Upper Limb Prostheses via Myoelectric Signal Identification, IEEE Transactions on Automatic Control, Vol. 23, No. 4, pp. 538-544.
  3. Hudgins, B., Parker, P. A. & Scott, R. N. (1993). A new strategy for multifunction myoelectric control, IEEE Transactions on Biomedical Engineering, Vol. 40, No. 1, pp. 82-94.
  4. Ito, K., Tsuji, T., Kato, A. & Ito, M. (1992). An EMG Controlled Prosthetic Forearm in Three Degrees of Freedom Using Ultrasonic Motors, Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Vol. 14, pp. 1487-1488.
  5. Kelly, M. F., Parker, P. A. & Scott, R. N. (1990). The Application of Neural Networks to Myoelectric Signal Analysis: A preliminary study, IEEE Transactions on Biomedical Engineering, Vol. 37, No. 3, pp. 221-230.
  6. Lee, S. & Saridis, G. N. (1984). The control of a prosthetic arm by EMG pattern recognition, IEEE Transactions on Automatic Control, Vol. 29, No. 4, pp. 290-302.
  7. Weir, R. (2003). Design of artificial arms and hands for prosthetic applications, In: Standard Handbook of Biomedical Engineering & Design, M. Kutz (Ed.), Chapter 32, McGraw-Hill, New York.
