Open access peer-reviewed chapter

Biomimetic Based EEG Learning for Robotics Complex Grasping and Dexterous Manipulation

Written By

Ebrahim A. Mattar, Hessa J. Al-Junaid and Hamad H. Al-Seddiqi

Submitted: July 3rd, 2017 Reviewed: November 14th, 2017 Published: December 30th, 2017

DOI: 10.5772/intechopen.72455



There have been tremendous efforts to understand the biological nature of human grasping, so that it can be learned and transferred to prosthesis-robotics and dexterous grasping applications. Several biomimetic methods and techniques have been adopted and applied to analytically comprehend the ways humans perform grasping, in order to duplicate human knowledge. A major topic for further study is decoding the EEG brainwaves that result while motorizing the fingers and moving parts. Accomplishing this involves a number of phases, including recording, pre-processing, filtration, and understanding of the waves. Two phases, however, have received substantial research attention: classification and decoding of such massive and complex brainwaves, as they are two important steps towards understanding patterns during grasping. In this respect, the fundamental objective of this research is to demonstrate how to employ advanced pattern recognition methods, such as fuzzy c-means clustering, to understand the resulting EEG brainwaves, so as to control a prosthesis or robotic hand while relying on sets of detected EEG brainwaves. There are a number of decoding and classification methods and techniques; here we look into fuzzy-based clustering blended with principal component analysis (PCA) to support the decoding mechanism. EEG brainwaves recorded during grasping and manipulation have been used for this analysis, involving the movement of almost five fingers during a defined grasping task. The study found that it is not a straightforward task to decode all human finger motions, due to the complexity of grasping tasks. However, the adopted analysis was able to classify and identify the different narrowly performed and related fundamental events during a simple grasping task.


  • biomimetic
  • electroencephalography
  • PCA
  • fuzzy clustering
  • dexterous grasping
  • prosthesis
  • robotic hand
  • robotics control

1. Introduction

1.1. Biomimetic engineering for EEG applications

The human brain, like other biological brains, is the most complicated part of any intelligent life form; it controls almost every part of the human body. For the brain to control biological organs, there must be communication between them. After years of research, it was established that brains communicate with organs via massive sets of electrical signals that are periodically sent to various body parts. These electrical signals can be detected through electroencephalography (EEG). These EEG waves are sent from neurons in the brain to those in the spinal cord, and end in nerve endings in the organs. Neurons are connected together, and to nerves, using synapses. This is the basic structure of the nervous system, which is in charge of communications between the brain and the rest of the body. Brain signals are basically electrical currents; however, connecting them to an artificial prosthesis or robotic hand has always been a challenge.

Given this fact, biomimetics is employed for the imitation of models, systems, and elements of nature, Vincent et al. [1], for the purpose of solving complex human-related problems. Lately, electroencephalography (EEG) has received considerable attention, due to a number of advantages when data mining and dealing with brain waves. There have been tremendous efforts to make use of biomimicry for understanding EEG waves [2, 3, 4, 5]. In this regard, Yuanfang et al. [6] stated that they used two novel classifiers: biomimetic pattern recognition (BPR) and sparse representation (SR). Each classifier labels the unlabelled data for the other classifier to extend the labeled set. The enlarged labeled set is used to generate the final classifier, BPR–SR, constructed by combining BPR with SR. EEG features extracted by common spatial patterns were used for classification. Perruchoud et al. [7] presented the concept of biomimetic rehabilitation engineering, with a more focused review in this sense; the importance of somatosensory feedback for brain–machine interfaces was also presented. Menniti et al. [8] stated that they used implanted electrodes to obtain acoustical information about the external environment, generated by a biomimetic system and converted into electrical signals. In Bullock et al. [9], data was collected by recording two experienced male machinists and two female housekeepers while working for at least 8 hours each. Classification was done using a modified Feix taxonomy adapted to work with video footage. The data showed that 80% of the time the housekeepers used only five grasps and the machinists 10 grasps. In [10], Bullock et al. found that for basic object handling, the medium wrap and lateral pinch are most suited; for precision and dexterous manipulation, three fingertips were needed, and the thumb-two finger, tripod, or lateral tripod can be used.

In [11], two machinists and two housekeepers were recorded during a typical day at work. A head-mounted camera was used to record their hands while doing their work. With each right-hand grasp, the data was tagged with grasp type and object properties. In Feix et al. [12], an analysis of human grasping and behavior detection was done; object characteristics and grasp types were reported. Mainly, 10,000 grasp instances were recorded, and a correlation between the property of the object and the grasp type was established. In reference to [13, 14], various analyses of EEG were conducted using time- and frequency-domain approaches. In [15], Feix et al. investigated task classification according to degrees of freedom, required force, and functional task type, since these are thought to be natural considerations that humans weigh when doing tasks. References [16, 17, 18] have presented various ways for EEG analysis: a versatile signal processing and analysis framework for EEG was developed, signals were decomposed into frequency sub-bands using DWT, and a set of statistical features was extracted from the sub-bands to represent the distribution of wavelet coefficients. Hazrati and Erfanian [19] presented an online EEG-based brain–computer interface for controlling hand grasp using an adaptive probabilistic ANN. Jianjun et al. [20] also implemented robotic control through a set of EEG waves; a group of 13 human subjects willingly modulated brain activity to control a robotic arm.

Furthermore, due to the complexity of robotic tasks, grasping by dexterous prosthesis-robotic hands can no longer be achieved through analytical approaches alone, as was done in the past using mathematically defined force computations, as found in Vinet et al. [21] and Cutkosky [22]. Instead, human grasping experience and knowledge have been studied extensively so they can be transferred to robots, Mattar et al. [23]. This is due to the advanced applications that robots and prostheses are performing nowadays.

1.2. Electroencephalography classification-decoding

In addition to the above introduction, human-like hands have recently been developed worldwide to achieve accurate grasping, Figure 1, [24, 25]. In parallel, EEG-based robotics control systems have also received fundamental attention from researchers worldwide, Vinet et al. [21]. However, there are a number of issues related to the interpretation of the massive waves. EEG signal classification using principal component analysis (PCA), independent component analysis (ICA), linear discriminant analysis (LDA) and SVM was investigated by Subasi and Gursoy [26]. In their work, they investigated using EEG in the diagnosis of epileptic seizures. First, data was acquired from an outside source. These data contained EEG recordings from multiple seizure patients who had been diagnosed, with the region where the seizure occurred identified, and other data from healthy people. The EEG signals were then transformed into sub-bands using the discrete wavelet transform (DWT); DWT was used because EEG signals are non-stationary and DWT is capable of handling such signals. Hence, using statistical techniques, features related to seizures were extracted. These techniques reduce the data by removing its redundancy and focusing on the features, which is done by obtaining a low-dimensional space of the features. The first technique, principal component analysis (PCA), reduces dimension and maximizes variance in the lower-dimensional data, making it easier to extract wave features. Independent component analysis (ICA) was also used; this method generates mutually independent components from random signals, highlighting the features in the signals. Finally, linear discriminant analysis (LDA) was used; this method combines the predictors to generate a discriminant score, resulting in discriminant scores normally distributed in each class. All the features from each technique were then submitted to a support vector machine (SVM). The SVM then generated the classification of having an epileptic seizure or not. The accuracy of feature extraction was based on the specificity and sensitivity derived from the confusion matrices. PCA had the lowest classification at 98.75%, ICA achieved the second-best result at 99.5%, and LDA achieved a perfect score of 100%. The research showed that the generalization performance of SVM was improved by dimensionality reduction, and the whole system was shown to be suitable for use in diagnostics.
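The DWT sub-band feature step described above can be sketched compactly. The following numpy-only sketch is illustrative, not the authors' implementation: it uses a hand-rolled Haar wavelet (the simplest DWT) to split a signal into detail sub-bands and extracts the kind of per-band statistics (mean absolute coefficient, standard deviation) that would then be fed to a classifier such as an SVM. The signal names and the number of levels are assumptions for demonstration.

```python
import numpy as np

def haar_dwt_subbands(signal, levels=4):
    """Decompose a 1-D signal into Haar-wavelet detail sub-bands plus a
    final approximation, returning simple statistics of each band
    (mean absolute coefficient, standard deviation)."""
    approx = np.asarray(signal, dtype=float)
    features = []
    for _ in range(levels):
        if len(approx) % 2:                               # pad to even length
            approx = np.append(approx, approx[-1])
        a = (approx[0::2] + approx[1::2]) / np.sqrt(2)    # approximation band
        d = (approx[0::2] - approx[1::2]) / np.sqrt(2)    # detail band
        features += [np.mean(np.abs(d)), np.std(d)]
        approx = a
    features += [np.mean(np.abs(approx)), np.std(approx)]
    return np.array(features)

# Toy demonstration: a slow and a fast sinusoidal "EEG" epoch
t = np.linspace(0, 1, 256, endpoint=False)
slow = np.sin(2 * np.pi * 4 * t)     # theta-like oscillation
fast = np.sin(2 * np.pi * 30 * t)    # beta-like oscillation
f_slow, f_fast = haar_dwt_subbands(slow), haar_dwt_subbands(fast)
# the finest detail band carries more energy for the faster signal
print(f_fast[0] > f_slow[0])
```

The resulting feature vectors (one per epoch, per channel) are what a PCA/ICA/LDA stage would then compress before classification.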

Figure 1.

Current development of prostheses, and robotic artificial limbs and hands. Left picture source, with permission, Dustin [24]. Right picture source, with permission, Touch Bionics [25].

Wang et al. [27] have also shown how feature extraction and recognition of epileptiform activity in EEG can be achieved by combining PCA with approximate entropy (ApEn). The paper investigated the use of EEG signals to identify the epileptiform activity associated with epilepsy. The problem with older techniques for finding epileptiform activity was their speed, mainly due to the huge amounts of data. First, the EEG data was recorded and then subjected to PCA to reduce the dimension of the data, and hence to identify patterns related to epileptiform activity based on the variances. DWT was applied to the PCA output to generate the sub-bands associated with signal types such as spike and sharp waves. Then, approximate entropy (ApEn) estimation was performed on the sub-bands. ApEn is used to determine whether a time series is random or deterministic, based on the value of the entropy (high or low, respectively). Finally, to classify the data as epileptic or not, the Neyman-Pearson criterion was applied: a threshold value was obtained from the Neyman-Pearson criterion and the ApEn value was compared with it. If the signal's ApEn was less than the threshold it was classed as epileptic, otherwise as normal. The paper showed results similar to other techniques; however, because of PCA, the detection speed was much faster owing to the reduction of data. Mahajan et al. [28] performed classification of EEG using PCA, ICA, and a neural network. This is another paper that focuses on the use of EEG signals to diagnose epilepsy, and the research was similar to Subasi and Gursoy [26]. EEG data was taken from an outside source. The EEG signals were first divided into sub-bands using DWT. Then, features related to epilepsy were extracted from the sub-bands using PCA and ICA, and sent to train a neural network used for classification. The neural network is made of two layers and five perceptrons, and training was done using a feedforward algorithm. The neural network was able to classify the EEG data and determine whether it was epileptic or not. The accuracy of feature extraction was based on the specificity and sensitivity: ICA achieved 96.75% accuracy and PCA achieved 93.63%. The work showed that ICA is better than PCA, which was also established by Subasi and Gursoy [26].
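The approximate entropy measure used by Wang et al. [27] to separate regular from random activity can be illustrated with a short sketch. This is a minimal numpy implementation of the standard ApEn definition (template length m, tolerance r), not the paper's code; the toy signals are assumptions for demonstration.

```python
import numpy as np

def approx_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D series: low values suggest
    regular (deterministic) dynamics, high values suggest randomness."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)          # common heuristic tolerance

    def phi(m):
        # all length-m template vectors
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of templates
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # fraction of templates within tolerance r (self-matches included)
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
periodic = np.sin(np.linspace(0, 8 * np.pi, 300))   # regular signal
noise = rng.standard_normal(300)                    # random signal
print(approx_entropy(periodic) < approx_entropy(noise))
```

In the pipeline described above, such ApEn values are computed per sub-band and then compared against a Neyman-Pearson threshold.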

Electroencephalogram signal classification using neural networks, with wavelet packet analysis, PCA and data normalization as pre-processors, was introduced by Aminian et al. [29]. In this work, the authors try to predict hand/arm movement using EEG data. The EEG signals were recorded while subjects were asked to grasp an object, point to an object, and extend an arm. This data was then subjected to wavelet packet analysis to reduce the number of features and concentrate on the important ones, done by choosing approximation coefficients as features. Each approximation and detail component is then decomposed further into its own approximation and detail levels. PCA is used to reduce the features, shrinking the amount of data while keeping the important parts. A uniform scaling scheme was applied to make sure that small but important features are not overlooked; this was achieved by normalizing the data to zero mean and unit standard deviation. Finally, a neural network was used for classification: a backpropagation-trained, feedforward multilayer network. The neural network generates as many outputs as there are classes (tasks); each output corresponds to one of the classes and gives the probability of the input belonging to that class. The network was unaffected by outliers because, due to its simplicity, it does not try to fit them. Overtraining was also not an issue because of the smooth input–output mapping function. This scheme achieved 100% accuracy in the classification of the tasks for all participants. Classification of direction-perception EEG based on PCA-SVM was also introduced by Wang and Wang [30]. In this work, EEG signals are used to identify movement based on vision. The EEG data was recorded while subjects were looking at a monitor showing a 3D environment containing roads, with movement in first-person view. The objective was to identify the direction of movement perceived by these individuals.

The discrete Fourier transform (DFT) was first applied to these signals to examine them in the frequency domain. This allowed the identification of some features related to the perception of movement, based on the power at certain frequencies. Then, PCA was used to reduce the dimension of the data and simplify the features. To classify the data, a support vector machine (SVM) was used. To overcome some error induced by the penalty factor in SVM, a new variable was introduced to add a degree-of-freedom margin in pattern recognition. The two-class classification achieved (50–91.25%) accuracy across all the subjects and the two-way movements (left-right, right-front…). This could be due to the feature extraction technique used, which is not able to distinguish between similar movements. The use of DWT may have been a better choice, due to its ability to handle non-stationary signals such as EEG.
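The DFT power-feature step just described can be sketched briefly. The band limits and toy epoch below are assumptions for illustration; the function simply averages periodogram power inside a frequency band, producing the kind of feature that would subsequently pass through PCA and an SVM.

```python
import numpy as np

def band_power(signal, fs, band):
    """Mean power of `signal` within the frequency `band` (lo, hi) in Hz,
    computed from the DFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    lo, hi = band
    mask = (freqs >= lo) & (freqs <= hi)
    return power[mask].mean()

fs = 256                                   # assumed sampling rate, Hz
t = np.arange(0, 2, 1 / fs)
# toy epoch: strong 10 Hz (alpha-range) plus weaker 22 Hz (beta-range) content
epoch = np.sin(2 * np.pi * 10 * t) + 0.3 * np.sin(2 * np.pi * 22 * t)
alpha = band_power(epoch, fs, (8, 13))
beta = band_power(epoch, fs, (14, 30))
print(alpha > beta)
```

A feature vector per channel would stack several such band powers before dimensionality reduction.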

Within this chapter, and given the above review, we shall introduce another view of EEG analysis for prosthesis-robotics grasping, based on blending PCA with fuzzy c-means clustering. The technique has the ability to detect grasping events.

1.3. Chapter contribution and organization

1.3.1. Chapter contribution

This chapter presents a unique classification technique for EEG brainwaves recorded during a defined grasping and lifting task. The study presents a mechanism for extracting focal features and intrinsic patterns from the EEG waves resulting from an experimentally conducted grasping task. Knowledge of these features will definitely help robotics grasping-based tasks. The novel work presented in this chapter is summarized in Figure 2, as follows:

Figure 2.

(Top): Overall system EEG-PCA detection of main components hierarchy. (Bottom): Layout of how the machine learning tools and techniques can be integrated with the detection process.

(i) A human grasping experimental task is initially conducted. The experiment was related to grasping an object to perform a defined grasping task. Further details about the source and the EEG dataset are explained at a later stage in this chapter. Due to the raw nature of the detected EEG waves, the brainwaves have been filtered and processed so that they are much easier to compute and analyze. (ii) Related events during the defined grasping task are identified for the different participants. In this respect, PCA and fuzzy clustering were applied to the EEG dataset to identify the most identifiable wave characterization. The identified features are then used for copying human behavioral motion to a robotic hand.

1.3.2. Chapter organization

To achieve the stated objectives, the chapter has been structured as follows. A set of experiments was conducted with advanced EEG detection probes placed over the subject's skull, as in Luciw et al. [31]. PCA has been used to detect data features, and fuzzy c-means clustering is used to learn the inherent characterization. This characterization is then further transferred to a robotic system.


2. Grasping events detection

2.1. EEG grasping clustering for events detection

In order to detect different events during grasping or manipulation tasks, it is required to classify the data via clustering of the resulting EEG waves, that is, to classify the different performances. Based on the above definitions, fuzzy clustering of the massive EEG dataset is precisely formulated as an optimization problem: minimizing the distances between the sampled EEG waves and the cluster centres to which they belong. Fuzzy clustering is thus stated as a minimization over the membership degrees u_ij and the cluster centres v_i:

$$ J(U,V) \;=\; \sum_{i=1}^{c} \sum_{j=1}^{n} u_{ij}\, d^{2}\!\left(x_j, v_i\right) \tag{1} $$
subject to the following membership constraints on the clusters:

$$ \sum_{i=1}^{c} u_{ij} = 1, \quad j = 1, \dots, n, \qquad u_{ij} \in [0,1] \tag{2} $$
One of the widely employed clustering methods based on Eq. (1) is the fuzzy c-means (FCM) algorithm. The objective function of the FCM algorithm is expressed in the form of,

$$ J_m(U,V) \;=\; \sum_{i=1}^{c} \sum_{j=1}^{n} u_{ij}^{\,m} \left\| x_j - v_i \right\|^{2} \tag{3} $$
In Eq. (3), m is known as the exponential weight, which influences the degree of fuzziness of the membership (partition) matrix. To solve this minimization problem, the objective function in Eq. (3) is differentiated with respect to v_i (for fixed u_ij, i = 1, …, c, j = 1, …, n) and with respect to u_ij (for fixed v_i, i = 1, …, c). Applying the constraint conditions thus gives:

$$ v_i \;=\; \frac{\sum_{j=1}^{n} u_{ij}^{\,m}\, x_j}{\sum_{j=1}^{n} u_{ij}^{\,m}} \tag{4} $$

$$ u_{ij} \;=\; \left[ \sum_{k=1}^{c} \left( \frac{\left\| x_j - v_i \right\|}{\left\| x_j - v_k \right\|} \right)^{2/(m-1)} \right]^{-1} \tag{5} $$
The system described by Eqs. (4) and (5) cannot be solved analytically. However, the fuzzy c-means algorithm provides a computationally iterative approach that approximates the minimum of the objective function starting from an initial guess.
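The iterative procedure of Eqs. (4) and (5) can be sketched in a few lines. The following numpy implementation alternates the two update equations until the membership matrix stops changing; the toy two-dimensional "EEG feature" blobs are illustrative, not the chapter's dataset.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Iterative FCM: alternate updates of cluster centres v_i (Eq. 4)
    and membership matrix u_ij (Eq. 5) until convergence."""
    rng = np.random.default_rng(seed)
    n = len(X)
    U = rng.random((c, n))
    U /= U.sum(axis=0)                       # memberships sum to 1 per sample
    for _ in range(max_iter):
        Um = U ** m
        V = (Um @ X) / Um.sum(axis=1, keepdims=True)          # Eq. (4)
        d = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2)
        d = np.fmax(d, 1e-12)                # guard against division by zero
        inv = d ** (-2.0 / (m - 1.0))
        U_new = inv / inv.sum(axis=0)                          # Eq. (5)
        if np.max(np.abs(U_new - U)) < tol:
            U = U_new
            break
        U = U_new
    return V, U

# three well-separated toy "EEG feature" blobs
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc, 0.1, size=(30, 2)) for loc in (0.0, 2.0, 4.0)])
V, U = fuzzy_c_means(X, c=3)
print(U.shape, np.allclose(U.sum(axis=0), 1.0))
```

In the chapter's pipeline the rows of X would be (PCA-reduced) EEG feature vectors, and the rows of U reveal which grasping event each sample most belongs to.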

2.2. EEG grasping events: dimensionality reduction

2.2.1. Concept of Eigen EEG waves

Principal component analysis (PCA) aims to find the total variation in the set of EEG waves used, and helps to explain this variation with a smaller number of variables. PCA does this by computing a basis of the space represented by the EEG waves. The basis vectors computed by PCA lie in the directions of largest variance of the EEG waves. These basis vectors are computed by solving an eigenproblem, and as such they are eigenvectors.

In order to deal with the massive dataset resulting from the grasping, we also need to reduce the dimensionality of the data, as most of the resulting brainwaves look similar, with very minor differences. This is summarized as follows: find the mean of the signals, then compute the eigenvectors. First we need to find the mean of each of the N columns separately by:

$$ \bar{x}_n \;=\; \frac{1}{M} \sum_{m=1}^{M} x_{mn}, \qquad n = 1, 2, \dots, N \tag{6} $$
In Eq. (6), x̄_n is the mean of each of the N columns (n = 1, 2, …, N), and M is the number of rows (samples).

The next step is centring the massive EEG waves, since PCA requires the matrix to be centred by subtracting the mean from each column. This makes the mean of each column zero:

$$ \Phi_n \;=\; x_n - \bar{x}_n \tag{7} $$
where Φ_n is the n-th column of the matrix X after subtracting the mean from the original column. This moves the data close to the centre (origin) of the principal components. We then need to compute the covariance matrix of the resulting EEG waves, i.e. the variance between all the columns, to obtain the relationship between these columns in terms of variance (multi-dimensional variance). The variance is how much the data varies from its mean, and the covariance is used to find a relationship between two dimensions, for example the relationship between velocity and car crashes. The covariance matrix is therefore simply all combinations of the covariance between each pair of dimensions. The value of the covariance determines the relationship between the two dimensions: if it is positive, then when one dimension increases the other increases as well; if it is negative, the relationship is inversely proportional and when one increases the other decreases; finally, if it is zero, the two dimensions are independent of each other or have a nonlinear relationship. The magnitude of the covariance determines the amount of increase that will occur in the other dimension, with a maximum relationship of one to one. The covariance matrix:

$$ C \;=\; \frac{1}{M-1}\, \Phi^{T} \Phi \tag{8} $$
In Eq. (8), Φ^T is the transpose of the centred matrix Φ. Element-wise, this expands into:

$$ c_{pq} \;=\; \frac{1}{M-1} \sum_{m=1}^{M} \Phi_{mp}\, \Phi_{mq}, \qquad p, q = 1, \dots, N \tag{9} $$
Finally, the covariance matrix C is further expressed, following Subasi and Gursoy [26], as:

$$ C \;=\; \begin{bmatrix} \mathrm{cov}(x_1,x_1) & \cdots & \mathrm{cov}(x_1,x_N) \\ \vdots & \ddots & \vdots \\ \mathrm{cov}(x_N,x_1) & \cdots & \mathrm{cov}(x_N,x_N) \end{bmatrix} \tag{10} $$
The matrix C, which is square, is symmetric about the main diagonal because it results from multiplying a matrix by its transpose. The main diagonal of C contains the covariance between each dimension and itself. Therefore, when looking for relationships between the dimensions, we examine the off-diagonal elements and judge based on their values, as in Subasi and Gursoy [26].
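Equations (6)-(10) translate directly into code. A minimal numpy sketch, with a synthetic matrix standing in for the (samples x channels) EEG data:

```python
import numpy as np

# X: (samples x channels) matrix; each column is one EEG channel (illustrative)
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 4))
X[:, 1] = X[:, 0] + 0.1 * rng.standard_normal(500)   # two correlated channels

x_bar = X.mean(axis=0)                # Eq. (6): mean of each column
Phi = X - x_bar                       # Eq. (7): centred columns, zero mean
C = (Phi.T @ Phi) / (len(X) - 1)      # Eqs. (8)-(10): covariance matrix

print(np.allclose(Phi.mean(axis=0), 0))          # columns now have zero mean
print(np.allclose(C, C.T))                       # symmetric about the diagonal
print(np.allclose(C, np.cov(X, rowvar=False)))   # matches numpy's covariance
print(C[0, 1] > 0.5)                             # correlated channels: positive
```

The large off-diagonal entry C[0, 1] is exactly the kind of cross-channel relationship the text describes.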

Finally, we need to compute the eigenvalues and eigenvectors. In this respect, to understand patterns of the normalized data, we need to understand the covariance C of the data. This is possible by finding lines in the plot that explain the data pattern based on the covariance of the signals, which is achieved by computing the eigenvectors of the covariance matrix. We can find eigenvectors for the covariance matrix C because it is square, which is a requirement for eigenvector calculations, Subasi and Gursoy [26]. For an (n×n) matrix A, if we find a column vector X (n×1) that, when multiplied by A, gives the same vector X multiplied by a value λ, then λ is called an eigenvalue and X an eigenvector. Since the matrix A transforms and scales the vector X by an amount equal to λ, it is called a transformation matrix, as presented by:

$$ A X \;=\; \lambda X \tag{11} $$
There are (n) eigenvalues for an (n×n) transformation matrix. Since every eigenvector is scaled by an eigenvalue (λ), we have (n) eigenvectors as well. The eigenvalues (λ) are found by rewriting Eq. (11) with the identity matrix, giving Eq. (12):

$$ \left( A - \lambda I \right) X \;=\; 0 \tag{12} $$
In Eq. (12), I is the identity (unity) matrix, which does not alter the value of a matrix it is multiplied with. Setting the determinant |A − λI| to zero and solving for λ yields the eigenvalues. Substituting each (λ) into Eq. (12) and solving for (X) gives the eigenvector (X) for that λ.
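The eigen-decomposition step of Eqs. (11) and (12) then yields the principal components. A brief numpy sketch (synthetic data and variable names are illustrative) sorts the eigenvectors of the covariance matrix by decreasing eigenvalue and projects the centred data onto the dominant "eigen EEG wave":

```python
import numpy as np

# two strongly correlated "channels" driven by one latent source, plus noise
rng = np.random.default_rng(2)
latent = rng.standard_normal(1000)
X = np.column_stack([latent, 0.8 * latent, 0.1 * rng.standard_normal(1000)])
Xc = X - X.mean(axis=0)
C = np.cov(Xc, rowvar=False)

eigvals, eigvecs = np.linalg.eigh(C)        # solves A X = lambda X for symmetric C
order = np.argsort(eigvals)[::-1]           # sort by decreasing variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
print(explained[0] > 0.9)                   # one component dominates
scores = Xc @ eigvecs[:, :1]                # project onto the first component
print(scores.shape)
```

Keeping only the leading components is the dimensionality reduction used throughout the chapter: similar-looking EEG waves collapse onto a few shared directions.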


3. Time-frequency analysis of grasping EEG waves

3.1. Time-frequency relation

It is essential to investigate both the time-domain and frequency-domain relations among the detected EEG waves. These relations reveal where (in terms of frequency) and when (in terms of time) power densities take place during the course of thinking about grasping. EEGLAB, in addition to other tools, has been used to analyze the brain data in both the time and frequency domains. EEGLAB is a Matlab-based toolbox developed to analyze EEG waves in both domains, as found in Delorme and Makeig [32]. We shall further observe the results of using EEGLAB on different relations among the waves in both domains. The experiments were conducted by 11 participants; further analysis using EEGLAB is presented at later stages, and the grasping experiment is further described in Luciw et al. [31]. The entire dataset (the 11 individuals' grasping experiments) was passed into both a PCA algorithm and a fuzzy-based clustering algorithm. The PCA algorithm was used to reduce the dimensionality of this massive dataset, and fuzzy c-means clustering was used to identify a number of clusters, and thus similar dataset events. It is well known that humans perform similar actions when doing a similar grasping task. We shall identify (P1) as the first individual who performed the grasping task, whereas (P9), for example, is the ninth individual performing the experiment; similar naming is given to the other individuals.

3.1.1. Time-domain analysis of grasping waves

A) EXAMPLE, FIRST PARTICIPANT: First we loaded the dataset for the first person (P1), and then plotted all channels in the time domain. Signals are time-stamped with labels of the actions that happened in the experiment, as recorded by all of the sensors. These are just the main events, and they include the LED turning on and off, the hand starting to move, each finger touching the plates, the object lifting off the table and being replaced to its original position, whether the trial includes an expected or unexpected high or low weight, and finally the release of the fingers from the plates. All the possible labels are clearly shown in Table 1. In Figures 3 and 4, after the LED was turned on there was a minor increase in voltage due to the intent to move in response to the event (an event-related potential, ERP). When the hand started moving there was a minor increase in some channels and a decrease in the reference channels due to hand movement. Consequently, as the fingers touched the object, forces were applied; only minor changes happened until the lift-off, which caused a high-low voltage swing for the lift and then relaxation at the destination (0.5 sec) after lifting off. In anticipation of the LED turning on, there was an increase in voltage, and a sudden spike once the LED turned off due to eye blinking. Finally, after releasing the object, there was a drop in voltage while relaxing the fingers and returning to the original position.

Event no. | Label | Meaning
Event-01 | tTouch | Fingers touching
Event-02 | tStartLoadPhase | Starting to apply load
Event-03 | tReplace | Replacing the object
Event-04 | tRelease | Releasing fingers
Event-05 | tLiftOff | Lifting the object
Event-06 | tHandStart | Hand movement started
Event-07 | Unexp. light weight lift | Unexpected light weight
Event-08 | Unexp. heavy weight lift | Unexpected heavy weight
Event-09 | LEDON | LED turned on
Event-10 | LEDOFF | LED turned off
Event-11 | Expected weight lift | Expected weight

Table 1.

Grasping events labels, (Meaning for EEGLAB plot).
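Event labels such as those in Table 1 are typically used to cut the continuous recording into time-locked epochs before any ERP averaging or clustering. A hedged sketch of such an epoching helper (the function name, sampling rate, and window lengths are assumptions, not taken from the dataset):

```python
import numpy as np

def epochs_around_events(data, event_samples, fs, pre=0.2, post=0.5):
    """Cut fixed-length windows around event sample indices (e.g. tLiftOff)
    from a (samples x channels) recording. Windows that would run past
    either end of the recording are skipped. Returns an array shaped
    (n_events x window_samples x channels)."""
    a, b = int(pre * fs), int(post * fs)
    windows = [data[s - a:s + b] for s in event_samples
               if s - a >= 0 and s + b <= len(data)]
    return np.stack(windows)

fs = 500                                              # assumed sampling rate, Hz
data = np.random.default_rng(0).standard_normal((5000, 32))
events = [30, 1000, 2000, 4900]                       # two fall too close to edges
ep = epochs_around_events(data, events, fs)
print(ep.shape)                                       # (2, 350, 32)
```

Averaging such epochs across trials is what reveals the small voltage deflections described for the LED, touch, and lift-off events.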

Figure 3.

EEGLAB time-domain plot for (P1), all 32 channels (voltage (μV) vs. time (s)).

Figure 4.

EEGLAB time-domain plot for (P4), all 32 channels (voltage (μV) vs. time (s)).

B) EXAMPLE, SECOND PARTICIPANT: Second is the plot for the second person (called (P4) in the data) in a trial similar to the first person's. This looks almost identical to the first participant's, except that the blinking happened just before the LED turned off.

3.1.2. Frequency-domain analysis of grasping waves: power spectrum

With the time-domain analysis done, we now examine the spectral power changes and the corresponding areas of the brain that the electrodes cover. This will indicate any movement and its origin within the human head, as further shown in Figures 5 and 6.

Figure 5.

EEGLAB frequency-domain power spectrum analysis for (P1), all (32) channels. ICA for the resulting grasping waves.

Figure 6.

EEGLAB frequency-domain power spectrum for the grasping trials, for (P4), all (32) channels.

We should indicate that, while the data were recorded by a (32)-channel brainwave cap, this massive dataset is plotted in Figure 7. Figure 7 shows the time plot of only nine of the 32 channels used during the experiment. The PCA and clustering operations were performed over the entire dataset (11 individuals, nine trials, different objects, different weights). This is further shown in Figure 8.

Figure 7.

Plot of (P1) for (9) trials by the individual. Note: here the weight changes but the object type is the same (sandpaper). Full time, all channels, and (9) trials (only the weight changes).

Figure 8.

Three regions of clustering patterns for the second experimentation. Grasping EEG waves clustered into three main clusters; each cluster indicates the focus of where similar EEG patterns are centred. Above-left: time-domain signals during the course of grasping. Above-right: corresponding c-means clustering results; four to five main clusters have been identified across all trials. Bottom-left and bottom-right: PCA analysis of the resulting EEG waves during the grasping task. Creating principal components leads to observing similar waves, thus reducing the dimensionality of the EEG grasping data.

FIRST PARTICIPANT, SECOND PARTICIPANT: In reference to the frequency spectrum during the grasping tasks, looking at Figure 5 we can deduce that power changes occurred at (5.9 Hz) in the theta band, corresponding to the frontal part of the brain, which indicates relaxation as shown in previous studies. There were only minor changes at (9.8 Hz) in the alpha band. Finally, there were big power changes in (F8, FC6, and T8), and minor ones in their counterparts, at (22 Hz) in the beta band; this could indicate the blinking artifact we saw earlier. We can see similar patterns in Figure 6 for the second person, especially the blinking artifact at (22 Hz) in the beta band. However, there is a significant difference at (9.8 Hz), which could be due to finger or hand movement; this is investigated in the next section using PCA. For each real grasping experiment, identical and similar EEG patterns were found, detected by the locations of the cluster centres and the gathered fuzzy membership functions. In this context, the fuzzy memberships obtained through fuzzy c-means clustering indicate the inherent knowledge about how the grasping was conducted. This knowledge is further decoded to generate the most suitable patterns for motorizing finger motion, to be used by the robotic hand or the human-type prosthesis.

In reference to Table 2, the adopted dimensionality reduction via PCA, followed by the clustering, helped in identifying three main events related to this experiment, which was performed over 6 seconds. Detecting similar EEG events during hand grasping is not an easy task to achieve; such similar events indicate how the real human behavior was performed. Once this is done, it can be copied to a robotic system. Combining PCA with the clustering mechanism has definitely helped reveal the inherent and hidden features of how the different individuals perform grasping. PCA has shown the patterns to be similar, despite the shapes of the different waves.

No. | Task | Event-1 | Event-2 | Overlapped clusters
1 | Hand motion | Hand grasping | Hand grasping | Clusters 1, 3
2 | Fingers motion | Apply force | Apply force | Clusters 2, 3
3 | Hand motion | Finger move | Force releasing | Clusters 3, 2

Table 2.

Grasping experimentation, pattern mapping, and the detected fuzzy cluster associations.

The inherent features are almost the same, as seen via the eigenvalues of the individual experiments and the individual participants. In addition, fuzzy clustering has shown the overlapped shapes of the EEG waves within the same experiments; despite such overlaps, however, the fuzzy clusters are able to detect the different grasping events that can be translated to a robotic hand. Within this chapter, we have presented a mechanism through which to analyze EEG brainwave patterns for robotics and prosthesis use. The main theme discussed was based on using pattern recognition and clustering techniques for capturing the inherent pattern characterization during a phase of grasping. Capturing events and these characterizations is useful for transferring human knowledge (of the grasping tasks) to a robotic hand or prosthesis for complex grasping. In achieving that, a fuzzy clustering technique was able to identify similar events. The presented methodology was very useful in obtaining the intrinsic patterns, and the presented approach is also efficient for mapping useful human thoughts to robotics grasping tasks.


4. Conclusion

Electroencephalography and brain waves for robotics, human-computer, and brain-machine interfaces (BMI, BCI) are evolving very fast, owing to the rapid development of high-tech interfacing devices and computational methods. In this respect, the presented work is dedicated to a deep understanding of the electroencephalography (EEG) brainwaves that result during a typical grasp-and-lift human grasping task. During grasping, forces are applied dexterously by the fingertips, as observed through the resulting EEG waves. To mirror this in prosthesis-robotics and dexterous grasping applications, methods have to be developed to mine and capture features for optimal forces, movements, and accurate finger-joint displacements. The EEG brainwaves resulting during a grasp-and-lift task are very useful; however, these waves are related, correlated, complicated, and raw. Principal component analysis (PCA) of the EEG indicated an overlap of valuable neural behaviors from various locations over the human skull, indicating interrelated and coupled events for robotic grasping. PCA has also been used to unlock a few main features of the EEG waves during the grasp-and-lift task. The foremost grasping features are then used in creating accurate events for robotic dexterous grasping.



Acknowledgments

This research work would not have been possible without accurate, real grasping EEG brainwaves. The grasping dataset was used with permission through the published materials of Luciw et al. [31]. Given this fact, the authors would like to express their appreciation to Umeå University, and in particular to Luciw et al. [31], for supplying the grasping EEG dataset.


References

  1. Vincent V et al. Biomimetics: Its practice and theory. 22nd August 2006. DOI: 10.1098/rsif.2006.0127
  2. Andrew P, Harshavardhan A, José L. Contreras-Vidal. Decoding repetitive finger movements with brain activity acquired via noninvasive electroencephalography. Frontiers in Neuroengineering. 13th March 2014. DOI: 10.3389/fneng.2014.00003
  3. Zaepffel M, Trachel R, Bjørg Elisabeth K, Brochier T. Modulations of EEG beta power during planning and execution of grasping movements. PLoS ONE. March 2013;8(3):e60060
  4. Sam B, Schwartz R, Holmes D. An EEG-Based Brain Computer Interface for Rehabilitation and Restoration of Hand Control Following Stroke Using Ipsilateral Cortical Physiology. 2014 Report, Washington University
  5. Theodore B, Robert H, Dong S, Anushka G, Vasilis M, Deadwyler S. A cortical neural prosthesis for restoring and enhancing memory. Journal of Neural Engineering. August 2011;8(4):046017. DOI: 10.1088/1741-2560/8/4/046017
  6. Yuanfang R, Yan W, Yanbin G. A co-training algorithm for EEG classification with biomimetic pattern recognition and sparse representation. Journal of Neurocomputing. 2014;137:212-222
  7. Perruchoud D, Pisotta J, Carda S, Murray M, Ionta S. Biomimetic rehabilitation engineering: The importance of somatosensory feedback for brain–machine interfaces. Journal of Neural Engineering. 2016;13:041001
  8. Menniti D, Pullano S, Bianco M, Citraro R, Russo E, De Sarro G, Fiorillo A. Biomimetic sonar for electrical activation of the auditory pathway. Hindawi Journal of Sensors. 2017;2017. Article ID 2632178
  9. Bullock I, Zheng J, Rosa S, Guertler C, Dollar A. Grasp frequency and usage in daily household and machine shop tasks. IEEE Transactions on Haptics. 2013;6(3):296-308
  10. Bullock M, Feix T, Dollar M. Finding small, versatile sets of human grasps to span common objects. Robotics and Automation (ICRA), IEEE International Conference on, Karlsruhe, 6–10 May 2013. pp. 1068-1075
  11. Bullock M, Feix T, Dollar M. The Yale human grasping dataset: Grasp, object, and task data in household and machine shop environments. The International Journal of Robotics Research. 2014;34(3):251-255
  12. Feix T, Bullock L, Dollar A. Analysis of human grasping behavior: Object characteristics and grasp type. IEEE Transactions on Haptics. 2014;7(3):311-323
  13. Khorshidtalab A, Salami M, Hamedi M. Robust classification of motor imagery EEG signals using statistical time-domain features. Physiological Measurement. 2013;34:1563-1579
  14. Mao X, Mengfan L, Wei L, Niu L, Xian B, Zeng M, Chen G. Progress in EEG-based brain robot interaction systems. Hindawi Computational Intelligence and Neuroscience Journal. 2017;2017:1-25. Article ID 1742862
  15. Feix T, Bullock I, Dollar A. Analysis of human grasping behavior: Correlating tasks, objects and grasps. IEEE Transactions on Haptics. 2014;7(4):430-441
  16. Subasi A, Ismail Gursoy M. EEG signal classification using PCA, ICA, LDA and support vector machines. Journal of Expert Systems with Applications. 2010;37:8659-8666
  17. Al-Qazzaz N, Bin Mohd Ali S, Anom S, Islam A, Escudero J. Automatic artifact removal in EEG of normal and demented individuals using ICA–WT during working memory tasks. Journal of Sensors. 2017;17:1326. DOI: 10.3390/s17061326
  18. Yoshioka M, Zhu C, Imamura K, Wang F, Yu H, Duan F, Yan Y. Experimental design and signal selection for construction of a robot control system based on EEG signals. Robotics and Biomimetics. 2014;1:22
  19. Hazrati M, Erfanian A. An online EEG-based brain–computer interface for controlling hand grasp using an adaptive probabilistic neural network. Journal of Medical Engineering and Physics. 2010;32(7):730-739
  20. Jianjun M, Shuying Z, Angeliki B, Jaron O, Bryan B, Bin H. Noninvasive electroencephalogram based control of a robotic arm for reach and grasp tasks. Scientific Reports. 14th December 2016;6:38565. DOI: 10.1038/srep38565
  21. Vinet R, Lozac'h Y, Beaundry N, Drouin G. Design methodology for a multifunctional hand prosthesis. Journal of Rehabilitation Research and Development. 1995;32:316-324
  22. Cutkosky M. Robotic Grasping and Fine Manipulation. Boston: Kluwer Academic Publishers; 1985
  23. Mattar E, Al-Junaid H. Fuzzy c-means classification of electroencephalography (EEG) waves for robotic system time events and control. The 14th Pacific Rim International Conference on Artificial Intelligence (PRICAI 2016), Thailand. Vol. 1. ISBN 978-616-92700-1-0. pp. 22-26
  24. Dustin J. Tyler. Creating a prosthetic hand that can feel. IEEE Spectrum. April 2016
  25. Touch Bionics: A Story of Innovation and Growth Powered by Co-investment. Available from: [Accessed: 2017-11-10]
  26. Subasi A, Gursoy M. EEG signal classification using PCA, ICA, LDA and support vector machines. Expert Systems with Applications. 2010;37(12):8659-8666
  27. Wang C, Zou J, Zhang J, Wang M, Wang R. Feature extraction and recognition of epileptiform activity in EEG by combining PCA with ApEn. Cognitive Neurodynamics. 2010;4(3):233-240
  28. Mahajan K, Vargantwar M, Rajput S. Classification of EEG using PCA, ICA and neural network. International Journal of Engineering and Advanced Technology. 2011;1(1):80-83
  29. Aminian F. Electroencephalogram (EEG) signal classification using neural networks with wavelet packet analysis, principal component analysis and data normalization as preprocessors. In: Vrajitoru D, editor. Proceedings of the Twenty-First Midwest Artificial Intelligence and Cognitive Science Conference (MAICS 2010). South Bend, IN; 2010. pp. 55-62
  30. Jin J, Wang X, Wang B. Classification of direction perception EEG based on PCA-SVM. Natural Computation (ICNC 2007), 3rd International Conference on, Haikou, 2007. pp. 116-120
  31. Luciw M, Jarocka E, Edin B. Multi-channel EEG recordings during 3,936 grasp and lift trials with varying weight and friction. Scientific Data. 2014;1:140047. pp. 1-11. DOI: 10.1038/sdata.2014.47
  32. Delorme A, Makeig S. EEGLAB: An open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. Journal of Neuroscience Methods. 2004;134:9-21
