Open access peer-reviewed chapter

Entropy and the Emotional Brain: Overview of a Research Field

Written By

Beatriz García-Martínez, Antonio Fernández-Caballero and Arturo Martínez-Rodrigo

Reviewed: 11 May 2021 Published: 09 June 2021

DOI: 10.5772/intechopen.98342

From the Edited Volume

Brain-Computer Interface

Edited by Vahid Asadpour


Abstract

In recent years, there has been a notable increase in the number of studies assessing brain dynamics for the recognition of emotional states by means of nonlinear methodologies. More precisely, different entropy metrics have been applied to the analysis of electroencephalographic (EEG) recordings for the detection of emotions. In this sense, regularity-based entropy metrics, symbolic predictability-based entropy indices, and different multiscale and multilag variants of the aforementioned methods have been successfully tested in a series of studies on emotion recognition from EEG recordings. This chapter aims to unify all those contributions to this scientific area, summarizing the main findings recently achieved in this research field.

Keywords

  • Emotion recognition
  • Electroencephalographic recordings
  • Entropy metrics
  • Nonlinear analysis
  • Survey

1. Introduction

Emotions are essential in our daily lives, with an enormous repercussion on perception, cognition, learning and rational decision-making processes [1]. As a result, affective neuroscience has emerged with the purpose of studying the influence of emotions on areas like psychology, philosophy or neurobiology, among many others [1]. The emotional states defined in the literature range from a few basic emotions [2] to several complex emotions created as combinations of the basic ones [3]. These emotional states can be classified according to different models, with Russell's circumplex model being one of the most widely used [4]. This bidimensional model distributes all the existing emotional states according to two emotional parameters, namely valence and arousal. Valence represents the degree of pleasantness or unpleasantness produced by an emotional stimulus, whereas arousal measures the activation or deactivation that a stimulus provokes. The location of each emotional state in the circumplex model is determined by its level in both dimensions, as shown in Figure 1.

Figure 1.

Circumplex model of Russell for classification of emotions based on their level of valence and arousal.

Emotions also play a key role in human communication and interaction processes. Nevertheless, human-machine interfaces (HMIs) are still not able to identify human emotional states. In a digital society in which those systems are introduced daily in multiple ordinary scenarios, it becomes crucial to address this lack of emotional intelligence in HMIs. In this sense, the aim of affective computing is to endow those systems with the capability to automatically detect and interpret human emotions and decide which actions to execute accordingly, thus improving the interactions between people and machines [5, 6].

The detection of emotional states can be conducted by assessing bodily reactions to emotional stimuli, for which different physiological variables can be measured and analyzed. One of the most widely studied in recent years is electroencephalography (EEG), which represents the electrical activity generated in the brain by neural connections [7]. The selection of EEG recordings instead of other physiological signals is justified by the fact that the brain generates the first response to any stimulus, which is then spread to the peripheral systems through the central nervous system. In this sense, EEG signals represent the activity of the source of the emotional response, whereas the rest of physiological variables can be considered secondary effects of the brain's performance [8]. As a consequence, the number of works focused on the analysis of EEG time series for emotion detection has notably increased in recent years [9].

The evaluation of EEG recordings has traditionally been conducted from a linear perspective, especially in the frequency domain, studying features such as the spectral power or the asymmetry between the two brain hemispheres in different frequency bands [10]. However, brain activity is far from linear. On the contrary, neural processes show a highly heterogeneous and nonstationary behavior at both the cellular and global levels [11]. In this respect, the application of linear algorithms may not provide a complete description of the brain's behavior [12]. For this reason, nonlinear methodologies have been widely applied for discovering underlying information unrevealed by traditional linear techniques [13]. Indeed, nonlinear indices have already outperformed the results derived from the application of those linear algorithms for the evaluation of various mental processes, including the recognition of emotions [13].

Among the different nonlinear methodologies that can be found in the literature, entropy indices have been widely applied in the context of emotion recognition with EEG recordings [14]. Entropy represents the rate of information reported by a time series, describing the nonlinear characteristics of a nonstationary system [15]. Hence, entropy metrics become promising tools for the assessment of the chaotic dynamics of a nonstationary system such as the brain. Indeed, many studies in the literature have applied these nonlinear methodologies for the identification of emotional states from EEG recordings. The present manuscript summarizes the main findings of the last years in the scientific field of emotion recognition from EEG signals with entropy indices.


2. Entropy indices

Entropy was first defined in thermodynamics, referring to the distribution probability of molecules in a fluid system [15]. In information theory and signal analysis, this concept was adapted by Shannon, who defined entropy as a measure of the information provided by a time series, describing its complexity, irregularity or unpredictability [16]. With respect to EEG analysis, many entropy indices have been introduced and successfully applied to the study of various physical and mental disorders, like epilepsy [17], Alzheimer's disease [18], autism [19] or depression [20], among others. As a result of these valuable outcomes, entropy metrics have also been introduced in the research field of emotion recognition from EEG recordings [14]. The following subsections give a brief mathematical description of the entropy metrics most commonly applied for emotion detection.

2.1 Regularity-based entropy indices

The irregularity of a signal represents the rate of repetitiveness of its patterns, reaching higher values for non-repetitive and disordered time series, and lower values for sequences with highly recurrent patterns [21]. One of the most widely used regularity-based entropy metrics is the approximate entropy (ApEn), which evaluates the probability of having repetitive patterns and assigns a non-negative number to each sequence in terms of its repetitiveness, with lower values for more recurrent patterns [22]. Mathematically, ApEn is computed as

$$\mathrm{ApEn}(m, r) = C^m(r) - C^{m+1}(r), \tag{E1}$$

where $C^m(r)$ and $C^{m+1}(r)$ are the correlation integrals that represent the likelihood of two sequences matching for $m$ and for $m+1$ points, respectively, within the threshold $r$ [22]. Nevertheless, ApEn also counts the self-match of each sequence, which biases the final result. Therefore, the sample entropy (SampEn) was developed to address this issue [23]. SampEn eliminates the self-matching and makes results less dependent on the selected vector length $m$. If $B^m(r)$ is the probability that two patterns match for $m$ points, and $B^{m+1}(r)$ is the probability that two patterns match for $m+1$ points, both defined to exclude self-matches, then SampEn is calculated as

$$\mathrm{SampEn}(m, r) = -\ln\frac{B^{m+1}(r)}{B^m(r)}. \tag{E2}$$

Moreover, the quadratic sample entropy (QSampEn) emerged as an improvement of SampEn to make it insensitive to the value of the threshold $r$ chosen for its calculation [24]. This independence from the parameter $r$ is achieved by simply adding the term $\ln(2r)$ to the SampEn equation:

$$\mathrm{QSampEn}(m, r) = \mathrm{SampEn}(m, r, N) + \ln(2r). \tag{E3}$$
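To make the SampEn definition concrete, the following is a minimal pure-Python sketch (not the implementation used in the cited studies); the defaults $m=2$ and an absolute tolerance $r$ are illustrative assumptions, whereas in practice $r$ is usually set as a fraction of the signal's standard deviation.

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(B_{m+1}(r) / B_m(r)), self-matches excluded.

    Illustrative O(n^2) sketch; r is an absolute tolerance here.
    """
    n = len(x)

    def matches(length):
        # Count template pairs (i < j) whose subsequences of the given
        # length stay within tolerance r (Chebyshev distance).
        count = 0
        for i in range(n - length + 1):
            for j in range(i + 1, n - length + 1):
                if max(abs(x[i + k] - x[j + k]) for k in range(length)) <= r:
                    count += 1
        return count

    return -math.log(matches(m + 1) / matches(m))
```

For a strictly periodic series the ratio $B^{m+1}/B^m$ is close to 1 and SampEn approaches zero, whereas irregular series yield larger values.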

2.2 Predictability-based and symbolic entropy indices

The predictability of a nonstationary system is related to its stable and deterministic evolution in time. Most of the entropy metrics for predictability measurement are symbolic indices that convert the original signal into sequences of discrete symbols [25]. After this symbolization, the evaluation of the predictability of a time series can be carried out with multiple techniques. The most commonly used is the Shannon entropy (ShEn), which quantifies the predictability of a signal in terms of the probability distribution of its amplitudes [16]. The mathematical expression of ShEn is

$$\mathrm{ShEn}(m) = -\sum_{i=1}^{m} p(x_i)\ln p(x_i), \tag{E4}$$

where $p(x_i)$ is the probability of appearance of each symbolic sequence $x_i$ of length $m$.
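Estimating $p(x_i)$ as the relative frequency of each symbol, ShEn reduces to a few lines of code; this is a generic sketch, not the pipeline of any cited study:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """ShEn = -sum_i p(x_i) ln p(x_i), with p(x_i) estimated as the
    relative frequency of each symbol in the sequence."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * math.log(c / total) for c in counts.values())
```

A constant sequence yields zero entropy, and a uniform distribution over $m$ symbols yields the maximum value $\ln m$.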

The Rényi entropy (REn) is a generalization of ShEn that is also widely used for the quantification of underlying dynamics in symbolized signals [26]. Specifically, REn provides a better characterization of some rare and frequent ordinal sequences, and it is defined as

$$\mathrm{REn}(m, q) = \frac{1}{1-q}\ln\sum_{i=1}^{m} p(x_i)^q, \tag{E5}$$

where $q$ ($q \ge 0$, $q \ne 1$) is a bias parameter that enables a more accurate characterization of a nonlinear signal [26]. Indeed, ShEn is the particular case of REn in the limit $q \to 1$, which makes REn a more flexible index than ShEn [26].
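The relation between REn and ShEn can be checked numerically with a short sketch (an illustration of Eq. E5, assuming the symbol probabilities are already estimated):

```python
import math

def renyi_entropy(probs, q):
    """REn(q) = ln(sum_i p_i^q) / (1 - q), for q >= 0 and q != 1."""
    return math.log(sum(p ** q for p in probs)) / (1 - q)
```

For a uniform distribution over $m$ symbols, REn equals $\ln m$ for every $q$; for non-uniform distributions, evaluating REn at $q$ close to 1 recovers ShEn.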

The version of ShEn for continuous random variables, called differential entropy (DEn), has received growing interest in recent years [27]. This entropy index can be expressed as

$$\mathrm{DEn}(X) = -\int_{X} f(x)\log f(x)\,dx, \tag{E6}$$

where $X$ is a random signal and $f(x)$ is its probability density function. In the case of time series governed by the Gaussian distribution $N(\mu, \sigma^2)$, with mean $\mu$ and variance $\sigma^2$, DEn can be defined as

$$\mathrm{DEn}(X) = -\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi\sigma^2}}\,e^{-\frac{(x-\mu)^2}{2\sigma^2}} \log\!\left(\frac{1}{\sqrt{2\pi\sigma^2}}\,e^{-\frac{(x-\mu)^2}{2\sigma^2}}\right) dx = \frac{1}{2}\log\left(2\pi e\sigma^2\right). \tag{E7}$$

Since EEG signals approximately follow a Gaussian distribution after band-pass filtering, the DEn of each frequency sub-band previously obtained with a Fast Fourier transform can be computed according to the aforementioned closed-form equation [27].
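Under the Gaussianity assumption above, computing DEn for a band-filtered segment amounts to estimating the variance and plugging it into the closed form of Eq. E7 (natural logarithm assumed); this is a sketch, not the exact procedure of the cited works:

```python
import math

def differential_entropy_gaussian(variance):
    """Closed-form DEn of a Gaussian variable: 0.5 * ln(2*pi*e*sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * variance)

def differential_entropy_band(samples):
    """DEn estimate of a band-pass-filtered EEG segment, assuming the
    filtered samples are approximately Gaussian: estimate the variance
    and plug it into the closed form."""
    n = len(samples)
    mean = sum(samples) / n
    variance = sum((s - mean) ** 2 for s in samples) / n
    return differential_entropy_gaussian(variance)
```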

Another widely used predictability-based entropy metric is the permutation entropy (PerEn), a fast and noise-robust metric that evaluates the order of the symbols within a pattern [28]. Briefly, the original time series is symbolized to obtain ordinal sequences $x_i$ that are associated with $m!$ permutation patterns $\pi_k$. Considering $p(\pi_k)$ as the probability of appearance of each permutation pattern, PerEn can then be computed by means of ShEn:

$$\mathrm{PerEn}(m) = -\frac{1}{\ln(m!)}\sum_{k=1}^{m!} p(\pi_k)\ln p(\pi_k). \tag{E8}$$
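The ordinal-pattern extraction and Eq. E8 can be sketched as follows (a generic illustration; the embedding dimension $m=3$ and delay $\tau=1$ are illustrative defaults, not values taken from the cited studies):

```python
import math
from collections import Counter

def permutation_entropy(x, m=3, tau=1):
    """Normalized PerEn: Shannon entropy of ordinal-pattern frequencies
    divided by ln(m!), so the result lies in [0, 1]."""
    patterns = []
    for i in range(len(x) - (m - 1) * tau):
        window = [x[i + k * tau] for k in range(m)]
        # Ordinal pattern: the permutation that sorts the window
        patterns.append(tuple(sorted(range(m), key=lambda k: window[k])))
    counts = Counter(patterns)
    total = len(patterns)
    h = -sum((c / total) * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(m))
```

A monotone series produces a single ordinal pattern and therefore zero PerEn, while a fully irregular series tends toward the maximum value of 1.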

One of the limitations of PerEn is that it only considers the order of the symbols in a pattern, without taking their amplitudes into account. This limitation has recently been addressed by the introduction of the amplitude-aware permutation entropy (AAPE) [29]. This improvement of PerEn computes the probability $p(\pi_k)$ of appearance of patterns by evaluating the average absolute and relative amplitudes of the symbolic sequences, and applying an adjustment coefficient to weight those parameters. Finally, AAPE is calculated in a similar manner to PerEn [29]:

$$\mathrm{AAPE}(m) = -\frac{1}{\ln(m!)}\sum_{k=1}^{m!} p(\pi_k)\ln p(\pi_k). \tag{E9}$$

Another option for assessing the predictability of a time series is the spectral entropy (SpEn) [30]. In this case, the spectral power at each frequency is computed and normalized with respect to the total power, which yields a probability density function $p(f)$. SpEn is then computed through ShEn or REn [30].
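As an illustration of SpEn via ShEn, the sketch below normalizes the one-sided power spectrum into a probability distribution; the naive $O(n^2)$ DFT is for readability only (a real implementation would use an FFT), and this is not the code of any cited study:

```python
import cmath
import math

def spectral_entropy(x):
    """SpEn: normalized Shannon entropy of the power spectrum."""
    n = len(x)
    power = []
    for k in range(n // 2 + 1):
        # Naive DFT coefficient at bin k (illustrative, O(n^2) overall)
        s = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        power.append(abs(s) ** 2)
    total = sum(power)
    probs = [p / total for p in power if p > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(len(power))
```

A pure tone concentrates all power in one bin and gives SpEn near 0, whereas broadband noise spreads the power and pushes SpEn toward 1.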

2.3 Multilag and multiscale entropy indices

The time series generated by nonlinear and nonstationary systems like the brain usually present highly complex dynamics derived from different simultaneous mechanisms that operate on multiple time scales [31]. As a result, brain behavior cannot be completely described by means of single-scale methods. Therefore, multiscale variants of the aforementioned entropy metrics have been introduced with the purpose of revealing undiscovered information related to the multiscale nature of EEG recordings. For the computation of multiscale entropy (MSE), the original signal $x(i)$ is first decomposed into coarse-grained time series $y^{(\kappa)}$, with $\kappa$ as the scale factor, as follows:

$$y_j^{(\kappa)} = \frac{1}{\kappa}\sum_{i=(j-1)\kappa+1}^{j\kappa} x_i, \qquad 1 \le j \le \frac{N}{\kappa}. \tag{E10}$$

Therefore, all the previously defined entropy indices can be computed in a multiscale form for the coarse-grained series as defined above.
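The coarse-graining step of Eq. E10 is simply the mean over non-overlapping windows, as in this minimal sketch; MSE then applies an entropy index such as SampEn to each coarse-grained series:

```python
def coarse_grain(x, kappa):
    """Coarse-grained series y^(kappa): the mean of each non-overlapping
    window of length kappa (Eq. E10); trailing samples that do not fill
    a whole window are discarded."""
    return [sum(x[j * kappa:(j + 1) * kappa]) / kappa
            for j in range(len(x) // kappa)]
```

Note that kappa = 1 returns the original series, so single-scale entropy is the first point of the MSE curve.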

Another multiscale option is the wavelet entropy (WEn), which makes use of the decomposition of the original signal into different scales by means of the wavelet transform [32]. After the decomposition of the time series, the probability distribution $p_j$ of the energy at each of the $q$ decomposition levels is computed. Finally, WEn is estimated through ShEn:

$$\mathrm{WEn} = -\sum_{j=1}^{q} p_j \ln p_j. \tag{E11}$$

On the other hand, the characteristics of the autocorrelation function of some signals require the consideration of a lag or time delay $\tau$ for a correct quantification of the complexity and nonlinear dynamics of the time series. In this sense, multilag entropy approaches help to reduce the influence of the autocorrelation function for a proper characterization of a nonlinear signal [33]. One of those multilag approaches is the permutation min-entropy (PerMin), a symbolic time-delayed improvement of PerEn [34]. Starting from a generalization of PerEn in which ShEn is replaced by REn, the Rényi permutation entropy (RPE) is obtained as

$$\mathrm{RPE}(m, q, \tau) = \frac{1}{\ln(m!)}\,\frac{1}{1-q}\ln\sum_{k=1}^{m!} p_\tau(\pi_k)^q. \tag{E12}$$

Finally, PerMin is obtained in the limit $q \to \infty$ and presents the following expression:

$$\mathrm{PerMin}(m, \tau) = -\frac{1}{\ln(m!)}\ln\!\left(\max_{k=1,\dots,m!} p_\tau(\pi_k)\right). \tag{E13}$$
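Since PerMin only depends on the most probable time-delayed ordinal pattern, it can be sketched directly from Eq. E13 (a generic illustration with illustrative defaults $m=3$, $\tau=1$, not values from the cited studies):

```python
import math
from collections import Counter

def permutation_min_entropy(x, m=3, tau=1):
    """PerMin(m, tau): -ln of the probability of the most frequent
    time-delayed ordinal pattern, normalized by ln(m!); the q -> infinity
    limit of RPE."""
    patterns = []
    for i in range(len(x) - (m - 1) * tau):
        window = [x[i + k * tau] for k in range(m)]
        # Ordinal pattern of the delayed embedding vector
        patterns.append(tuple(sorted(range(m), key=lambda k: window[k])))
    counts = Counter(patterns)
    p_max = max(counts.values()) / len(patterns)
    return -math.log(p_max) / math.log(math.factorial(m))
```

As with PerEn, a monotone series yields 0; because PerMin keeps only the dominant pattern, it is more sensitive than PerEn to small deviations from perfect predictability.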

3. Literature overview

In the literature, there are various studies that have applied regularity-based entropy measures for emotion detection with EEG recordings. A brief summary of those works is presented in Table 1, with information about the year of publication, the experimental design (including the number of emotions detected, subjects, EEG channels and type of stimulus), the features extracted, the classification models implemented, and the results obtained in each case. As can be observed, the interest in these metrics started growing in 2014, especially computing ApEn and SampEn for the detection of a number of emotions ranging between 2 and 4. In many cases, the signals analyzed were extracted from the publicly available Database for Emotion Analysis using Physiological Signals (DEAP), which consisted of a total of 32 healthy subjects watching emotional video clips while their EEG was registered with 32 channels [47]. Hence, different studies tested their methods on the same EEG recordings, thus allowing a direct comparison of the results obtained [37, 38, 39, 41, 42, 48]. The rest of the works had different experimental designs. In terms of the classification models, support vector machines (SVM) were selected in most cases [35, 37, 38, 40, 43, 44, 45]. The outcomes derived from these studies presented a classification accuracy (Acc) ranging between 73% and 95%, with the frontal and parietal brain regions being the most relevant for the detection of emotional states with these regularity-based entropies.

Ref. (Year) | Experimental design | Features | Classifier | Results
[35] (2011) | 2 emotions, 15 subjects, 5 EEG channels, images | ApEn + others | SVM¹ | Acc = 73.25%
[36] (2014) | Depression, 60 subjects, 24 EEG channels, eyes open/closed, no stimuli | ApEn + others | — | Higher irregularity for healthy than for depressed
[37] (2014) | 4 emotions, DEAP² | SampEn | SVM | Acc = 80.43%
[38] (2016) | 4 emotions, DEAP | SampEn + others | SVM | Acc = 94.98% (two classes) and 93.20% (four classes)
[39] (2016) | 2 emotions, DEAP | SampEn, QSampEn + others | DT³ | Acc = 75.29%
[40] (2017) | 3 emotions, 44 subjects, 31 EEG channels, images | ApEn + others | SVM | Acc = 75.5%
[41] (2018) | Valence and arousal, DEAP | SampEn + others | MLP⁴, DST⁵ | Acc = 87.43% (arousal) and 88.74% (valence)
[42] (2018) | 4 emotions, DEAP | SampEn + others | PSAE⁶ | Acc = 93.6%
[43] (2018) | 4 emotions, 10 subjects, 14 EEG channels, film clips | ApEn + others | SVM, DBN⁷ | Acc = 87.32%
[44] (2019) | 4 emotions, 8 subjects, 12 EEG channels, music | ApEn, SampEn + others | SVM, C4.5, LDA⁸ | Acc = 84.91% (valence) and 89.65% (arousal)
[45] (2020) | 3 emotions, SEED⁹ | DySampEn¹⁰ | SVM | Acc = 84.67%
[46] (2021) | 2 emotions, DEAP | CSampEn¹¹ | — | More coordination in parietal and occipital

Table 1.

Studies of emotions recognition from EEG recordings with regularity-based entropy indices.

¹ SVM: Support vector machine.
² DEAP: Database for emotion analysis using physiological signals.
³ DT: Decision tree.
⁴ MLP: Multi-layer perceptron.
⁵ DST: Dempster-Shafer theory.
⁶ PSAE: Parallel stacked autoencoders.
⁷ DBN: Deep belief networks.
⁸ LDA: Linear discriminant analysis.
⁹ SEED: SJTU emotion EEG database.
¹⁰ DySampEn: Dynamic SampEn.
¹¹ CSampEn: Cross-sample entropy.


On the other hand, the predictability-based entropy indices have been the most frequently applied for the assessment of different emotional states from EEG recordings. The main studies that have used these entropy metrics for that purpose are included in Tables 2 and 3. It can be observed that only a few works studied these indices between 2011 and 2015 [49, 50, 51, 52, 63]. Nevertheless, the interest in these predictability measures has notably increased from 2017 onwards. More precisely, DEn is the predictability-based metric that has gained considerable visibility since 2018, hence Table 3 only includes studies based on the application of DEn. The rest of the predictability-based entropy metrics, i.e. ShEn, REn, SpEn, and permutation indices, are contained in Table 2. It is interesting to note that the majority of the works in Table 2 analyzed the EEG signals contained in the DEAP database, whereas only a few tested those metrics with different experiments. As for the regularity-based indices, the number of emotions identified in the works in Table 2 ranged from 2 to 4, and only one study recognized 5 emotional states [49]. In terms of the classifiers implemented, SVM approaches were preferred over other models in the majority of the studies. The results obtained presented inconsistent Acc values, ranging from 65% to 99%, with the frontal and parietal/occipital lobes being the most relevant in emotional processes. With respect to the studies in Table 3, it can be noticed that the majority of them followed the same experimental procedure, but in this case a public dataset different from DEAP was selected. Indeed, recordings from the SJTU Emotion EEG Dataset (SEED) were chosen and assessed for the detection of three emotional states, namely positive, neutral and negative [63]. This database contains 62-channel EEG recordings from 15 subjects during the visualization of film clips with emotional content [63].
The selection of the classification approaches was quite inconsistent across the different studies. However, it can be observed that deep learning approaches like convolutional neural networks (CNN) have been progressively introduced in the literature for their application in emotion recognition research. The accuracy results reported in these works were between 68% and 99%. Furthermore, some of them demonstrated that DEn was more suitable than some linear metrics for the identification of emotions [63, 68].

Ref. (Year) | Experimental design | Features | Classifier | Results
[49] (2011) | 5 emotions, 20 subjects, 24 and 62 EEG channels, videos | ShEn, SP¹ | LDA, KNN² | Entropy better than linear. Acc = 83.04% with 62 channels
[50] (2013) | Stress, 13 subjects, 3 EEG channels, eyes closed, no stimuli | REn + others | ANOVA³ | Lower complexity in stress than in calmness
[51] (2015) | 4 emotions, 8 subjects, 20 EEG channels, audiovisual stimuli | ShEn, REn | MC-LSSVM⁴ | Acc = 84.79%
[52] (2015) | 4 emotions, 25 subjects, 3 EEG channels, music | SpEn | SVM, KNN, CFNN⁵ | Acc = 93.66% (valence) and 93.29% (arousal)
[53] (2017) | 2 emotions, DEAP | PerEn, AAPE, QSampEn | SVM | Acc = 81.31%
[54] (2017) | Valence and arousal, DEAP | SpEn, ShEn | LSSVM⁶, D-RFE⁷ | Acc = 78.96% (arousal) and 71.43% (valence)
[55] (2017) | 4 emotions, DEAP | SpEn, ShEn + others | Three-stage decision method | Acc = 86.67%
[56] (2018) | Arousal and valence, DEAP | SpEn, spectral and statistics | SVM, KNN, NB⁸ | Spectral and statistics better than SpEn
[57] (2018) | 2, 3, 4 and 5 emotions, DEAP | REn + others | SVM | Acc = 73.8–86.2%
[58] (2018) | Depression, 213 subjects, 3 EEG channels, sounds | ShEn, SpEn + others | KNN | Acc = 79.27%
[59] (2019) | 4 emotions, DEAP | ShEn, SpEn + others | LSSVM | Acc = 65.13%
[60] (2019) | 4 emotions, DEAP | ShEn, PerEn + others | SVM | Best results with PerEn
[61] (2020) | 2 emotions, DEAP | CEn⁹, QSampEn | SVM | Acc = 80.31%
[62] (2020) | 4 emotions, DEAP | SpEn, ShEn + others | SVM, NB, ANN¹⁰ | Acc = 98.7% with ANN
[48] (2021) | 4 emotions, DEAP | AAPE, PerMin + others | SVM | Acc = 96.39%

Table 2.

Studies of emotions recognition from EEG recordings with predictability-based entropy indices (ShEn, SpEn, REn, PerEn, AAPE).

¹ SP: Spectral power.
² KNN: K-nearest neighbor.
³ ANOVA: Analysis of variance.
⁴ MC-LSSVM: Multiclass least-square support vector machine.
⁵ CFNN: Cascade-forward neural network.
⁶ LSSVM: Least-square support vector machine.
⁷ D-RFE: Dynamical recursive feature elimination.
⁸ NB: Naive Bayes.
⁹ CEn: Conditional entropy.
¹⁰ ANN: Artificial neural network.


Ref. (Year) | Experimental design | Features | Classifier | Results
[63] (2015) | 3 emotions, SEED | DEn, SP, statistics | DBN | DEn better than SP and statistics. Acc = 85%
[64] (2018) | 4 emotions, two experiments: DEAP and SEED | DEn + others | GELM¹ | Acc = 69.67% (DEAP) and 91.07% (SEED)
[65] (2018) | 3 emotions, SEED | DEn | HCNN² | Best results in β and γ
[66] (2018) | 3 emotions, 14 subjects, 64 EEG channels, film clips | DEn + others | GRSLR³ | Acc = 80.27%
[67] (2018) | 3 emotions, two experiments: DEAP and SEED | DEn | — | Best results with SEED database
[68] (2018) | 3 emotions, SEED | DEn, SP, statistics | DGCNN⁴ | DEn better than SP and statistics. Acc = 90%
[69] (2019) | 3 emotions, SEED | DEn | LDA | Acc = 68%
[70] (2019) | 2 emotions in patients with disorders of consciousness, 18 subjects, 32 EEG channels, videos | DEn | SVM | Acc = 91.5%
[71] (2019) | 3 emotions, SEED | DEn | STNN⁵ | Acc = 84.16%
[72] (2019) | 3 emotions, SEED | DEn | LR⁶ | Acc = 86%
[73] (2020) | 3 emotions, SEED | DEn | MTL⁷ | Acc = 88.92%
[74] (2020) | 3 emotions, SEED | DEn | CNN | Acc = 90.63%
[75] (2020) | 3 emotions, SEED | DEn | CNN | Acc = 90.41%
[76] (2020) | High-low valence and arousal, DEAP | DEn + others | LORSAL⁸ | Acc = 77.17%
[77] (2020) | 3 emotions, SEED | DEn + others | CNN | Acc = 99.7%
[78] (2020) | 3 emotions, SEED | DEn + others | SRU⁹ | Acc = 83.13%
[79] (2021) | Valence and arousal, DEAP | DEn | CNN | Acc = 90.45% (valence) and 90.6% (arousal)

Table 3.

Studies of emotions recognition from EEG recordings with predictability-based entropy indices (DEn).

¹ GELM: Graph-regularized extreme learning machine.
² HCNN: Hierarchical convolutional neural network.
³ GRSLR: Graph regularized sparse linear regression.
⁴ DGCNN: Dynamical graph convolutional neural network.
⁵ STNN: Spatial-temporal neural network.
⁶ LR: Linear regression.
⁷ MTL: Multisource transfer learning.
⁸ LORSAL: Logistic regression via variable splitting and augmented Lagrangian.
⁹ SRU: Simple recurrent unit network.


Finally, Table 4 shows the main works focused on the application of multiscale and multilag entropy approaches for detecting emotions with EEG recordings. As can be observed, the study of these indices has emerged in the last few years, especially from 2019 onwards. In these works, the number of emotional states studied ranged from 2 to 5, which is in line with the rest of the entropy metrics evaluated. Furthermore, the signals from the DEAP database were also chosen in some of the studies included in the table. It can be noticed that the selection of the classification models was somewhat inconsistent among the different studies, although SVM and deep learning approaches were the most frequently selected. As in the previous cases, the final outcomes presented accuracy values with a high variability, ranging from 73% to 98%.

Ref. (Year) | Experimental design | Features | Classifier | Results
[80] (2016) | 5 emotions, 30 subjects, 6 EEG channels, video clips | MMSampEn¹ | — | Higher irregularity for higher arousal levels
[81] (2019) | 2 emotions, DEAP | CMQSampEn², CMAAPE³ | SVM, DT | Acc = 86.35%
[82] (2019) | 4 emotions, DEAP | WEn | SVM, FCM⁴ | Acc = 73.32%
[83] (2019) | 2 emotions, DEAP | PerMin, DPerEn⁵ | KNN | Acc = 92.32%
[84] (2020) | 3 emotions, 10 subjects, 14 EEG channels, video clips | WEn | ANN | Acc = 98%
[85] (2021) | 3 emotions, SEED | MSpEn⁶ + others | ARF⁷ | Acc = 94.4%
[86] (2021) | Enjoyment, 28 subjects, 8 EEG channels, art pieces | MSampEn | SVM, RVM⁸ | Acc = 91.18%

Table 4.

Studies of emotions recognition from EEG recordings with multiscale and multilag entropy indices.

¹ MMSampEn: Multivariate-multiscale SampEn.
² CMQSampEn: Composite multiscale QSampEn.
³ CMAAPE: Composite multiscale AAPE.
⁴ FCM: Fuzzy cognitive map.
⁵ DPerEn: Delayed PerEn.
⁶ MSpEn: Multiscale SpEn.
⁷ ARF: Autoencoder based random forest.
⁸ RVM: Relevance vector machine.



4. General findings

The application of entropy metrics for the recognition of emotions from EEG signals has received increasing attention in recent years, reporting valuable insights about the brain's performance under different emotional conditions. However, the high variability of the results obtained could be explained by various aspects. On the one hand, the experimentation is different for each study, since there are no gold standards for experimental procedures. In this sense, the number of participants and their gender, age, or cultural characteristics are very inconsistent among studies, hence the results may not be representative of the whole population. In addition, the type of stimulus used for emotion elicitation (images, sounds, videos, etc.) is also inconsistent, since there is no consensus about the optimum option for triggering a strong emotional response [87]. The duration of the stimulus is another open question, and different criteria are followed by each research group. Finally, although the locations of EEG electrodes are standardized, the number of EEG channels recorded is different in each experiment, ranging from 3 to 64. Moreover, some works assessed the signals corresponding to only one brain area, thus discarding the information that could be reported by the remaining regions.

All those experimental differences hinder the possibility of obtaining universal results that could be generalized to the whole population. As a consequence of all those discrepancies between experimental procedures, the studies presented in this manuscript should be carefully interpreted and compared. In addition, as not all the publications give a thorough description of their methodology, their experiments cannot be reproduced by other research groups. The assessment of signals extracted from publicly available databases, like DEAP or SEED, could eliminate this limitation, since the experimental procedure would then be the same for different authors. In this sense, the reproducibility and comparability of the results obtained would be guaranteed, and any differences in the outcomes would be directly attributable to the diversity of analysis methods and classification approaches.

The variability of the results could also be a consequence of the intrinsic differences between the entropy metrics evaluated. Indeed, regularity-based, predictability-based, and multiscale/multilag approaches evaluate the complexity of time series from different perspectives. Therefore, the application of one or another type of entropy index to the same problem could report completely divergent outcomes. Nevertheless, instead of considering these differences as contradictory, they should be regarded as a sign of complementarity between the entropy metrics. For instance, some characteristics of a nonlinear signal could be properly assessed with regularity-based entropies, while other dynamics would be better described by predictability-based and symbolic entropy measures. Consequently, the selection of one or another type of entropy index should take into account the information to be extracted from a nonstationary time series, also considering that the combination of different entropies would provide a more complete description of the nonlinear processes.

The promising outcomes presented in these studies make entropy metrics a useful tool for the recognition of emotions from EEG recordings. However, the majority of the works are mainly focused on obtaining high classification accuracy values, for which advanced classification models with hundreds of input features are implemented. Despite providing notable numerical results in many cases, the combination of such a large amount of data in complex classification schemes results in a total loss of clinical interpretability of the results. In this regard, information about which brain regions are the most relevant, or which EEG channels make a greater contribution to the classification model, cannot be obtained. Thus, it becomes impossible to make a thorough analysis of the brain's behavior under the emotional states detected. As a result, it would be interesting to modify some methodological aspects in this kind of study in order to ensure the clinical interpretation of the results and reveal new insights about mental processes under emotional conditions.


5. Conclusions

Given the nonlinear and nonstationary nature of the brain, entropy indices are suitable tools for a complete description of brain dynamics in different scenarios, including the recognition of emotional states. This chapter has summarized the main recent contributions to the research field of emotion detection through the application of entropy indices to the analysis of EEG recordings. In this sense, regularity-based, predictability-based, and multiscale/multilag entropy approaches have demonstrated their capability to discern between different emotional states and to discover new insights about brain dynamics in emotional processes. Taking into account the valuable results obtained in the studies presented in this chapter, entropy metrics could become one of the first options to be considered in systems for automatic emotion identification from EEG signals.


Acknowledgments

This work was partially supported by Spanish Ministerio de Ciencia, Innovación y Universidades, Agencia Estatal de Investigación (AEI) /European Regional Development Fund (FEDER, UE) under EQC2019-006063-P, PID2020-115220RB-C21, and 2018/11744 grants, and by Biomedical Research Networking Centre in Mental Health (CIBERSAM) of the Instituto de Salud Carlos III. Beatriz García-Martínez holds FPU16/03740 scholarship from Spanish Ministerio de Educación y Formación Profesional.


Conflict of interest

The authors declare no conflict of interest.

References

  1. Susi Ferrarello. Human emotions and the origins of bioethics. Routledge, 2021
  2. Paul Ekman. An argument for basic emotions. Cognition and Emotion, 6(3–4):169–200, 1992
  3. M Schröder and R Cowie. Toward emotion-sensitive multimodal interfaces: The challenge of the European Network of Excellence HUMAINE, 2005
  4. James A Russell. A circumplex model of affect. Journal of Personality and Social Psychology, 39(6):1161–1178, 1980
  5. Soujanya Poria, Erik Cambria, Rajiv Bajpai, and Amir Hussain. A review of affective computing: From unimodal analysis to multimodal fusion. Information Fusion, 37:98–125, 2017
  6. Jing Han, Zixing Zhang, and Bjorn Schuller. Adversarial training in affective computing and sentiment analysis: Recent advances and perspectives. IEEE Computational Intelligence Magazine, 14(2):68–81, 2019
  7. Soraia M Alarcao and Manuel J Fonseca. Emotions recognition using EEG signals: A survey. IEEE Transactions on Affective Computing, 10(3):374–393, 2017
  8. Maria Egger, Matthias Ley, and Sten Hanke. Emotion recognition from physiological signal analysis: A review. Electronic Notes in Theoretical Computer Science, 343:35–55, 2019
  9. Andrius Dzedzickis, Artūras Kaklauskas, and Vytautas Bucinskas. Human emotion recognition: Review of sensors and methods. Sensors, 20(3):592, 2020
  10. Min-Ki Kim, Miyoung Kim, Eunmi Oh, and Sung-Phil Kim. A review on the computational methods for emotional state estimation from the human EEG. Computational and Mathematical Methods in Medicine, 2013:573734, 2013
  11. Yuzhen Cao, Lihui Cai, Jiang Wang, Ruofan Wang, Haitao Yu, Yibin Cao, and Jing Liu. Characterization of complexity in the electroencephalograph activity of Alzheimer’s disease based on fuzzy entropy. Chaos, 25(8):083116, 2015
  12. Mona Farokhzadi, Gholam-Ali Hossein-Zadeh, and Hamid Soltanian-Zadeh. Nonlinear effective connectivity measure based on adaptive neuro fuzzy inference system and Granger causality. Neuroimage, 181:382–394, 2018
  13. Germán Rodríguez-Bermúdez and Pedro J Garcia-Laencina. Analysis of EEG signals using nonlinear dynamics and chaos: A review. Applied Mathematics & Information Sciences, 9(5):2309, 2015
  14. B. García-Martínez, A. Martinez-Rodrigo, R. Alcaraz, and A. Fernández-Caballero. A review on nonlinear methods using electroencephalographic recordings for emotion recognition. IEEE Transactions on Affective Computing, page 1, 2019
  15. Berthold Bein. Entropy. Best Practice and Research Clinical Anaesthesiology, 20:101–109, 2006
  16. Claude E Shannon. A mathematical theory of communication. The Bell System Technical Journal, 27:623–656, 1948
  17. Peng Li, Chandan Karmakar, John Yearwood, Svetha Venkatesh, Marimuthu Palaniswami, and Changchun Liu. Detection of epileptic seizure based on entropy analysis of short-term EEG. PLoS ONE, 13(3):e0193691, 2018
  18. Hamed Azami, Daniel Abásolo, Samantha Simons, and Javier Escudero. Univariate and multivariate generalized multiscale entropy to characterise EEG signals in Alzheimer’s disease. Entropy, 19(1):31, 2017
  19. Jiannan Kang, Huimin Chen, Xin Li, and Xiaoli Li. EEG entropy analysis in autistic children. Journal of Clinical Neuroscience, 62:199–206, 2019
  20. Reza Shalbaf, Colleen Brenner, Christopher Pang, Daniel M Blumberger, Jonathan Downar, Zafiris J Daskalakis, Joseph Tham, Raymond W Lam, Faranak Farzan, and Fidel Vila-Rodriguez. Non-linear entropy analysis in EEG to predict treatment response to repetitive transcranial magnetic stimulation in depression. Frontiers in Pharmacology, 9:1188, 2018
  21. Holger Kantz and Thomas Schreiber. Nonlinear time series analysis. Cambridge University Press, 2003
  22. S M Pincus. Approximate entropy as a measure of system complexity. Proceedings of the National Academy of Sciences of the United States of America, 88(6):2297–2301, 1991
  23. JS Richman and JR Moorman. Physiological time-series analysis using approximate entropy and sample entropy. American Journal of Physiology. Heart and Circulatory Physiology, 278(6):H2039–H2049, 2000
  24. Douglas E Lake and J Randall Moorman. Accurate estimation of entropy in very short physiological time series: The problem of atrial fibrillation detection in implanted ventricular devices. American Journal of Physiology. Heart and Circulatory Physiology, 300(1):H319–25, 2011
  25. A Paraschiv-Ionescu, E Buchser, B Rutschmann, and K Aminian. Nonlinear analysis of human physical activity patterns in health and disease. Physical Review E, Statistical, Nonlinear, and Soft Matter Physics, 77(2 Pt 1):021913, 2008
  26. Alfréd Rényi. On measures of entropy and information. Technical report, Hungarian Academy of Sciences, Budapest, Hungary, 1961
  27. L. Shi, Y. Jiao, and B. Lu. Differential entropy feature for EEG-based vigilance estimation. In Proc. 35th Annual Int. Conf. of the IEEE Engineering in Medicine and Biology Society (EMBC), pages 6627–6630, 2013
  28. Christoph Bandt and Bernd Pompe. Permutation entropy: A natural complexity measure for time series. Physical Review Letters, 88(17):174102, 2002
  29. Hamed Azami and Javier Escudero. Amplitude-aware permutation entropy: Illustration in spike detection and signal segmentation. Computer Methods and Programs in Biomedicine, 128:40–51, 2016
  30. J. Fell and J. Roschke. Discrimination of sleep stages: A comparison between spectral and nonlinear EEG measures. Electroencephalography and Clinical Neurophysiology, 98(5):401–410, 1996
  31. W. W. Burggren. Assessing physiological complexity. The Journal of Experimental Biology, 208:3221–3232, 2005
  32. Ingrid Daubechies. The wavelet transform, time-frequency localization and signal analysis. IEEE Transactions on Information Theory, 36(5):961–1005, 1990
  33. Farhad Kaffashi, Ryan Foglyano, Christopher G Wilson, and Kenneth A Loparo. The effect of time delay on approximate & sample entropy calculations. Physica D: Nonlinear Phenomena, 237(23):3069–3074, 2008
  34. Luciano Zunino, Felipe Olivares, and Osvaldo A. Rosso. Permutation min-entropy: An improved quantifier for unveiling subtle temporal correlations. Europhysics Letters, 109:10005, 2015
  35. Seyyed Abed Hosseini and Mohammad Bagher Naghibi-Sistani. Emotion recognition method using entropy analysis of EEG signals. International Journal of Image, Graphics and Signal Processing, 3(5):30, 2011
  36. Subha D Puthankattil and Paul K Joseph. Analysis of EEG signals using wavelet entropy and approximate entropy: A case study on depression patients. International Journal of Medical, Health, Pharmaceutical and Biomedical Engineering, 8(7):420–424, 2014
  37. Xiang Jie, Rui Cao, and Li Li. Emotion recognition based on the sample entropy of EEG. Bio-Medical Materials and Engineering, 24(1):1185–1192, 2014
  38. Yong Zhang, Xiaomin Ji, and Suhua Zhang. An approach to EEG-based emotion recognition using combined feature extraction method. Neuroscience Letters, 633:152–157, 2016
  39. Beatriz García-Martínez, Arturo Martínez-Rodrigo, Roberto Zangróniz Cantabrana, Jose Pastor García, and Raúl Alcaraz. Application of entropy-based metrics to identify emotional distress from electroencephalographic recordings. Entropy, 18(6):221, 2016
  40. Wen-Lin Chu, Min-Wei Huang, Bo-Lin Jian, and Kuo-Sheng Cheng. Analysis of EEG entropy during visual evocation of emotion in schizophrenia. Annals of General Psychiatry, 16(1):34, 2017
  41. Morteza Zangeneh Soroush, Keivan Maghooli, Seyed Kamaledin Setarehdan, and Ali Motie Nasrabadi. A novel method of EEG-based emotion recognition using nonlinear features variability and Dempster–Shafer theory. Biomedical Engineering: Applications, Basis and Communications, 30(04):1850026, 2018
  42. Sara Bagherzadeh, K Maghooli, J Farhadi, and M Zangeneh Soroush. Emotion recognition from physiological signals using parallel stacked autoencoders. Neurophysiology, 50(6):428–435, 2018
  43. Tian Chen, Sihang Ju, Xiaohui Yuan, Mohamed Elhoseny, Fuji Ren, Mingyan Fan, and Zhangang Chen. Emotion recognition using empirical mode decomposition and approximation entropy. Computers & Electrical Engineering, 72:383–392, 2018
  44. Yimin Hou and Shuaiqi Chen. Distinguishing different emotions evoked by music via electroencephalographic signals. Computational Intelligence and Neuroscience, 2019:1–18, 2019
  45. Yun Lu, Mingjiang Wang, Wanqing Wu, Yufei Han, Qiquan Zhang, and Shixiong Chen. Dynamic entropy-based pattern learning to identify emotions from EEG signals across individuals. Measurement, 150:107003, 2020
  46. Beatriz García-Martínez, Antonio Fernández-Caballero, Raúl Alcaraz, and Arturo Martínez-Rodrigo. Cross-sample entropy for the study of coordinated brain activity in calm and distress conditions with electroencephalographic recordings. Neural Computing and Applications, 2021
  47. Sander Koelstra, Christian Mühl, Mohammad Soleymani, Jong-Seok Lee, Ashkan Yazdani, Touradj Ebrahimi, Thierry Pun, Anton Nijholt, and Ioannis Patras. DEAP: A database for emotion analysis using physiological signals. IEEE Transactions on Affective Computing, 3(1):18–31, 2012
  48. Beatriz García-Martínez, Antonio Fernández-Caballero, Luciano Zunino, and Arturo Martínez-Rodrigo. Recognition of emotional states from EEG signals with nonlinear regularity- and predictability-based entropy metrics. Cognitive Computation, 13:403–417, 2021
  49. Murugappan Murugappan, Ramachandran Nagarajan, and Sazali Yaacob. Combining spatial filtering and wavelet transform for classifying human emotions using EEG signals. Journal of Medical and Biological Engineering, 31(1):45–51, 2011
  50. Hong Peng, Bin Hu, Fang Zheng, Dangping Fan, Wen Zhao, Xuebin Chen, Yongxia Yang, and Qingcui Cai. A method of identifying chronic stress by EEG. Personal and Ubiquitous Computing, 17(7):1341–1347, 2013
  51. Varun Bajaj and Ram Bilas Pachori. Detection of human emotions using features based on the multiwavelet transform of EEG signals. In Brain-Computer Interfaces. Springer, 2015
  52. Mohsen Naji, Mohammd Firoozabadi, and Parviz Azadfallah. Emotion classification during music listening from forehead biosignals. Signal, Image and Video Processing, 9(6):1365, 2015
  53. Beatriz García-Martínez, Arturo Martínez-Rodrigo, Roberto Zangróniz, José Manuel Pastor, and Raúl Alcaraz. Symbolic analysis of brain dynamics detects negative stress. Entropy, 19(5):196, 2017
  54. Zhong Yin, Lei Liu, Li Liu, Jianhua Zhang, and Yagang Wang. Dynamical recursive feature elimination technique for neurophysiological signal-based emotion recognition. Cognition, Technology & Work, 19(4):667, 2017
  55. Jing Chen, Bin Hu, Yue Wang, Philip Moore, Yongqiang Dai, Lei Feng, and Zhijie Ding. Subject-independent emotion recognition based on physiological signals: A three-stage decision method. BMC Medical Informatics and Decision Making, 17(3):45, 2017
  56. L. Piho and T. Tjahjadi. A mutual information based adaptive windowing of informative EEG for emotion recognition. IEEE Transactions on Affective Computing, 11(4):722–735, 2018
  57. Rami Alazrai, Rasha Homoud, Hisham Alwanni, and Mohammad Daoud. EEG-based emotion recognition using quadratic time-frequency distribution. Sensors, 18:2739, 2018
  58. Hanshu Cai, Jiashuo Han, Yunfei Chen, Xiaocong Sha, Ziyang Wang, Bin Hu, Jing Yang, Lei Feng, Zhijie Ding, Yiqiang Chen, and Jürg Gutknecht. A pervasive approach to EEG-based depression detection. Complexity, 2018:1–13, 2018
  59. Jiahui Cai, Wei Chen, and Zhong Yin. Multiple transferable recursive feature elimination technique for emotion recognition based on EEG signals. Symmetry, 11:683, 2019
  60. Abhishek Tiwari and Tiago H. Falk. Fusion of motif- and spectrum-related features for improved EEG-based emotion recognition. Computational Intelligence and Neuroscience, 2019:1–14, 2019
  61. Beatriz García-Martínez, Arturo Martínez-Rodrigo, Antonio Fernández-Caballero, José Moncho-Bogani, and Raúl Alcaraz. Nonlinear predictability analysis of brain dynamics for automatic recognition of negative stress. Neural Computing and Applications, 32:13221–13231, 2020
  62. Mitul Kumar Ahirwal and Mangesh Ramaji Kose. Audio-visual stimulation based emotion classification by correlated EEG channels. Health and Technology, 10:7–23, 2020
  63. W. Zheng and B. Lu. Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks. IEEE Transactions on Autonomous Mental Development, 7(3):162–175, 2015
  64. W. Zheng, J. Zhu, and B. Lu. Identifying stable patterns over time for emotion recognition from EEG. IEEE Transactions on Affective Computing, 10(3):417–429, 2018
  65. Jinpeng Li, Zhaoxiang Zhang, and Huiguang He. Hierarchical convolutional neural networks for EEG-based emotion recognition. Cognitive Computation, 10(2):368, 2018
  66. Yang Li, Wenming Zheng, Zhen Cui, Yuan Zong, and Sheng Ge. EEG emotion recognition based on graph regularized sparse linear regression. Neural Processing Letters, 49:555–571, 2018
  67. Z. Lan, O. Sourina, L. Wang, R. Scherer, and G. R. Müller-Putz. Domain adaptation techniques for EEG-based emotion recognition: A comparative study on two public datasets. IEEE Transactions on Cognitive and Developmental Systems, 11(1):85–94, 2018
  68. T. Song, W. Zheng, P. Song, and Z. Cui. EEG emotion recognition using dynamical graph convolutional neural networks. IEEE Transactions on Affective Computing, 11(3):532–541, 2018
  69. Dong-Wei Chen, Rui Miao, Wei-Qi Yang, Yong Liang, Hao-Heng Chen, Lan Huang, Chun-Jian Deng, and Na Han. A feature extraction method based on differential entropy and linear discriminant analysis for emotion recognition. Sensors, 19:1631, 2019
  70. Haiyun Huang, Qiuyou Xie, Jiahui Pan, Yanbin He, Zhenfu Wen, Ronghao Yu, and Yuanqing Li. An EEG-based brain computer interface for emotion recognition and its application in patients with disorder of consciousness. IEEE Transactions on Affective Computing, page 1, 2019
  71. Y. Li, W. Zheng, L. Wang, Y. Zong, and Z. Cui. From regional to global brain: A novel hierarchical spatial-temporal neural network model for EEG emotion recognition. IEEE Transactions on Affective Computing, page 1, 2019
  72. Soheil Keshmiri, Masahiro Shiomi, and Hiroshi Ishiguro. Entropy of the multi-channel EEG recordings identifies the distributed signatures of negative, neutral and positive affect in whole-brain variability. Entropy, 21:1228, 2019
  73. J. Li, S. Qiu, Y.-Y. Shen, C.-L. Liu, and H. He. Multisource transfer learning for cross-subject EEG emotion recognition. IEEE Transactions on Cybernetics, 50(7):3281–3293, 2020
  74. Z. Gao, X. Wang, Y. Yang, Y. Li, K. Ma, and G. Chen. A channel-fused dense convolutional network for EEG-based emotion recognition. IEEE Transactions on Cognitive and Developmental Systems, page 1, 2020
  75. Sunhee Hwang, Kibeom Hong, Guiyoung Son, and Hyeran Byun. Learning CNN features from DE features for EEG-based emotion recognition. Pattern Analysis and Applications, 23:1323–1335, 2020
  76. Chao Pan, Cheng Shi, Honglang Mu, Jie Li, and Xinbo Gao. EEG-based emotion recognition using logistic regression with Gaussian kernel and Laplacian prior and investigation of critical frequency bands. Applied Sciences, 10:1619, 2020
  77. Yingdong Wang, Qingfeng Wu, Chen Wang, and Qunsheng Ruan. DE-CNN: An improved identity recognition algorithm based on the emotional electroencephalography. Computational and Mathematical Methods in Medicine, 2020:7574531, 2020
  78. Chen Wei, Lan lan Chen, Zhen zhen Song, Xiao guang Lou, and Dong dong Li. EEG-based emotion recognition using simple recurrent units network and ensemble learning. Biomedical Signal Processing and Control, 58:101756, 2020
  79. Yongqiang Yin, Xiangwei Zheng, Bin Hu, Yuang Zhang, and Xinchun Cui. EEG emotion recognition using fusion model of graph convolutional neural networks and LSTM. Applied Soft Computing Journal, 100:106954, 2021
  80. Yelena Tonoyan, David Looney, Danilo P Mandic, and Marc M Van Hulle. Discriminating multiple emotional states from EEG using a data-adaptive, multiscale information-theoretic approach. International Journal of Neural Systems, 26(02):1650005, 2016
  81. Arturo Martínez-Rodrigo, Beatriz García-Martínez, Raúl Alcaraz, Pascual González, and Antonio Fernández-Caballero. Multiscale entropy analysis for recognition of visually elicited negative stress from EEG recordings. International Journal of Neural Systems, 29(2):1850038, 2019
  82. Kairui Guo, Rifai Chai, Henry Candra, Ying Guo, Rong Song, Hung Nguyen, and Steven Su. A hybrid fuzzy cognitive map/support vector machine approach for EEG-based emotion classification using compressed sensing. International Journal of Fuzzy Systems, 21:263–273, 2019
  83. Arturo Martínez-Rodrigo, Beatriz García-Martínez, Luciano Zunino, Raúl Alcaraz, and Antonio Fernández-Caballero. Multi-lag analysis of symbolic entropies on EEG recordings for distress recognition. Frontiers in Neuroinformatics, 13(40), 2019
  84. Qiang Gao, Chu han Wang, Zhe Wang, Xiao lin Song, En zeng Dong, and Yu Song. EEG based emotion recognition using fusion feature extraction method. Multimedia Tools and Applications, 79:27057–27074, 2020
  85. A. Bhattacharyya, R. K. Tripathy, L. Garg, and R. B. Pachori. A novel multivariate-multiscale approach for computing EEG spectral and temporal complexity for human emotion recognition. IEEE Sensors Journal, 21(3):3579–3591, 2021
  86. M. Fraiwan, M. Alafeef, and F. Almomani. Gauging human visual interest using multiscale entropy analysis of EEG signals. Journal of Ambient Intelligence and Humanized Computing, 12:2435–2447, 2021
  87. Gaetano Valenza, Antonio Lanata, and Enzo Pasquale Scilingo. The role of nonlinear dynamics in affective valence and arousal recognition. IEEE Transactions on Affective Computing, 3(2):237–249, 2012
