Open access peer-reviewed chapter

A Control System for Detecting Emotions on Visual Interphase Stimulus

Written By

Fatima Isiaka, Kassim Mwitondi and Adamu M. Ibrahim

Submitted: 07 April 2017 Reviewed: 22 February 2018 Published: 04 July 2018

DOI: 10.5772/intechopen.75873

From the Edited Volume

Human-Robot Interaction - Theory and Application

Edited by Gholamreza Anbarjafari and Sergio Escalera


Abstract

Complex dynamic contents of visual stimuli induce implicit reactions in a user. These lead to changes in the user's physiological processes, which are referred to here as stress. Our goal is to model and produce a system that represents the mechanical interactions of the body and eye movement behavior. We are particularly concerned with the skin conductance response (SCR) and eye fixations on visual stimuli, and we build a dynamic system that detects stress and its correlates to visual widgets. The process consists of the following modules: (1) a hypothesis generator for suggesting possible structural changes that result from direct interaction with visual stimuli, (2) an information source for responding to operator queries about users' interactive and physiological processes, and (3) a continuous system simulator for simulating and illustrating physiological reactions during interaction. This model serves as an infrastructure for modeling physiological processes and could benefit usability laboratories, web developers, and designers of interactive systems, enabling evaluators to visualize an interface and better identify the areas that cause stress to users.

Keywords

  • physiological process
  • widgets
  • control systems
  • cognitive load
  • SCR
  • eye fixations

1. Introduction

A common finding in cognitive neuroscience [1] states that a person’s perception of their behavior does not always relate to their neural activity. Experiments have shown that people do not always know what is going on inside their minds. For instance, in an eye-tracking study that involved reading [2], real-time quantitative measures of eye movements revealed longer fixation times for reading text with transposed letters as compared to reading normal text even though readers claimed to spend just a few seconds on text with transposed letters. Also, electroencephalography (EEG) of language processing [3] has concluded that phrases judged as easy to comprehend and highly acceptable sometimes entail a larger processing effort on the part of the readers.

Control systems are sometimes used to understand the mechanisms behind human-computer interaction, providing industry-standard algorithms and applications that systematically analyze, design, and tune linear control systems. In this setting, a system can be specified as a state-space model, a transfer function, a frequency-response model, or in zero-pole-gain form. Functions such as the step response plot and the Bode plot let us visualize system behavior in the time domain and the frequency domain; these are used in the results section of this chapter to analyze the behavior of our final system response. The compensator parameters are tuned using the automated Bode loop-shaping method in MATLAB, which validates the design by verifying the rise time, settling time, phase margin, and gain margin (a minimal sketch of this workflow is given after the list below). To understand the system dynamics between eye movement and the visual stimuli, we adopt the differential equation in Eq. (1), which represents the prediction focus, 4 min from the detected fixations, and the visual contents that induce stress. Therefore, the main objectives of this chapter include:

  • reviewing some related works,

  • development of a control system for detecting emotions,

  • testing the performance of the process on multiple polynomial orders, and

  • simulating the physiological processes.
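As a rough illustration of the workflow described above, the following MATLAB sketch (Control System Toolbox) specifies a placeholder plant, inspects its time- and frequency-domain behavior, and tunes a compensator. The plant and the choice of pidtune are illustrative assumptions, not the fitted model of this chapter.

    % Illustrative plant only; the chapter's actual model is estimated from data.
    s   = tf('s');
    sys = 1/(s^2 + 0.5*s + 1);    % placeholder second-order system
    step(sys)                     % time-domain view: rise and settling time
    bode(sys)                     % frequency-domain view: gain and phase
    C = pidtune(sys, 'PI');       % automated compensator tuning
    margin(C*sys)                 % verify gain and phase margins of the tuned loop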


2. Related works

Much work has been carried out on developing comprehensive web-based and software interfaces that can adapt to significant end-user needs. Some of this work requires the development of adaptive algorithms to learn about changes in user interests or emotions [4]. A flawless user interface (UI) would automatically adapt or change its layout and web content elements to suit the needs of its users, and would similarly allow users themselves to alter the contents of the UI [5].

Users adapt easily to less complex applications owing to their cognitive ability to familiarize themselves with friendly, well-designed interfaces, such as those used for information distribution and learning [6]. Visually complex applications change the way users view content [7]. Reactions from users can relay quantitative information when physiological sensors are part of the equipment used to study and interpret perception.

Other state-of-the-art techniques, such as those of Karnopp et al. [8] and of Kretzschmar et al. and Liversedge et al. [9, 10], attempt to unravel the mystery surrounding emotions: how they work and how they affect our lives, questions that have not yet been resolved. More recent techniques [11, 12, 13] provide systems and methods for detecting emotional states, one of which uses statistics. A speech signal is first received, and acoustic parameters are extracted from it. Statistics, or features, are then calculated from samples of the voice using the extracted speech parameters. The features serve as inputs to a classifier, which can be a computer program, a device, or both. The classifier assigns at least one emotional state, from a finite number of possible emotional states, to the speech signal. Such techniques enable scientists to further debate the real nature of emotions and whether evolutionary, physiological, or cognitive accounts explain affective states. Results from applying this methodology to real-time data collected from a single subject demonstrated a recognition level of 71.4%, which is comparable to the best results achieved. The detection mechanism outlined in this chapter has most of the characteristics required to perform emotion detection on real-time visual stimuli.
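A minimal MATLAB sketch of such a speech-based pipeline might look as follows. The file name, window length, feature set, and the previously fitted trainedClassifier are all illustrative assumptions, not the method of [11, 12, 13].

    % Hedged sketch: statistical features from a speech signal -> classifier.
    [x, fs] = audioread('speech.wav');          % assumed input recording
    e = movmean(x.^2, round(0.02*fs));          % short-time energy, 20 ms window
    f = [mean(e) std(e) max(e) skewness(e)];    % simple feature statistics
    label = predict(trainedClassifier, f);      % any previously fitted model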


3. Methods

An experiment was conducted [14, 15] in which a single participant interacted with three different visual interfaces: a game, a webpage, and a textbook. The user's physiological readings were taken alongside eye movement measured with an eye tracker. The rationale for choosing these interfaces is that all the stimuli contain dynamic contents and impose a cognitive workload on the user that induces slight stress. To create a control system capable of identifying a user's emotion on an interface, MATLAB was used for its signal processing and system identification toolboxes, which support the development of dynamic systems. The following sections discuss these visual stimuli and the tasks involved.

3.1. Adera

Adera is a story-driven adventure game for a single player; it allows the player to solve puzzles, collect artifacts, and explore the environment to reveal the mysteries of a newfound civilization. The episodic story begins when the player receives a message from a missing person known as the grandfather (Hawk). The game requires some cognitive processing on the part of the user. The user interacted with the first episode while his eye movement was recorded.

3.2. Yahoo webpage

The Yahoo homepage is a very popular site that regular users frequent for current news and entertainment widgets. The user was simply asked to locate news or entertainment contents of interest and to interact with them while the physiological measures were taken.

3.3. Textbook

The textbook task (The Designer) involves locating an interesting phrase on a stimulus page that captures the reader's attention at a single glance. All the tasks are contrived simply to induce slight stress, so that we can observe increases in the amplitude of the physiological response in reaction to the visual interface.

3.4. Physiological measures

The physiological measures adopted for the experiment include SCR, skin temperature (ST), and eye movement (saccades and fixations). These user attributes are used to measure changes in reaction to dynamic contents on the visual interface.

  1. SCR/ST: The SCR measures the electrical changes of the skin; it provides a functional signal of emotional responses by measuring the electrodermal activity (EDA) changes of the skin caused by sweat [16]. The skin temperature (ST) changes according to blood circulation at the surface of the skin through body tissue. In a state of heightened emotion, such as interest or stress, muscle fibers contract and cause a narrowing (stenosis) of the blood vessels [17, 18].

  2. Eye movement: This is the behavior of the eye during interaction; the eye-gaze pattern is a measure of behavior. The movement of a user's eyes is described by fixations (locations of the user's eye gaze) and saccades (rapid movements of the eye from one fixation to another), as indicated in Figure 1.

Figure 1.

Fixations and saccades of eye movement.
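The chapter does not specify how the tracker segments gaze samples into fixations and saccades. As one common illustration, a dispersion-based (I-DT) detector can be sketched in MATLAB as follows; the dispersion threshold maxDisp and minimum fixation length minLen are assumed parameters.

    % Hedged I-DT sketch: gx, gy are gaze coordinates, one row per sample.
    function fixpts = idt(gx, gy, maxDisp, minLen)
        fixpts = []; i = 1; n = numel(gx);
        while i <= n - minLen + 1
            j = i + minLen - 1;   % initial window of minLen samples
            while j <= n && (max(gx(i:j)) - min(gx(i:j))) + ...
                            (max(gy(i:j)) - min(gy(i:j))) <= maxDisp
                j = j + 1;        % grow the window while dispersion stays small
            end
            if j - i >= minLen    % samples i..j-1 form one fixation
                fixpts(end+1, :) = [mean(gx(i:j-1)), mean(gy(i:j-1))]; %#ok<AGROW>
                i = j;
            else
                i = i + 1;        % saccade sample; slide the window forward
            end
        end
    end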

3.5. Hypothesis generator

The concepts behind modeling the physiological processes involve tuning the model to obtain significant accuracy and a prediction focus close to the original data; the model adopts the concepts used for physical processes in dynamic systems [8], a least-squares technique applied to system controls. For the user attributes saved from the sensors, the entire system is represented by the expressions in Eqs. (1) and (2); the model fit to the data represents the prediction focus, 4 min from the detected fixations, and the corresponding stress levels.

du/dy = C·u(y) + F·r(y) + e    (1)

y_m = G·u(y) + H·r(y) + e    (2)

where y_m is the response variable (stress levels) that determines the coefficients of the physiological reactions; r(y) is the computed input variable, and C, F, G, and H are the estimated coefficients with noise e. The input generator is a discrete-time identified model fit representing the set of values the physiological reaction processes can take in response to dynamic visual stimuli. The primary data sources are measured in frequency (hertz and rad/s) and contain both categorical and numeric variables that contribute to making predictions on the multiple response outputs using multiple inputs (MIMO); in this case, y and y_m are the response outputs (Figure 2), representing the affect states and fixations. The model is tested on different estimated polynomial orders of the differential equation u(y). An identified state-space object is created with C as the estimated initial state for the model, that is, the possible set of values the process can take; F is the estimated coefficients as a product of the physiological parameters; G is the estimated output; and H is the transformation matrix with noise e. y_m is also used to represent other response variables, such as the eye movement (fixations on the interfaces). The threshold is set based on a unique baseline such that:

thresh = 0.5 · (mean(amplitude) − min(SCR))    (3)

Figure 2.

Control system feedback configuration.
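Returning to Eqs. (1) and (2), a minimal sketch of the estimation step with the System Identification Toolbox could look as follows; the variable names for the logged channels and the sample time Ts are assumptions.

    % Hedged sketch: fit a low-order model to the sensor channels.
    z   = iddata([stress fx], [scr st gx gy], Ts);  % outputs, inputs, sample time
    sys = ssest(z, 3);        % third-order state-space estimate
    compare(z, sys)           % prediction focus versus the original data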

The alternative hypothesis is chosen if the rule for the null hypothesis does not hold; that is, H0: stress is constant if thresh < mean(amplitude), and Ha: stress fluctuates if thresh > mean(amplitude) (a short sketch of this rule follows). The affect states to identify are stressed, relaxed, and a neutral mood on the interface. To detect the affect state, control is directed to a logical output in a loop that has a placeholder for the dimensions of the physiological response.
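In MATLAB, the decision rule around Eq. (3) can be restated as below; amplitude and scr are assumed to hold the detected SCR amplitudes and the raw signal.

    % Hedged restatement of Eq. (3) and the H0/Ha decision rule.
    thresh = 0.5 * (mean(amplitude) - min(scr));
    if thresh < mean(amplitude)
        disp('H0: stress is constant')
    else
        disp('Ha: stress fluctuates')
    end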

The module that identifies the optimal responses correlating to the user's mood is given in the following steps (Algorithm 1). The FINDPEAKS function detects phasic changes in the physiological signals and high-level tonic phases (Appendix 1). Red indicates stress; blue and purple indicate relaxed and neutral moods, respectively. The predicted fixations on the visual interface indicate possible moves or positions for the user to reach their goal.

Algorithm 1. Detecting the affect state on a visual interface.

    procedure FINDPEAKS
        % Phasic changes in the EDA/SCR signal.
        [peaks, locs] = findpeaks(Response.eda, 'MinPeakDistance', 15)
        m = length(locs)
        thresh = 0.5 * (mean(amplitude) - min(eda))
    top:
        if mean(peaks) >= thresh and mean(peaks) <= baseline then
            disp('stressed')
        else if mean(peaks) <= thresh and mean(peaks) >= baseline then
            disp('neutral')
        else
            disp('relaxed')
        end
    loop:
        if size(emotion2) == [0 1] then   % no match yet: fall back to index 1
            emotion2 = 1
        else
            emotion2 = emotion2
        end
        emotion1 = strmatch('stressed', PP12.Affectstate(locs(m)))
        emotion2 = strmatch('neutral',  PP12.Affectstate(locs(m)))
        goto loop
        close
        % Fixation coordinates at the detected affect states.
        X  = Response.MappedFixationPointX(locs(emotion1))
        Y  = Response.MappedFixationPointY(locs(emotion1))
        XX = Response.MappedFixationPointX(locs(emotion2))
        YY = Response.MappedFixationPointY(locs(emotion2))
        goto top
    end procedure
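Given the X/Y and XX/YY outputs of Algorithm 1, a hedged usage sketch (with a hypothetical screenshot file name) overlays the detected affect points on the stimulus:

    % Hypothetical overlay of detected affect points on the interface image.
    imshow('adera_episode1.png'); hold on
    plot(X,  Y,  'ro')    % stressed fixations (red)
    plot(XX, YY, 'mo')    % neutral fixations (purple/magenta)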

3.6. Operator querying

The mechanism involved is a direct synchronization of the two sensor ports, with a querying model for the system output. This helps to identify the optimal responses in the physiological measures that correlate to visual attention on the part of the user.
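One hedged way to realize that synchronization is to resample both sensor streams onto the eye tracker's timeline before querying; the structure and field names below are assumptions.

    % Hedged sketch: align the SCR/ST stream with the eye tracker's time base.
    t   = eyetracker.time;                         % common time base
    eda = interp1(sensor.time, sensor.eda,  t);    % resampled SCR channel
    st  = interp1(sensor.time, sensor.temp, t);    % resampled skin temperature
    synced = table(t, eda, st, eyetracker.fx, eyetracker.fy, ...
                   'VariableNames', {'t', 'eda', 'st', 'fx', 'fy'});

The next section discusses the analysis and findings from the control system.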


4. Analysis and implementation

Possible strategies to locate the missing person in the Adera interface are indicated by the predicted fixations. Rather than following the original eye movement (pink circles), the player can retrace the steps and identify the missing person by following the predictions. Four areas indicating a stress mood were detected. One of these is on a commercial widget at the upper right edge of the interface. The neutral point is detected inside the game interface. The predicted and original (natural) responses lying in the same Cartesian coordinates of the Adera game interface (Figure 3) show the possibility of high performance of the control model. Of the three affect states generated, the stress and neutral moods are indicated. One interesting aspect is the stress point at the area where a question mark is located on the interface, close to the position of a pointing black arrow. This icon (question mark) is there to provide suggestions on which direction or strategy to take in locating the missing person. The user appeared undecided whether to move on and use the lifeline or simply to ignore this icon, hence the appearance of an emotion indicator (neutral mood) in that area. The pattern or arrangement of fixations in that direction indicates it might be the right strategy to take; the predicted points are indicated close to the question mark content. The user also experienced a stress mood while looking at the advert section on the right side of the interface. The participant's physiological reaction toward that phase shows more phasic changes; hence, there is a higher emotion indicator (stress and neutral moods) toward the interface, with an average baseline response of 2.72 μS. At the point where a stress and a relaxed indicator intersect, a neutral indicator is produced (purple indicator).

Figure 3.

Detected affect state and correlating physiological reaction to Adera episode 1.

On the Yahoo interface (Figure 4), there are various dynamic and static contents that can distract the user and induce both positive and negative emotions. The user feels a neutral mood toward the dynamic picture content with the headline "Silicon Valley reacts to Trump inauguration," which is indicated by the neutral points close to and on the picture content. The user's physiological reaction to the interface suggests an increase in amplitude between 20 and 40 s into the interaction. This interval correlates to the convoluted fixation points close to the dynamic picture content. The pattern of the predicted response lies in the same convoluted pattern as the original response.

Figure 4.

Detected affect state and correlating physiological reaction to the Yahoo page.

The reaction of the user to the textbook interface stimulus suggests confusion and indecision about which strategy to take when locating an interesting phrase, with an average baseline of 3.1 μS, which is quite high for this interface. The natural law of attention is that the gaze is directed to the center of the book, as seen in the convoluted arrangement of fixations of both the predicted and natural eye movements at one point in the center of the book interface stimulus. The pattern of the predicted eye movement suggests the possible strategy the reader could take in that area to locate an interesting phrase. Three affect indicators were located on this interface, two of which are neutral and the other a stress mood. The emotion of the user is indicated by the three emotion indicators on the spot (Figure 5).

Figure 5.

Detected affect state on the textbook interface. (a) Textbook with original and predicted fixations, (b) detected emotion on the textbook, and (c) detected affect state and correlating physiological reaction to the textbook.

Advertisement

5. Results

The state variables used to estimate the coefficients of the control system were defined by the input parameters that the signals provided. The sensors were used to generate the primary data serving as the user attributes: the SCR, ST, and eye movement represented as fixations. Multiple polynomial orders were chosen to run the model; the best of these were polynomial orders one to three, which give a precise representation of the physiological reaction to the dynamic contents on the interface. This mirrors the concept applied to physical systems, which is appropriate when the goal is to predict and indicate emotion on a visual interface; we have to go beyond the normal approach and apply a multidimensional procedure to achieve the targeted objective. For the Adera Episode 1 interface, polynomial orders one and three show the same phasic change. The phasic response of MappedFX (input 3) with MappedFY (the response variable) has the same but opposite reaction in the system control; this is a positive and a negative effect on the connection between the user and the Adera interface. The SCR (input 1) has a positive effect on the connection, which implies that it is a good indicator of the emotional response for the Adera interface (Figure 6).

Figure 6.

Bode plot of the system on the Adera interface.
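The order sweep behind these comparisons can be sketched as below for a single response channel; the iddata object z from the earlier estimation sketch and the four-input ARX structure are assumptions consistent with that setup.

    % Hedged sketch: fit orders 1-3 and compare their fits and Bode responses.
    z1 = z(:, 1, :);                 % first output channel of the iddata set
    for n = 1:3
        M{n} = arx(z1, [n, n*ones(1, 4), ones(1, 4)]);  % 4 inputs, order n
    end
    compare(z1, M{1}, M{2}, M{3})    % fit to data per polynomial order
    bode(M{1}, M{2}, M{3})           % phasic change across orders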

The magnitude of the response on all inputs has a positive impact on the system's interface between the webpage (Figure 7a) and the user. In this case, polynomial orders one and two show the same phasic change compared to order three. The input ST has the same magnitude and phasic change for all polynomial orders. The variations in phasic change are indicated in input 3 and input 4 relative to the response output. On the other hand, all inputs have the same magnitude and phasic change for the polynomial orders used in the text-user interaction; running the system at orders two and three on average signifies the possibility of good performance of the system model. The response output shows the most significant response magnitude at order three (Figure 8b).

Figure 7.

Bode plot of the system on the webpage interface.

Figure 8.

Comparing original and predicted response.


6. Conclusion

This chapter has offered an evaluative perspective on an important aspect of user interfaces for games, the web, and text. By introducing a novel approach to user interaction and user physiological response, we integrated eye movement and physiological response to determine correlates that serve as tertiary indicators of a user's stress levels, based on attributes obtained from their physiological response. The proposed model has proved reliable, given the results and findings from the response output of the system, and it offers solutions to the persistent problem of associating user interaction with physiological response, which may be sustainable in the long term with further evaluation and validation. The method used here provides an automated way of assessing human stress levels when dealing with specific visual contents. This is an important achievement in that it is able to predict which contents on a visual interface cause stress-induced emotion in users during interaction. By detecting user emotion on the visual interface, the approach could also be applied to areas such as Internet security, triggering alarms for unauthorized access or abnormal activities online; this, together with further performance testing, forms the basis for our future work.

References

  1. Andreassi JL. Psychophysiology: Human Behavior and Physiological Response. Psychology Press; 2000
  2. Cooley R, Mobasher B, Srivastava J. Data preparation for mining world wide web browsing patterns. Knowledge and Information Systems. 1999;1(1):5-32
  3. De Santos A, Sanchez-Avila C, Guerra-Casanova J, Pozo GB-D. Real-time stress detection by means of physiological signals. In: Recent Application in Biometrics. Rijeka: InTech; 2011
  4. Demiral SB, Schlesewsky M, Bornkessel-Schlesewsky I. On the universality of language comprehension strategies: Evidence from Turkish. Cognition. 2008;106(1):484-500
  5. Isiaka F, Mwitondi K, Ibrahim A. Window based model for simulation of integrative human physiological response to webpages. In: Computing and Communication (IEMCON), 2015 International Conference and Workshop on. IEEE; 2015. pp. 1-8
  6. Isiaka F, Mwitondi KS, Ibrahim AM. Detection of natural structures and classification of HCI-HPR data using robust forward search algorithm. International Journal of Intelligent Computing and Cybernetics. 2016;9(1):23-41
  7. Kamon E, Pandolf K, Cafarelli E. The relationship between perceptual information and physiological responses to exercise in the heat. Journal of Human Ergology. 1974;3(1):45-54
  8. Karnopp DC, Margolis DL, Rosenberg RC. System Dynamics: Modeling, Simulation, and Control of Mechatronic Systems. John Wiley and Sons; 2012
  9. Kretzschmar F, Pleimling D, Hosemann J, Fussel S, Bornkessel-Schlesewsky I, Schlesewsky M. Subjective impressions do not mirror online reading effort: Concurrent EEG-eyetracking evidence from the reading of books and digital media. PLoS One. 2013;8(2):e56178
  10. Liversedge SP, Blythe HI. Lexical and sublexical influences on eye movements during reading. Language and Linguistics Compass. 2007;1(1-2):17-31
  11. Mindfield. eSense Temperature, Mindfield Biofeedback Systems. eSense Skin Temperature Handbook. 2014;1(1):1-12
  12. Mizokawa T. Control system for controlling object using pseudo-emotions and pseudo-personality generated in the object. US Patent 6,230,111; May 8, 2001
  13. Niedenthal PM, Ric F. Psychology of Emotion. Psychology Press; 2017
  14. Paulson LD. Building rich web applications with Ajax. Computer. 2005;38(10):14-17
  15. Petrushin VA. Detecting emotions using voice signal analysis. US Patent 7,222,075; May 22, 2007
  16. Ramakrishnan S. Recognition of emotion from speech: A review. In: Speech Enhancement, Modeling and Recognition-Algorithms and Applications. Rijeka: InTech; 2012
  17. Schneider-Hufschmidt M, Malinowski U, Kuhme T. Adaptive User Interfaces: Principles and Practice. Elsevier Science Inc.; 1993
  18. Widyantoro DH, Ioerger TR, Yen J. An adaptive algorithm for learning changes in user interests. In: Proceedings of the Eighth International Conference on Information and Knowledge Management. ACM; 1999. pp. 405-412
