
"Human-Computer Interaction", book edited by Inaki Maurtua, ISBN 978-953-307-022-3, published December 1, 2009 under a CC BY-NC-SA 3.0 license. © The Author(s).

Chapter 16

Modelling and Evaluation of Emotional Interfaces

By Sylvia Tzvetanova Yung, Ming-Xi Tang and Lorraine Justice
DOI: 10.5772/7728


Overview

Figure 1. Screenshot flow of the plain interface. The user browses through the text content pages.
Figure 2. Screenshot flow of the adapted interface. The user browses through the text content pages (top); if the user shows dissatisfaction, the system switches to the image mode (bottom).
Figure 3. A graph of the gender differences in relation to emotional responses to the two interfaces.
Figure 4. A graph of the motivation differences in relation to emotional responses to the two interfaces.
Figure 5. A graph of the personality factor differences in relation to emotional responses to the two interfaces.
Figure 6. Overall graph showing the differences in the responses between the two interfaces. The first interface is marked with square pointers and the second with round pointers.
Figure 7. Interface responses analysis for the first interface.
Figure 8. Interface responses analysis for the second interface.
Figure 9. First impression analysis for the first interface.
Figure 10. First impression analysis for the second interface.
Figure 11. Overall happy vs. unhappy analysis for the first interface.
Figure 12. Overall happy vs. unhappy analysis for the second interface.
Figure 13. Usability analysis for the first interface.
Figure 14. Usability analysis for the second interface.
Figure 15. Design analysis for the first interface.
Figure 16. Design factor analysis for the second interface.
Figure 17. Organization analysis for the first interface.
Figure 18. Organization analysis for the second interface.

Modelling and Evaluation of Emotional Interfaces

Sylvia Tzvetanova Yung, Ming-Xi Tang and Lorraine Justice

1. Introduction

Studies in psychology have acknowledged the importance of emotions for human cognition, motivation, learning, decision-making, and intelligence (Darwin, 1965; Goleman, 1995; Davidson et al., 2003). In particular, positive emotions can increase people's motivation and attachment (Isen, 1992) and facilitate constructive learning processes (Kort et al., 2000). Emotions can therefore be considered in the field of HCI as factors for building affective, efficient, and satisfying interfaces.

In HCI, however, "thinking" has often been separated from "feeling", ignoring the role emotions play in human cognition. Design research has largely focused on usability issues, with the role of emotions in interaction ignored or simplified. Because of the important properties of emotion described above, and because interfaces are increasingly involved in people's everyday lives, interface designers have attempted to model more user-friendly applications using emotional interaction. In this chapter we argue that users' emotional responses to an interface's performance, ways of interaction, and content can be modelled, evaluated, and supported. We examine the technological foundation of affective computing for the development of emotional interface design applications, drawing on Artificial Intelligence and cognitive science research. The goal of this research is to develop a framework for an adaptive interface that supports users with elements that elicit positive emotion. Our research is set in the context of a computational and symbolic framework that considers the general issues related to theories of emotional appraisal in graphic interfaces, as well as specific issues concerning computational representation and the implementation of a design system.

In this chapter we present a methodology and a demonstration for the evaluation of an emotional interface, with experimental results from the implementation of several scenarios involving web-based interface design. The evaluation results showed that our proposed model is able to recognize users' emotions and respond with changes in its interface design to encourage a positive emotional state or to improve a negative one. Finally, we present the conclusions of this research and outline issues for further work on improving the current model, with a particular focus on how to model and represent possible emotional transitions inside the system that provides the knowledge and factors supporting emotional user interaction in web-based design systems. The research is addressed to HCI specialists and aims to benefit work on applying cognitive science in HCI, in particular cognitive appraisals of emotions.

2. Scope of research

Research in emotion design is concerned with giving machines emotional intelligence: recognizing and modelling human emotions, communicating emotion appropriately, and responding to it affectively. According to Picard (1997), affective systems should possess three abilities: to recognize, express, and manage emotions. Recognition is the ability to detect the user's emotion, or to form a belief about the user's emotional state. Expressing emotions concerns affective communication, where the machine is able to "express" emotions about its state, for instance when it runs out of power, or to demonstrate understanding of the user's affect; machines can appear to have emotions by "expressing" them. Managing emotions concerns the ability to respond to the user's recognized emotional state in a way that supports the user emotionally. There have been several previous attempts to model emotions in HCI. Laboratories in Europe, the USA, and Asia have focused their research on improving emotional interactions between users and computers. Existing research on design and emotion in HCI can be divided into systems that express emotion and systems that recognize and react according to an understanding of another's affect. There is now a body of research literature on emotion design that provides a technical basis for emotion-oriented computers. Academia and industry have developed emotion recognition tools and techniques using different sensing channels: physical sensing (e.g. heart rate, galvanic skin response), self-reports, facial expressions, and knowledge-based methods that derive a likely affective state from factors in the current task context (e.g. type, complexity, time of day, length of task), personality, and individual history (past failures and successes, affective state associated with the current task) (Picard & Healey 1997, Burleson & Picard 2004, Hudlicka 2003).
Among this existing research, however, very few attempts have really addressed existing interaction issues or provided solutions for improving people's performance with interfaces in application areas such as social systems, collaboration, Intelligent Tutoring Systems (ITS), and robots. The applications of emotions in interfaces are potentially endless; here we provide a few examples that demonstrate the use of emotions to improve HCI.

MIT's huggable robot was developed to give emotional feedback to affective touch (Stiehl & Breazeal, 2005). The robot was designed for affective therapy, with an emphasis on relational, affective touch interactions. When held, the robotic animal, called Huggable, gives verbal and nonverbal feedback. In terms of technology, the Huggable has a built-in "sensitive skin" consisting of temperature, force, and electric-field sensors underneath a soft layer of silicone skin to promote pleasant tactile interaction. It is equipped with small video cameras in the eyes and microphones in the ears that provide visual and auditory input, and an inertial measurement unit senses body movement as it is held in someone's arms. Huggable provides audio output through a speaker in its mouth and has seven actuators driving the neck and the coupled eyebrow, ear, and shoulder mechanisms. Even with this relatively simple technology, the robot can provide valuable help, for instance in treating children who have been victims of violence, by showing them desirable behaviour.

Recently many studies have addressed the efficiency of ITS and turned to emotions as a tool for enhancing users' performance. As mentioned earlier, emotions aid memorization, as people attach emotions to the material learned. In particular, positive emotions are involved in the constructive learning process (Kort et al., 2001) and increase the learner's motivation, keeping the learner more focused on the material (Isen, 1992). Some ITS models had the goal of softening negative emotions, increasing positive ones, and preserving the learner's positive experiences in order to facilitate constructive learning. One of the first attempts to improve learning performance was proposed by Burleson & Picard (2004), who built an affective companion that used verbal encouragement to let users have optimal experiences and sustain their motivation during study. Benchetrit & Frasson (2004) drew on prior psychological knowledge in the pedagogical field and proposed an agent based on the teaching method of the Bulgarian doctor and psychotherapist Lozanov (1978); however, the doctor's methods were not confirmed and were not widely practiced. Motivation has also been important for ITS. Vicente & Pain (2002) developed an ITS with features that supported the user's motivation and reported that users performed better with such a system. Their model, based on an affective theory supporting the user's motivation, demonstrated better results than existing systems that motivate the student with conventional tools such as games and multimedia.

Other applications try to adapt their interfaces to model the expression of the user's affective experience (Bianchi-Berthouze & Lisetti, 2002). Two example systems for kansei communication are MOUE (Model of User's Emotions), which adaptively builds semantic definitions of emotion, and MIKE (Multimedia Interactive Environment for Kansei Communication), which co-evolves with the user a language for communicating subjective impressions. In these systems, the user teaches emotional patterns to the system through their interaction style. Both attempt to make it possible for computers to track and understand human emotion by building databases of emotion definitions.

Since affect is a critical component of effective social interaction, some work on emotions in HCI has focused on developing embodied agents capable of socially effective interaction. In a computational model this can be achieved by agents reasoning about goals, situations, and preferences (Carofiglio et al. 2006, Conati 2002). Carofiglio et al. (2006) examined the problem of overlapping emotions, which reflects the complexity of human emotions in real social life (for instance, feeling happy for somebody else), and proposed a computational model that artificially triggers emotions over time and represents them to an educational agent using Dynamic Belief Networks (DBNs). Events occurring during a time interval (T, T+1) are observed to form a belief about the new state and to reason about the emotions these events might trigger. The intensity of an emotion is calculated from the uncertainty in the agent's beliefs about the emotional state and the weight assigned to that emotion: the variation in intensity is the probability that a certain factor will occur, multiplied by the weight of that factor. de Rosis et al. (2003) also developed an animated agent with a 3-D human-like face that addresses elements of social interaction. The agent is built upon a cognitive theory of emotions, generates its own affective state as a function of the situation and its current goals, and expresses this state via speech tone, word choice, facial expressions, head movement, and gaze direction.
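The intensity rule described above (the variation in intensity is the probability of a triggering factor times its weight) can be sketched as follows. The factor names and numeric values are hypothetical illustrations, not values from Carofiglio et al.'s model.

```python
# Sketch of the intensity-update rule: the variation in an emotion's
# intensity is the probability that each triggering factor occurs,
# multiplied by the weight assigned to that factor, summed over factors.
# Factor names and numbers below are hypothetical.

def intensity_variation(factors):
    """Sum p(factor) * weight(factor) over all observed factors."""
    return sum(p * w for p, w in factors.values())

# Hypothetical factors observed in the interval (T, T+1):
# each maps to (probability of occurring, weight toward the emotion).
factors = {
    "goal_achieved":   (0.7, 0.6),   # likely and strongly positive
    "unexpected_help": (0.2, 0.4),   # unlikely, mildly positive
}

delta = intensity_variation(factors)  # 0.7*0.6 + 0.2*0.4 = 0.5
```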

3. Design Methodology

While technology has made attempts to implement emotion recognition and processing in interfaces, there is still a need to define and model emotional appraisal in order to support the user emotionally. The examples above demonstrate that emotions can improve existing machine interfaces not only in terms of pleasure but also in terms of user performance. The technological foundation of affective computing, such as the various tools and databases, makes it possible to develop applications in this field. Design research has largely focused on usability issues in HCI, and emotions can contribute to addressing these issues. The approaches to implementing emotions in interfaces are largely computational, and more detailed research is needed on the emotions aroused in the interaction between machine and user. The hypothesis of this research is that users may have emotions toward an interface in terms of its ways of interaction, colours, images, or content. Furthermore, as a natural shift in focus from usability to emotions in interfaces, an HCI approach can be taken to create models of emotional interfaces that operate to improve user performance.

Here we propose our formulation of an emotional interface design in the field of HCI. For this purpose, the objectives of the research are to develop a methodology and a demonstration of an emotional interface. Three main questions relate to the goals of the research on emotional interfaces:

  • Which cognitive mechanisms studied in previous research on emotion are relevant to emotional appraisal in interfaces?

  • What are the factors in interfaces that influence user's emotions?

  • How can emotion be modelled in online graphic interfaces?

To address these questions, the research methodology undertaken for the formulation of the emotional interface consisted of a literature review of the theoretical basis, qualitative and quantitative research, and prototyping. In our work we propose a model for an emotional interface in which the interface adapts its elements based on the user's emotion. A symbolic model was adopted from cognitive psychology: in particular, we selected the Ortony, Clore and Collins (1988) (OCC) cognitive model of emotions, which was built with computational application in mind. In short, the OCC theory suggests that the emotions people experience depend on what they focus on in a situation. Emotions are categorized as positive or negative reactions to events, agents, or objects. When focusing on an event, one is interested in the consequences of past events and in what will follow. When focusing on agents, one attends to actions, i.e., the behaviour of animate beings, inanimate objects, or abstractions such as institutions, and other contextual situations. When focusing on objects, one is interested in the properties of the object, such as aesthetics and surfaces. The model is represented as an if-then mechanism that models the emotional interactions: based on the recognized emotion of the user, the system responds with changes in its interface in order to move the user toward a positive emotional state. The OCC model was used as the theoretical basis for the emotional interface, and the mechanism was further explored with qualitative research to customize it for a computer interface.
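The if-then mechanism can be sketched minimally as below, assuming a simplified OCC-style mapping: the focus of attention (event, agent, or object) combined with a positive or negative valence selects an emotion category. The specific labels and rules are our own illustrative choices, not the chapter's exact rule base.

```python
# Minimal sketch of an OCC-style if-then appraisal. The focus of attention
# plus a valence selects an emotion category. Labels are illustrative only.

def appraise(focus, valence):
    """Map (focus, valence) to an emotion label via simple if-then rules."""
    rules = {
        ("event", "positive"):  "satisfaction",
        ("event", "negative"):  "disappointment",
        ("agent", "positive"):  "admiration",
        ("agent", "negative"):  "reproach",
        ("object", "positive"): "liking",
        ("object", "negative"): "disliking",
    }
    return rules[(focus, valence)]

appraise("object", "positive")  # -> "liking"
```

In the full system such a table would be driven by the recognized user emotion and would trigger interface changes rather than just return a label.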

Qualitative research was conducted to further explore the factors in interfaces that influence users' emotions. A group of design research experts was asked to comment on existing interfaces and the elements influencing users' emotions. The outcomes were summarised in a branching hierarchy, which served as the foundation of the emotional interface's symbolic model. Finally, to test the hypothesis and answer the research questions, a prototype of the interface was built to explore the validity of the proposed models. The interface was evaluated with users in an online survey, which is summarised below.

4. Evaluation

The emotional interface was evaluated with users using a usability approach, in order to explore the functionality of the interface as well as the hypothesis that an emotional interface improves users' performance. Comparison with the values inferred by the modelled emotional interface allowed us to evaluate users' preferences. Personal characteristics and their relation to users' emotions, and performance with the adapted vs. the plain interface, were evaluated.

In an evaluation of the emotions involved in an interaction process, subjects have to describe objects with words of an emotional nature. For the evaluation in this work, emotion sets were identified and used in the questionnaire so that users could compare an existing interface with an emotional one. Users may in principle experience any emotion, so the question is how to select from the long list of existing emotion words. Psychologists have suggested that a small basic set of emotions exists which, mixed together, can compose the entire rich spectrum of emotions. The basic set is much debated and not precisely defined, because different schools of thought hold different sets of emotions to be basic. Theorists such as Oatley & Johnson-Laird (1987) used the concept of basic emotions in their psychological theories, citing four with evolutionary origins: happiness, sadness, fear, and anger. There are many more opinions on the basic set of emotions, the most popular being those of the psycho-evolutionary theory of Plutchik (acceptance, anger, anticipation, disgust, joy, fear, sadness, and surprise); Ekman, Friesen, and Ellsworth (anger, disgust, fear, joy, sadness, and surprise); and Aristotle (anger, mindless love, enmity, fear, confidence, shame, shamelessness, benevolence, pity, indignation, envy, emulation, and contempt). Here we adopt a list intended for interfaces, in particular for the evaluation of ITS. Kort et al. (2001) constructed a scale of emotions related to learning, ranging from negative to positive states along the axes anxiety-confidence, boredom-fascination, frustration-euphoria, dispirited-encouraged, and terror-enchantment, with values assigned to the emotion words ranging from negative to positive (-1 to +1, centred at 0).

We adapted this list for the needs of the research and rescaled it from 0 to 4 for the evaluation. Based on this basic set, an assumption was made about the possible emotions involved in interacting with web-based systems (see Table 1). The emotions are rated between 0 and 4 and centred at 2; negative emotions are taken in the range between 0 and 2 and positive emotions between 2 and 4. The emotions are grouped according to the system specifications observed in the survey.

Type         | 0            | 1           | 2            | 3                | 4
Content      | Enthusiastic | Excellent   | Satisfactory | Not Satisfactory | Disappointment
Educational  | Fascination  | Captivation | Interest     | No Interest      | Boredom
Appearance   | Joy          | Enjoyment   | Indifference | Repulse          | Distress
Organization | Confident    | Hope        | Comfort      | Uncomfortable    | Fear

Table 1.

Rated emotion sets for evaluation purposes.
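The rating scheme of Table 1 can be sketched as a simple lookup: each interface aspect is rated on an integer 0-4 scale centred at 2, and each (aspect, rating) pair corresponds to one emotion word. The helper name below is our own; the words follow the table.

```python
# Sketch of the Table 1 rating scheme: a 0-4 scale centred at 2, with one
# emotion word per (aspect, rating) pair. Helper name is illustrative.

SCALE = {
    "Content":      ["Enthusiastic", "Excellent", "Satisfactory",
                     "Not Satisfactory", "Disappointment"],
    "Educational":  ["Fascination", "Captivation", "Interest",
                     "No Interest", "Boredom"],
    "Appearance":   ["Joy", "Enjoyment", "Indifference",
                     "Repulse", "Distress"],
    "Organization": ["Confident", "Hope", "Comfort",
                     "Uncomfortable", "Fear"],
}

def emotion_word(aspect, rating):
    """Return the emotion word for an integer rating 0..4 (2 is the centre)."""
    return SCALE[aspect][rating]

emotion_word("Content", 2)  # -> "Satisfactory"
```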

An online questionnaire was distributed to targeted users of Asia-Pacific origin. No payment was offered; instead, a lucky draw for a small award, won by one of the participants at the end of the survey, was announced. The targeted users were young people aged 20 to 30, mainly from Hong Kong. Asian users were selected because different cultures may respond to interfaces with different emotions. Attitudes can influence cognition, in particular emotional appraisal, which depends on one's perception of the outside world in relation to one's goals and preferences. The cultural dependence of emotional appraisal has not been widely discussed; however, there is evidence of differences between cultures in affect-eliciting conditions (Markus & Kitayama 1991). In this research, cultural differences are not sought; rather, to avoid confounds in the results, the research is applied to an Asian context.

In the survey, two online storytelling interfaces were compared. The stories originated from ethnic minorities in China and are of similar length. The first interface was a plain, user-friendly interface with no images (see Figure 1), a white background, and a black typeface. It was easy to navigate and was designed according to the usability rules suggested by Nielsen (1999). The second interface evaluated in the online survey was an emotionally adapted interface (see Figure 2). It differed from the plain one by having images and animations, which were activated automatically if the user showed negative emotion. The users were asked to read through both stories to compare the two interfaces.

Two different stories were chosen so that each user would evaluate the two different interfaces, and so that the personality factor would not impact the comparison. The hypothesis was that users would engage more actively with the adapted interface and respond to it with positive emotion. The participants were therefore asked to report their emotions toward the two interfaces, in order to find out whether the adaptive mechanism could trigger positive emotion.
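The adaptive behaviour described above (switching from the plain text mode to the image/animation mode when negative emotion is detected) can be sketched as a small state machine. The class and mode names are our own illustration, not the implementation used in the survey.

```python
# Sketch of the adaptive mechanism: the interface starts in plain text mode
# and switches to the image/animation mode when the user's emotion is
# recognized as negative. Names are illustrative.

class AdaptiveInterface:
    """Toggle between 'text' and 'images' modes on detected dissatisfaction."""

    def __init__(self):
        self.mode = "text"  # the plain interface is the default

    def observe(self, user_is_dissatisfied):
        # Switch to the richer image/animation mode on negative emotion;
        # otherwise keep the current mode.
        if user_is_dissatisfied:
            self.mode = "images"
        return self.mode

ui = AdaptiveInterface()
ui.observe(False)  # stays "text"
ui.observe(True)   # switches to "images"
```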

media/image1.png

Figure 1.

Screenshot flow of the plain interface. The user browses through the text content pages.

media/image2.png

Figure 2.

Screenshot flow of the adapted interface. The user browses through the text content pages (top); if the user shows dissatisfaction, the system switches to the image mode (bottom).

4.1. Online questionnaire setting

The survey consisted of 23 questions: 5 general questions about the participants' personal details and 9 questions about their emotional response to each story. The first group of questions collected information about the personality and preferences of the participants, in order to identify the personal reasons behind their emotions. According to the OCC model (Ortony, Clore and Collins 1988), emotional appraisal depends on people's personal characteristics and concerns. The variables chosen for observation in this introductory group of questions were:

  • gender – to explore whether the subjects' gender has any relation to their emotional responses,

  • motivation – to verify the importance of motivation in the emotional appraisal of interfaces,

  • personality – to identify the relation of the subjects' personality in general to the emotional appraisal of interfaces.

The second and third groups of questions concerned the first and second interfaces respectively. The same 9 questions were asked about each of the two stories, to identify the users' emotional responses and compare the related factors within the interfaces. The variables observed were appearance and organization. Additionally, the users were asked to report their general feelings toward the two interfaces using the graded scale of emotions (see Table 1). The rated emotion set included:

  • Enthusiastic,

  • Disappointing,

  • Fascinating,

  • Boring,

  • Happy,

  • Distressing,

  • Confident,

  • Scary.

The question set formed a comprehensive scale of graded emotions, and users could easily understand that each emotion factor varied from positive to negative. The questionnaire also asked the subjects to comment on their emotional impressions in their own words.

4.2. Time recorder

Additionally, the time spent on each page of the two stories was recorded. A timer on each page returned the user's time spent on the page, the time of day, and the total time needed to read through each story. As described in an earlier paper on the modelling of an emotional interface (Tzvetanova, 2007), some of the emotion mechanisms were based on Beck's (2005) findings on time response and its relation to emotion. The timer in the survey was set up to identify whether the timing model implemented in the tested emotional interface showed a relationship to the user's emotion, and whether it had a positive influence on the user's emotions. The time values were returned for each subject for comparison with their emotional responses in the questionnaire. The time recorder used an ActionScript communicating with a PHP script, which returned the time values and sent them to an e-mail address for data recording.
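The per-page timing logic can be sketched as below. The original implementation used ActionScript posting to a PHP script that e-mailed the values; here the log is simply kept in memory, and the clock is injectable so the class can be tested deterministically. All names are our own.

```python
# Sketch of the per-page time recorder: accumulate the time spent on each
# page and the total reading time per story. The network/e-mail reporting of
# the original ActionScript/PHP setup is not reproduced.
import time

class PageTimer:
    """Accumulates time spent on each page and the total reading time."""

    def __init__(self, clock=time.monotonic):
        self.clock = clock
        self.current = None          # (page, entry timestamp)
        self.page_times = {}         # page -> accumulated seconds

    def enter(self, page):
        self.current = (page, self.clock())

    def leave(self):
        page, t0 = self.current
        self.page_times[page] = self.page_times.get(page, 0.0) + (self.clock() - t0)
        self.current = None

    def total(self):
        return sum(self.page_times.values())
```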

5. Results

A total of 63 people took part in the survey: 32 males, 28 females, and 3 people whose gender was not returned. Most were from Hong Kong. In general, the participants reported that they were interested in reading the two stories; a small number were not interested in the stories themselves but were intrigued by the questionnaires.

In the survey, several variables related to user factors were observed to see to what extent they influence emotional appraisal. These included the influence of gender on the participants' emotional preferences and the influence of personality on their emotional responses. The other variable observed was motivation, i.e., whether the participant found the online stories useful before reading them, and what their emotional responses were afterwards. The results were evaluated with an analysis of variance (ANOVA), the statistical approach most researchers in HCI choose for usability evaluations with multiple mixed variables (Park 2006).
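The one-way ANOVA statistic behind this evaluation can be sketched in pure Python: the F statistic compares between-group variance to within-group variance. The chapter does not name the software used; this is only an illustration of the statistic itself, with hypothetical ratings.

```python
# Pure-Python sketch of one-way ANOVA: F = (between-group mean square) /
# (within-group mean square). Illustrative only; any statistics package
# provides an equivalent routine.

def f_oneway(*groups):
    """Return the one-way ANOVA F statistic for two or more groups."""
    all_vals = [x for g in groups for x in g]
    grand_mean = sum(all_vals) / len(all_vals)
    k, n = len(groups), len(all_vals)
    # Between-group sum of squares: distance of group means from the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: spread of observations around group means.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical ratings from two groups on the 0-4 emotion scale:
f_oneway([1, 2, 3], [2, 3, 4])  # = 1.5
```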

5.1. User related factors

It was found that, in general, gender was not related to any significant difference in affective responses. However, the females responded with slightly more positive emotion in the design, overall impression, and interface categories, while the males responded more positively in organization, presentation, and usability. The difference in responses between the two genders is shown in Figure 3. The men were more likely to grade the two stories around neutral emotion, whereas the women were more likely to show higher emotional arousal. In discussions of these results with Asian respondents, one comment was: "Men will let women choose regarding aesthetics because men do not have an opinion". As far as the experiment indicates, in an Asian context women express higher arousal and have more emotional concerns than men.

Regarding motivation, 53.7% of the respondents thought the online stories were useful, 25% thought they were not, and 12.3% had no opinion. Those who reported more motivation to read the stories also responded with slightly more positive emotions in most of the observed categories of the interfaces (see Figure 4).

The personality variable showed no significant relation to the emotional responses to the two interfaces. For some of the observed elements, the group that reported being unhappy gave more positive responses on the organization, presentation, and usability factors than the happy group. The responses are shown in Figure 5.

media/image3.png

Figure 3.

A graph of the gender differences in relation to emotional responses to the two interfaces.

media/image4.png

Figure 4.

A graph of the motivation differences in relation to emotional responses to the two interfaces.

media/image5.png

Figure 5.

A graph of the personality factor differences in relation to emotional responses to the two interfaces.

5.2. Design factors

The survey showed both stories were easy to use. Reportedly there were no usability issues or unclear functions. The participants were driven by curiosity to fill in the questionnaires. 'First of all, I had not intended to read the story and I actually did not read it. But I would like to know what will happen to each paper. They seem more interesting than the story!' – one of them responded. The participants were asked to read through the two stories, which are very similar, coming from ethnic minorities in China. However, on the question about the content nearly one quarter of the people responded that the second interface was fascinating and captivating. The adapted interface evoked more positive emotion than the non-adapted one. Most of the users reported that the adapted interface was more interesting than the other, even though the emotional loading of the two stories was similar. Some of the people gave their personal impressions in their own words. Some people commented that the animation of the user-friendly plain interface was boring. Four people reported that they liked it because 'The animation of the word can make the description of the story more active'. Three people referred the animation as boring. Another function of the animation was that it drove the attention on a particular part of the story. Six people in their comments mentioned the relationship between the animated parts of the first interface and attention, and at the same time they did not find the interface’s elements emotionally positive (in their words, it doesn't affect so much, the story was boring when reading in a text form...nothing can help). The typeface of the plain interface got slightly more positive feedback than the animated elements. It was described by the subjects as 'elegant', 'elegant modern and fashion', 'light, go upward'. It was also described as 'it seems to let me feel peace', 'It is easy to read, but it is boring', 'It is black and white, comfortable to read'. 
Overall, participants reported a positive emotional response to the ease of use, even though the interface was not designed with emotional appearance in mind.

The images from the second interface were commented on as well. Thirty-four people commented on the love-story image of two famous actors in a love scene. The most common association, made by five people, was that this was a sad image of love. Others replied positively, with desire, about the image content: 'the image could be more animated, show how they hug and kiss, would be even more emotional', 'I get emotion for these actors, celebrities make me interested, and they are beautiful to look at'. The other images in the adapted interface received comments such as 'peaceful' and 'good, it makes me feel peace, refresh'.

On the whole, the second, adapted interface received a more positive evaluation of its emotional appearance than the first. Participants were more likely to comment, in one way or another, that the adapted interface was emotionally arousing, while the plain user-friendly one was reported as good in terms of its usability factors (see Figure 6). In addition, the second interface overall received more positive emotional responses than the user-friendly plain interface, averaging above the value of 2 (satisfactory, interest, indifference, comfort). Its usability was also graded with a more positive emotion than that of the first interface. Among the design elements of the first interface, however, usability drew the highest emotional response (between 1.5, uncomfortable, and 2, comfortable).

media/image6.png

Figure 6.

Overall graph showing the differences in the responses between two interfaces. The graph of the first interface is marked with square pointers and the graph of the second is marked with round pointers.

The results show a big difference in emotional response to the presentation of the two evaluated interfaces (see Figures 7 and 8). The presentation of the second interface was rated more positively, as satisfactory to excellent, compared to the first, which was reported as unsatisfactory to satisfactory. The participants' first impression of the adapted interface was more positive than that of the plain user-friendly one: 39.6% of the participants rated the plain interface as not interesting, whereas 37.3% rated the adapted interface as interesting (see Figures 9 and 10). Overall, the participants had almost the same emotional response to the two interfaces: 50.9% of the participants rated the first interface as satisfactorily presented and 47.1% rated the second the same way (see Figures 11 and 12). However, comparing the means of the results, the second story had a positive emotion value of 0.8, compared to a negative emotion value of -0.3 for the first. It was expected that the users would rate the first interface as more user friendly, because it is straightforward and based on usability rules, in particular easy navigation and readable black text on a white background, as previously suggested (Dix 1998; Nielsen 1999). However, the participants rated the usability of the first interface as uncomfortable to comfortable, whereas the second interface was rated closer to comfortable (see Figures 13 and 14).

The participants had a more positive emotional response to the design of the second interface than to that of the first (see Figures 15 and 16). Overall, the adapted interface was rated as interesting to captivating on the emotional scale, whereas the plain user-friendly interface was rated as not interesting. Some participants pointed out that the content itself arouses emotion and is more important than the appearance, commenting that 'my emotion comes from the story, the meaning of the words, not the graphic or animation, because the association between the visual and the story is not strong enough'. On average, people reported that they preferred the second interface because of the images and the dynamics: 'With more images, the second story is more attractive to me'. However, some other users expressed a preference for the simplified version: 'The first story was easy and simple and easy to read and the second story was complicated and difficult to relate and comprehend easily'. The participants also had a more positive emotional response to the organization of the adapted interface than to that of the plain user-friendly interface (see Figures 17 and 18).

media/image7.png

Figure 7.

Presentation of the interface responses analysis for the first interface.

media/image8.png

Figure 8.

Presentation of the interface responses analysis for the second interface.

media/image9.png

Figure 9.

First impression of the interface analysis for the first interface.

media/image10.png

Figure 10.

First impression of the interface analysis for the second interface.

media/image11.png

Figure 11.

Overall happy vs. unhappy interface analysis for the first interface.

media/image12.png

Figure 12.

Overall happy vs. unhappy interface analysis for the second interface.

media/image13.png

Figure 13.

Usability of the interface analysis for the first interface.

media/image14.png

Figure 14.

Usability of the interface analysis for the second interface.

media/image15.png

Figure 15.

Design of the interface analysis for the first interface.

media/image16.png

Figure 16.

Design of the interface factor analysis for the second interface.

media/image17.png

Figure 17.

Organization of the interface analysis for the first interface.

media/image18.png

Figure 18.

Organization of the interface analysis for the second interface.

5.3. Timer

The timer showed that the majority of the people read the two stories carefully, within the average time frame. Short periods of time measured on a page indicated that some of the participants first browsed through the two interfaces to see what would come next, and then went back to read the story. Most instances of short page times during sequential browsing occurred with the second interface. It is possible that the subjects became curious about the images rather than the story, as the images either helped them understand the content or increased their curiosity.
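The per-page timing described above can be sketched as a simple dwell-time logger. The sketch below is illustrative only: the class and method names are hypothetical, and the five-second "skim" threshold is an assumed value for distinguishing browsing from careful reading, not a figure taken from the study.

```python
import time

# Hypothetical skim threshold (seconds); an assumption for illustration,
# not a value reported in the study.
SKIM_THRESHOLD_S = 5.0

class PageTimer:
    """Records how long a participant stays on each story page."""

    def __init__(self):
        self.current_page = None
        self.entered_at = None
        self.log = []  # (page, seconds) tuples in visit order

    def open_page(self, page, now=None):
        """Close out the previous page (if any) and start timing the new one."""
        now = time.monotonic() if now is None else now
        if self.current_page is not None:
            self.log.append((self.current_page, now - self.entered_at))
        self.current_page = page
        self.entered_at = now

    def finish(self, now=None):
        """Stop timing; flushes the last open page into the log."""
        self.open_page(None, now)
        self.current_page = None

    def skimmed_pages(self):
        """Pages viewed too briefly to have been read carefully."""
        return [p for p, secs in self.log if secs < SKIM_THRESHOLD_S]
```

Feeding the logger explicit timestamps makes the analysis reproducible: a sequence such as opening page 1 at t=0, page 2 at t=2, page 3 at t=40 and finishing at t=45 would flag only page 1 as skimmed, matching the browse-first-then-read pattern observed with the second interface.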

6. Conclusions

This chapter presented a research methodology for, and an evaluation of, the design of an emotional interface. The evaluation showed that people do have emotional responses to design elements in interfaces. The research has also shown that Asian users prefer colourful interfaces with images to plain interfaces. The main contribution of this research, however, is the methodology for modelling and evaluating emotions in interfaces. This work presents an approach that can be used for exploring emotional responses to interaction systems, as well as for adapting interfaces according to users' emotions.

Contrary to previous research (Dix 1998; Nielsen 1999), users reported that they were better supported by an "emotional interface" than by a plain user-friendly interface. Although the adapted interface had more elements and was less simplified than the plain interface, the users rated the emotionally adapted interface as more user friendly. An emotional interface can thus be more efficient, as previous research in psychology has indicated (Darwin, 1965; Goleman, 1995; Davidson et al., 2003). Human emotions play an important role in our lives when we communicate with each other, and computers are an important everyday tool for communication, work, learning, and so on. Therefore user interfaces need to be planned more carefully on the basis of human cognition, rather than machine reasoning. Emotions are essential for human cognition and consequently need to be appropriately addressed in HCI. Here we stressed the importance of emotions for user interfaces by evaluating users' responses to an emotional interface, and showed that such an interface has improved usability.

References

1 - Beck, J. E. (2005). Engagement tracing: using response times to model student disengagement. Artificial Intelligence in Education, pp. 88-95, Amsterdam, Netherlands
2 - Benchetrit, O. & Frasson, C. (2004). Controlling emotional conditions for learning. Workshop on Social and Emotional Intelligence in Learning Environments, in conjunction with the 7th International Conference on Intelligent Tutoring Systems, Maceio-Alagoas, Brazil
3 - Bianchi-Berthouze, N. & Lisetti, C. L. (2002). Modeling multimodal expression of user's affective subjective experience. User Modeling and User-Adapted Interaction, 12, pp. 49-84, February 2002
4 - Burleson, W. & Picard, R. W. (2004). Affective agents: sustaining motivation to learn through failure and a state of stuck. Workshop on Social and Emotional Intelligence in Learning Environments, in conjunction with the 7th International Conference on Intelligent Tutoring Systems, Maceio-Alagoas, Brazil
5 - Carofiglio, V., de Rosis, F. & Grassano, R. (in press). Dynamic models of mixed emotion activation. In Canamero, L. & Aylett, R. (Eds.), Animating Expressive Characters for Social Interactions, Amsterdam: John Benjamins
6 - Conati, C. (2002). Probabilistic assessment of user's emotions in educational games. Journal of Applied Artificial Intelligence, 16(7-8), pp. 555-575
7 - Darwin, C. (1872/1965). The Expression of Emotions in Man and Animals. The University of Chicago Press
8 - Davidson, R. J., Scherer, K. & Goldsmith, H. (2003). Handbook of Affective Sciences. Oxford University Press
9 - de Rosis, F., Pelachaud, C., Poggi, I., Carofiglio, V. & De Carolis, B. (2003). From Greta's mind to her face: modelling the dynamics of affective states in a conversational embodied agent. International Journal of Human-Computer Studies, Special Issue on "Applications of Affective Computing in HCI", pp. 81-118
10 - Goleman, D. (1995). Emotional Intelligence: Why It Can Matter More Than IQ. New York: Bantam
11 - Isen, A. M. (1992). Positive affect and decision making. In Lewis, M. & Haviland, J. M. (Eds.), Handbook of Emotions, pp. 261-277, Guilford Press, New York
12 - Hudlicka, E. (2003). To feel or not to feel: the role of affect in human-computer interaction. International Journal of Human-Computer Studies, 59, pp. 1-32, July 2003
13 - Kort, B., Reilly, R. & Picard, R. W. (2001). External representation of learning process and domain knowledge: affective state as a determinate of its structure and function. MIT Media Laboratory
14 - Lozanov, G. (1978). Suggestology and Outlines of Suggestopedy. Gordon and Breach
15 - Markus, H. R. & Kitayama, S. (1991). Culture and the self: implications for cognition, emotion, and motivation. Psychological Review, 98(2), pp. 224-253
16 - Nielsen, J. (1999). Designing Web Usability: The Practice of Simplicity. New Riders, Indianapolis, Indiana
17 - Oatley, K. & Johnson-Laird, P. (1987). Towards a cognitive theory of emotions. Cognition and Emotion, 1, pp. 29-50
18 - Ortony, A., Clore, G. & Collins, A. (1988). The Cognitive Structure of Emotions. Cambridge University Press
19 - Picard, R. (1997). Affective Computing. MIT Press
20 - Picard, R. & Healey, J. (1997). Affective wearables. Personal and Ubiquitous Computing, 1, pp. 231-240
21 - Stiehl, W. D. & Breazeal, C. (2005). Affective touch for robotic companions. First International Conference on Affective Computing and Intelligent Interaction, Beijing, China
22 - Tzvetanova, S., Tang, M. X. & Justice, L. (2006). Modelling emotional interfaces using OutSite and InSite factors. Design Computing and Cognition, Eindhoven, Netherlands, July 10-12
23 - Vicente, A. & Pain, H. (2002). Informing the detection of the students' motivational state: an empirical study. Intelligent Tutoring Systems, 6th International Conference, pp. 933-943