Using Noninvasive Wearable Computers to Recognize Human Emotions from Physiological Signals
Christine Lætitia Lisetti
Department of Multimedia Communications, Institut Eurecom, 06904 Sophia-Antipolis, France
Email: [email protected]

Fatma Nasoz
Department of Computer Science, University of Central Florida, Orlando, FL 32816-2362, USA
Email: [email protected]

Received 30 July 2002; Revised 14 April 2004

We discuss the strong relationship between affect and cognition and the importance of emotions in multimodal human-computer interaction (HCI) and user modeling. We introduce the overall paradigm for our multimodal system, which aims at recognizing its users' emotions and responding to them accordingly, depending upon the current context or application. We then describe the design of the emotion elicitation experiment we conducted by collecting, via wearable computers, physiological signals from the autonomic nervous system (galvanic skin response, heart rate, temperature) and mapping them to certain emotions (sadness, anger, fear, surprise, frustration, and amusement). We show the results of three different supervised learning algorithms that categorize these collected signals in terms of emotions and generalize their learning to recognize emotions from new collections of signals. We finally discuss the possible broader impact and potential applications of emotion recognition for multimodal intelligent systems.

Keywords and phrases: multimodal human-computer interaction, emotion recognition, multimodal affective user interfaces.
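The core pipeline the abstract describes, mapping physiological feature vectors (galvanic skin response, heart rate, temperature) to discrete emotion labels with a supervised learner, can be sketched as follows. This is a minimal illustration only: the feature values, units, and labels below are hypothetical placeholders, not data from the study, and a simple 1-nearest-neighbour rule stands in for the three learning algorithms the paper evaluates.

```python
# Hypothetical sketch of supervised emotion classification from
# physiological signals. All numbers below are invented for illustration;
# they are NOT measurements from the experiment described in the paper.
import math

# Each training example: (GSR in microsiemens, heart rate in bpm,
# skin temperature in degrees Celsius) -> elicited emotion label.
TRAINING_DATA = [
    ((2.1, 62.0, 33.5), "sadness"),
    ((6.8, 95.0, 34.9), "anger"),
    ((7.4, 102.0, 32.1), "fear"),
    ((5.0, 88.0, 33.8), "surprise"),
    ((4.2, 80.0, 33.2), "frustration"),
    ((3.0, 72.0, 34.4), "amusement"),
]

def classify(sample, data=TRAINING_DATA):
    """Return the emotion label of the nearest training vector
    (1-nearest-neighbour under Euclidean distance)."""
    nearest_features, nearest_label = min(
        data, key=lambda pair: math.dist(sample, pair[0]))
    return nearest_label
```

In practice the features would be normalized first (heart rate dominates the raw Euclidean distance here), and the generalization step the abstract mentions corresponds to calling `classify` on signal vectors from sessions held out of the training set.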
1. INTRODUCTION
The field of human-computer interaction (HCI) has recently witnessed an explosion of adaptive and customizable human-computer interfaces which use cognitive user modeling, for example, to extract and represent a student's knowledge, skills, and goals, to help users find information in hypermedia applications, or to tailor information presentation to the user. New generations of intelligent computer user interfaces can also adapt to a specific user, choose suitable teaching exercises or interventions, give feedback about the user's knowledge, and predict the user's future behavior, such as answers, goals, preferences, and actions.

Recent findings on emotions have shown that the mechanisms associated with emotions are not only tightly intertwined neurologically with the mechanisms responsible for cognition, but that they also play a central role in decision making, problem solving, communicating, negotiating, and adapting to unpredictable environments. Emotions are therefore now considered organizing and energizing processes, serving important adaptive functions. To take advantage of these new findings, researchers in signal processing and HCI are learning more about the unexpectedly strong interface between affect and cognition in order to build appropriate digital technology. Affective states play an important role in many aspects of the activities we find ourselves involved in, including tasks performed in front of a computer or while interacting with computer-based technology.