Abstract
Humans have the ability to use a complex code of non-verbal behavior to communicate their internal states to others. Conversely, understanding the intentions and emotions of others is a fundamental aspect of human social interaction. In the study presented here, we investigate how people perceive the expression of emotional states based on the observation of different styles of locomotion. Our goal is to find a small set of canonical parameters that allow us to control a wide range of emotional expressions. We generated different classes of walking behavior by varying the head/torso inclination, the walking speed, and the viewing angle of an animation of a virtual character. Eighteen subjects rated the observed walker using the two-dimensional circumplex model of arousal and valence. The results show that, independent of the viewing angle, participants perceived distinct states of arousal and valence. Moreover, we showed that parametrized body posture encodes emotional states, irrespective of contextual influences or facial expressions. These findings suggest that human locomotion transmits basic emotional cues that can be directly related to canonical parameters of different dimensions of expressive behavior. These results are important because they allow us to build virtual characters whose emotional expression remains recognizable at a large distance and over extended periods of time.
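To make the experimental design concrete, the following is a minimal sketch (not the authors' stimulus-generation code) of how the three parameters named in the abstract could define a walking-animation condition and be paired with a rating on the arousal–valence circumplex. All names, parameter ranges, and the toy linear mapping are illustrative assumptions.

```python
from dataclasses import dataclass

def clamp(x: float, lo: float = -1.0, hi: float = 1.0) -> float:
    return max(lo, min(hi, x))

@dataclass
class WalkCondition:
    """One stimulus: a walking animation parametrized as in the abstract.

    Fields and ranges are illustrative assumptions, not the values
    used in the original study.
    """
    torso_inclination_deg: float  # head/torso lean; positive = slumped forward
    walking_speed_mps: float      # locomotion speed in meters per second
    viewing_angle_deg: float      # camera angle around the virtual character

@dataclass
class CircumplexRating:
    """A point in the two-dimensional circumplex model of affect."""
    valence: float  # -1 (negative) .. +1 (positive)
    arousal: float  # -1 (calm)     .. +1 (excited)

def predicted_rating(cond: WalkCondition) -> CircumplexRating:
    """Hypothetical linear mapping from posture/speed to expected ratings.

    Assumption: a fast, upright walk reads as high-arousal/positive,
    a slow, slumped walk as low-arousal/negative. The abstract reports
    that viewing angle did not affect perceived emotion, so it is
    deliberately ignored here.
    """
    arousal = clamp((cond.walking_speed_mps - 1.2) / 0.8)
    valence = clamp(-cond.torso_inclination_deg / 30.0)
    return CircumplexRating(valence=valence, arousal=arousal)

# Example: a slow, slumped walk viewed from the side.
sad_walk = WalkCondition(torso_inclination_deg=25.0,
                         walking_speed_mps=0.7,
                         viewing_angle_deg=90.0)
print(predicted_rating(sad_walk))  # low arousal, negative valence
```

Keeping the stimulus parameters and the rating space as separate structures mirrors the study's logic: locomotion parameters are manipulated independently, and the circumplex ratings are the observed response.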
Original language | English
---|---
Title of host publication | 2011 IEEE International Conference on Automatic Face & Gesture Recognition (FG)
Publisher | IEEE
Pages | 809–814
ISBN (Electronic) | 9781424491414
ISBN (Print) | 9781424491407
DOIs |
Publication status | Published - 19 May 2011
Event | Gesture Recognition (FG 2011) - Santa Barbara, CA, USA; Duration: 21 Mar 2011 → 25 Mar 2011
Conference
Conference | Gesture Recognition (FG 2011)
---|---
Period | 21/03/11 → 25/03/11