Affective facial expressions recognition for human-robot interaction

Diego R. Faria, Mario Vieira, Fernanda C. C. Faria, Cristiano Premebida

Research output: Chapter in Book/Published conference output › Conference publication

Abstract

Affective facial expression is a key feature of nonverbal behaviour and is considered a symptom of an internal emotional state. Emotion recognition plays an important role in social communication, both human-to-human and human-to-robot. Taking this as inspiration, this work aims at the development of a framework able to recognise human emotions through facial expressions for human-robot interaction. Features based on facial landmark distances and angles are extracted to feed a dynamic probabilistic classification framework. The public online dataset Karolinska Directed Emotional Faces (KDEF) [1] is used to learn seven different emotions (i.e. angry, fearful, disgusted, happy, sad, surprised, and neutral) performed by seventy subjects. A new dataset was created to record stimulated affect: participants watched video sessions designed to awaken their emotions, unlike the KDEF dataset, in which participants are actors (i.e. performing expressions when asked to). Offline and on-the-fly tests were carried out: leave-one-out cross-validation on the datasets and on-the-fly tests during human-robot interaction. Results show that the proposed framework can correctly recognise human facial expressions and has the potential to be used in human-robot interaction scenarios.
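The abstract describes features built from distances and angles between facial landmarks. The sketch below illustrates one plausible way to compute such a feature vector with NumPy; the landmark coordinates, the choice of pairwise distances, and the segment-orientation angles are illustrative assumptions, not the authors' actual feature design.

```python
# Hypothetical sketch of landmark-based features: pairwise Euclidean
# distances and segment-orientation angles between facial landmarks.
# The landmark layout and feature choices are assumptions for illustration.
import numpy as np

def landmark_features(landmarks: np.ndarray) -> np.ndarray:
    """Build a feature vector from (N, 2) facial landmark coordinates.

    Returns all pairwise distances followed by the orientation angle
    (w.r.t. the horizontal axis) of each pairwise segment.
    """
    n = landmarks.shape[0]
    distances, angles = [], []
    for i in range(n):
        for j in range(i + 1, n):
            diff = landmarks[j] - landmarks[i]
            distances.append(np.linalg.norm(diff))   # pairwise distance
            angles.append(np.arctan2(diff[1], diff[0]))  # segment angle
    return np.asarray(distances + angles)

# Toy example: five landmarks (two eye corners, nose tip, two mouth corners).
pts = np.array([[30.0, 40.0], [70.0, 40.0], [50.0, 60.0],
                [35.0, 80.0], [65.0, 80.0]])
print(landmark_features(pts).shape)  # (20,): 10 distances + 10 angles
```

A vector like this could then be fed, frame by frame, to a dynamic probabilistic classifier and evaluated with leave-one-out cross-validation as the abstract describes.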
Original language: English
Title of host publication: IEEE RO-MAN'17: IEEE International Symposium on Robot and Human Interactive Communication, Lisbon, Portugal
Publisher: IEEE
Pages: 805-810
Number of pages: 6
Publication status: Published - 1 Sept 2017

Bibliographical note

Copyright: IEEE
