Affective facial expressions recognition for human-robot interaction

Diego R. Faria, Mario Vieira, Fernanda C. C. Faria, Cristiano Premebida

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Affective facial expression is a key feature of nonverbal behaviour and is considered a symptom of an internal emotional state. Emotion recognition plays an important role in social communication, both human-to-human and human-to-robot. Taking this as inspiration, this work aims at the development of a framework able to recognise human emotions through facial expressions for human-robot interaction. Features based on facial landmark distances and angles are extracted to feed a dynamic probabilistic classification framework. The public online dataset Karolinska Directed Emotional Faces (KDEF) [1] is used to learn seven different emotions (i.e. angry, fearful, disgusted, happy, sad, surprised, and neutral) performed by seventy subjects. A new dataset was created in order to record stimulated affect: participants watched video sessions intended to awaken their emotions, unlike the KDEF dataset, where participants are actors (i.e. performing expressions when asked to). Offline and on-the-fly tests were carried out: leave-one-out cross-validation tests on the datasets and on-the-fly tests during human-robot interaction. Results show that the proposed framework can correctly recognise human facial expressions, with potential to be used in human-robot interaction scenarios.
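As a rough illustration of the pipeline the abstract describes, the Python sketch below computes distance- and angle-based features from facial landmarks and evaluates a probabilistic classifier with leave-one-out cross-validation. The landmark indices, the load function, and the Gaussian naive Bayes stand-in for the paper's dynamic probabilistic classification framework are all assumptions made here for illustration, not the authors' implementation.

# Illustrative sketch only: the paper's actual classifier and landmark
# set are not specified in the abstract, so this uses a Gaussian naive
# Bayes stand-in and hypothetical landmark indices.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import LeaveOneOut
from sklearn.metrics import accuracy_score

def landmark_features(pts):
    """Distance- and angle-based features from an (N, 2) landmark array."""
    feats = []
    # Normalise distances by inter-ocular distance (assumed eye-centre
    # indices 0 and 1) so features are scale-invariant.
    iod = np.linalg.norm(pts[0] - pts[1])
    # Distances between hypothetical landmark pairs (e.g. mouth corners,
    # brow-to-eye).
    for i, j in [(2, 3), (4, 5), (6, 7)]:
        feats.append(np.linalg.norm(pts[i] - pts[j]) / iod)
    # Angles at hypothetical landmark triplets (vertex is the middle index).
    for i, j, k in [(2, 8, 3), (4, 9, 5)]:
        v1, v2 = pts[i] - pts[j], pts[k] - pts[j]
        cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        feats.append(np.arccos(np.clip(cos, -1.0, 1.0)))
    return np.array(feats)

def loocv_accuracy(X, y):
    """Leave-one-out cross-validation with a probabilistic classifier."""
    preds = []
    for train_idx, test_idx in LeaveOneOut().split(X):
        clf = GaussianNB().fit(X[train_idx], y[train_idx])
        preds.append(clf.predict(X[test_idx])[0])
    return accuracy_score(y, preds)

Given an array X of per-face feature vectors (one row per image, built with landmark_features) and an array y of the seven emotion labels, loocv_accuracy(X, y) mirrors the offline leave-one-out evaluation protocol described above.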
Original language: English
Title of host publication: IEEE RO-MAN'17: IEEE International Symposium on Robot and Human Interactive Communication, Lisbon, Portugal
Publisher: IEEE
Pages: 805-810
Number of pages: 6
Publication status: Published - 1 Sep 2017

Bibliographical note

Copyright: IEEE

Cite this

Faria, D. R., Vieira, M., Faria, F. C. C., & Premebida, C. (2017). Affective facial expressions recognition for human-robot interaction. In IEEE RO-MAN'17: IEEE International Symposium on Robot and Human Interactive Communication, Lisbon, Portugal (pp. 805-810). IEEE.