Engaging human-to-robot attention using conversational gestures and lip-synchronization

Felipe Cid Burgos, Luis Jesús Manso Fernández-Arguelles, Luis V. Calderita, Agustín Sánchez, Pedro Núñez Trujillo

Research output: Contribution to journal › Article › peer-review

Abstract

Human-Robot Interaction (HRI) is one of the most important subfields of social robotics. In several applications, text-to-speech (TTS) techniques are used by robots to provide feedback to humans. In this respect, a natural synchronization between the synthetic voice and the mouth of the robot could contribute to improving the interaction experience. This paper presents an algorithm for synchronizing text-to-speech systems with robotic mouths. The proposed approach estimates the appropriate aperture of the mouth based on the entropy of the synthetic audio stream provided by the TTS system. The paper also describes the cost-efficient robotic head that has been used in the experiments and introduces the use of conversational gestures for engaging Human-Robot Interaction. The system, which has been implemented in C++ and can perform in real time, is freely available as part of the RoboComp open-source robotics framework. Finally, the paper presents the results of an opinion poll conducted to evaluate the interaction experience.
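The entropy-based aperture estimation summarized above can be illustrated with a short sketch. The following C++ fragment is only an assumption of how such a mapping might look and is not the RoboComp implementation: it computes the Shannon entropy of the amplitude histogram of one audio frame and normalizes it into a mouth aperture in [0, 1]. The frame format (16-bit PCM), the bin count, and the function names are hypothetical.

    // Minimal sketch of entropy-based mouth-aperture estimation.
    // Assumptions (not from the paper): 16-bit PCM frames, a 64-bin
    // amplitude histogram, and a linear mapping from normalized
    // entropy to an aperture in [0, 1]. Names are illustrative only.
    #include <cstdint>
    #include <cmath>
    #include <vector>

    // Shannon entropy (in bits) of the amplitude distribution of one frame.
    double frameEntropy(const std::vector<int16_t>& frame, int bins = 64)
    {
        if (frame.empty()) return 0.0;
        std::vector<int> hist(bins, 0);
        for (int16_t s : frame)
        {
            // Map the sample range [-32768, 32767] onto [0, bins-1].
            int idx = (static_cast<int>(s) + 32768) * bins / 65536;
            ++hist[idx];
        }
        double h = 0.0;
        for (int count : hist)
        {
            if (count == 0) continue;
            double p = static_cast<double>(count) / frame.size();
            h -= p * std::log2(p);
        }
        return h; // at most log2(bins)
    }

    // Map the frame entropy to a normalized mouth aperture in [0, 1].
    double mouthAperture(const std::vector<int16_t>& frame)
    {
        const int bins = 64;
        return frameEntropy(frame, bins) / std::log2(static_cast<double>(bins));
    }

In a real-time pipeline the per-frame aperture would presumably be smoothed before driving the mouth actuator, but that level of detail is not specified in the abstract.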
Original language: English
Pages (from-to): 3-10
Journal: Journal of Physical Agents (JoPha)
Volume: 6
Issue number: 1
DOIs
Publication status: Published - 1 Jan 2012

Bibliographical note

This item is licensed under a Creative Commons License CC BY-SA 3.0.

Funding: This work has been partially supported by the Junta de Extremadura project IB10062 and by the Spanish Ministry of Science and Innovation with grant IPT-430000-2010-002.

Keywords

  • Robotic head
  • Lip synchronization
  • Human-Robot Interaction
