Multimodal interaction with Loki

Pablo Bustos, Jesus Martinez-Gomez, Ismael Garcia-Varea, Pilar Bachiller, Luis V. Calderita, Luis J. Manso, Agustin Sanchez, Antonio Balderas

Research output: Chapter in Book / Published conference output › Conference publication

Abstract

Developing a simple multimodal interaction game with a 31-DoF mobile manipulator can become a challenging enterprise. A conceptually simple task quickly unfolds into a rather complex ensemble of driver-oriented, framework-based, software-enabled, state-machine-controlled mechatronics. In this paper we propose a multimodal interaction game designed to test the initial steps of a cognitive robotics architecture called RoboCog. In the game, a human shows an object to the robot and asks it to touch the object with one of its hands. Loki, the robot, searches, gazes, represents and touches the object, then talks and waits for new events. The game goes on until the human player decides to quit. In this paper we describe the steps taken to achieve this goal, analyzing the decisions made in terms of architectural choices and describing how the sequential control of multimodal resources was built. To conclude, several snapshots of the game are presented and commented on, along with video material of Loki playing with a volunteer.
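As an illustrative sketch only, the interaction loop described in the abstract (search, gaze, represent, touch, talk, wait) could be modeled as a simple finite-state machine. The state names, the `robot` and `human_interface` objects, and all their methods below are hypothetical placeholders, not the actual RoboCog interfaces described in the paper.

```python
from enum import Enum, auto

class GameState(Enum):
    # Hypothetical states mirroring the sequence described in the abstract
    WAIT_FOR_HUMAN = auto()
    SEARCH_OBJECT = auto()
    GAZE_AND_REPRESENT = auto()
    TOUCH_OBJECT = auto()
    TALK = auto()
    QUIT = auto()

def game_loop(robot, human_interface):
    """Sequentially drives the robot's multimodal resources.

    `robot` and `human_interface` are placeholder objects; the real
    RoboCog components and their interfaces are not part of this record.
    """
    state = GameState.WAIT_FOR_HUMAN
    while state is not GameState.QUIT:
        if state is GameState.WAIT_FOR_HUMAN:
            event = human_interface.next_event()        # e.g. "show object" or "quit"
            state = GameState.QUIT if event == "quit" else GameState.SEARCH_OBJECT
        elif state is GameState.SEARCH_OBJECT:
            robot.search_object()                       # visual search for the shown object
            state = GameState.GAZE_AND_REPRESENT
        elif state is GameState.GAZE_AND_REPRESENT:
            robot.gaze_at_object()                      # fixate and build an internal representation
            state = GameState.TOUCH_OBJECT
        elif state is GameState.TOUCH_OBJECT:
            robot.touch_object()                        # reach for the object with one hand
            state = GameState.TALK
        elif state is GameState.TALK:
            robot.say("Done! Show me another object.")  # spoken feedback, then wait again
            state = GameState.WAIT_FOR_HUMAN
```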
Original language: English
Title of host publication: Proceedings of Workshop of Physical Agents
Pages: 53-60
Publication status: Published - 2013
Event: Workshop on Physical Agents
Duration: 1 Sept 2013 - 1 Sept 2013

Conference

Conference: Workshop on Physical Agents
Period: 1/09/13 - 1/09/13

