Multimodal 'eyes-free' interaction techniques for wearable devices

Stephen Brewster*, Joanna Lumsden, Marek Bell, Malcolm Hall, Stuart Tasker

*Corresponding author for this work

Research output: Unpublished contribution to conference › Unpublished Conference Paper › peer-review


Mobile and wearable computers present input/output problems due to limited screen space and restricted interaction techniques. When mobile, users typically focus their visual attention on navigating their environment, making visually demanding interface designs hard to operate. This paper presents two multimodal interaction techniques designed to overcome these problems and allow truly mobile, 'eyes-free' device use. The first is a 3D audio radial pie menu that uses head gestures for selecting items. An evaluation of a range of different audio designs showed that egocentric sounds reduced task completion time, reduced perceived annoyance, and allowed users to walk closer to their preferred walking speed. The second is a sonically enhanced 2D gesture recognition system for use on a belt-mounted PDA. An evaluation of the system with and without audio feedback showed that users' gestures were more accurate when dynamically guided by audio feedback. These novel interaction techniques demonstrate effective alternatives to visual-centric interface designs on mobile devices.
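The radial pie menu described above maps the user's head orientation onto menu sectors. As a minimal sketch of how such sector selection could work (hypothetical function and parameter names; this is not the authors' implementation, which also involved 3D audio rendering of the items), one might map a head yaw angle to one of N equal pie sectors:

```python
def select_sector(head_yaw_deg: float, n_items: int) -> int:
    """Map a head yaw angle (in degrees, 0 = straight ahead,
    increasing clockwise) to one of n_items pie-menu sectors.

    Hypothetical sketch, not the paper's actual implementation.
    """
    sector_width = 360.0 / n_items
    # Centre the first sector on 0 degrees, so a forward-facing
    # head selects item 0; the modulo handles negative yaw values.
    yaw = (head_yaw_deg + sector_width / 2) % 360.0
    return int(yaw // sector_width)

# With 4 items, sectors are centred at 0, 90, 180 and 270 degrees:
# facing forward selects item 0, turning right 90 degrees selects item 1.
```

In the paper's egocentric design, each sector's audio cue is rendered at a fixed position around the user's head, so turning toward a sound and confirming (e.g. with a nod) selects that item without any visual display.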

Original language: English
Number of pages: 8
Publication status: Published - 28 Jul 2003
Event: The CHI 2003 New Horizons Conference Proceedings: Conference on Human Factors in Computing Systems - Ft. Lauderdale, FL, United States
Duration: 5 Apr 2003 - 10 Apr 2003


Conference: The CHI 2003 New Horizons Conference Proceedings: Conference on Human Factors in Computing Systems
Country/Territory: United States
City: Ft. Lauderdale, FL


  • Gestural interaction
  • Wearable computing


