Multimodal 'Eyes-Free' interaction techniques for mobile devices

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution


Abstract


Mobile and wearable computers present input/output problems due to limited screen space and interaction techniques. When mobile, users typically focus their visual attention on navigating their environment - making visually demanding interface designs hard to operate. This paper presents two multimodal interaction techniques designed to overcome these problems and allow truly mobile, 'eyes-free' device use. The first is a 3D audio radial pie menu that uses head gestures for selecting items. An evaluation of a range of different audio designs showed that egocentric sounds reduced task completion time, perceived annoyance, and allowed users to walk closer to their preferred walking speed. The second is a sonically enhanced 2D gesture recognition system for use on a belt-mounted PDA. An evaluation of the system with and without audio feedback showed users' gestures were more accurate when dynamically guided by audio feedback. These novel interaction techniques demonstrate effective alternatives to visual-centric interface designs on mobile devices.
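The abstract does not give implementation details for the head-gesture selection. As a minimal sketch of the core idea behind the first technique - mapping a head yaw angle onto one of N evenly spaced radial menu items - the selection logic might look like the following (the function name, item count, and angle convention are illustrative assumptions, not taken from the paper):

```python
def pie_menu_selection(head_yaw_deg: float, num_items: int = 8) -> int:
    """Map a head yaw angle to a radial pie-menu item index.

    Assumed convention (not from the paper): 0 degrees is straight
    ahead, angles increase clockwise, and item 0 is centred directly
    in front of the user.
    """
    segment = 360.0 / num_items
    # Offset by half a segment so item 0 spans [-segment/2, +segment/2)
    # around straight ahead, then wrap into [0, 360).
    angle = (head_yaw_deg + segment / 2.0) % 360.0
    return int(angle // segment)


# Looking straight ahead selects item 0; turning the head 45 degrees
# clockwise selects item 1; 45 degrees counter-clockwise selects item 7.
print(pie_menu_selection(0.0))    # 0
print(pie_menu_selection(45.0))   # 1
print(pie_menu_selection(-45.0))  # 7
```

In practice a system like the one described would pair this mapping with the egocentric 3D audio cues the paper evaluates, so each segment is heard at its corresponding direction around the user's head.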

Details

Publication date: 4 May 2003
Publication title: CHI '03: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Place of publication: New York, NY (US)
Publisher: ACM
Pages: 473-480
Number of pages: 8
ISBN (Print): 1-58113-630-7
Original language: English
Event: Conference on Human Factors in Computing Systems - Fort Lauderdale, FL, United States
Duration: 5 Apr 2003 - 10 Apr 2003

Conference

Conference: Conference on Human Factors in Computing Systems
Abbreviated title: CHI '03
Country: United States
City: Fort Lauderdale, FL
Period: 5/04/03 - 10/04/03
