Multimodal 'Eyes-Free' interaction techniques for mobile devices

Stephen Brewster, Joanna Lumsden, Marek Bell, Malcolm Hall, Stuart Tasker

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract


Mobile and wearable computers present input/output problems due to limited screen space and interaction techniques. When mobile, users typically focus their visual attention on navigating their environment, making visually demanding interface designs hard to operate. This paper presents two multimodal interaction techniques designed to overcome these problems and allow truly mobile, 'eyes-free' device use. The first is a 3D audio radial pie menu that uses head gestures for selecting items. An evaluation of a range of different audio designs showed that egocentric sounds reduced task completion time and perceived annoyance, and allowed users to walk closer to their preferred walking speed. The second is a sonically enhanced 2D gesture recognition system for use on a belt-mounted PDA. An evaluation of the system with and without audio feedback showed users' gestures were more accurate when dynamically guided by audio feedback. These novel interaction techniques demonstrate effective alternatives to visual-centric interface designs on mobile devices.
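To make the head-gesture selection concrete, the following is a minimal Python sketch of how a radial pie menu might map a head-yaw reading to a menu item. This is illustrative only, not the authors' implementation: the function name, the degree-based angle convention, and the equal-slice, front-centred egocentric layout are all assumptions.

def pie_menu_selection(yaw_deg: float, n_items: int) -> int:
    """Map a head-yaw angle (0 deg = straight ahead, clockwise positive)
    to an item index on an egocentric radial pie menu whose n_items
    slices are laid out evenly around the user's head.
    Hypothetical sketch; not from the paper."""
    slice_width = 360.0 / n_items
    # Offset by half a slice so item 0 is centred directly ahead,
    # then wrap into [0, 360) and find which slice the yaw falls in.
    return int(((yaw_deg + slice_width / 2.0) % 360.0) // slice_width)

if __name__ == "__main__":
    # With four items: 0 deg selects item 0 (front), ~90 deg item 1
    # (right), 180 deg item 2 (behind), -90 deg item 3 (left).
    for yaw in (0.0, 85.0, 180.0, -90.0):
        print(f"yaw {yaw:6.1f} deg -> item {pie_menu_selection(yaw, 4)}")

The same angular mapping could plausibly drive the audio feedback as well, for example by panning each item's sound toward the centre of its slice so the user hears where items sit around their head.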
Original language: English
Title of host publication: CHI '03: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Place of publication: New York, NY (US)
Publisher: ACM
Pages: 473-480
Number of pages: 8
ISBN (Print): 1-58113-630-7
DOI: 10.1145/642611.642694
Publication status: Published - 4 May 2003
Event: Conference on Human Factors in Computing Systems - Fort Lauderdale, FL, United States
Duration: 5 Apr 2003 - 10 Apr 2003

Conference

Conference: Conference on Human Factors in Computing Systems
Abbreviated title: CHI '03
Country: United States
City: Fort Lauderdale, FL
Period: 5/04/03 - 10/04/03

Fingerprint

Mobile devices
Wearable computers
Feedback
Gesture recognition
Personal digital assistants
Acoustic waves

Cite this

Brewster, S., Lumsden, J., Bell, M., Hall, M., & Tasker, S. (2003). Multimodal 'Eyes-Free' interaction techniques for mobile devices. In CHI '03: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 473-480). New York, NY (US): ACM. https://doi.org/10.1145/642611.642694
@inproceedings{d768ef8517684328860a8d58fe9c0438,
title = "Multimodal 'Eyes-Free' interaction techniques for mobile devices",
abstract = "Mobile and wearable computers present input/output problems due to limited screen space and interaction techniques. When mobile, users typically focus their visual attention on navigating their environment, making visually demanding interface designs hard to operate. This paper presents two multimodal interaction techniques designed to overcome these problems and allow truly mobile, 'eyes-free' device use. The first is a 3D audio radial pie menu that uses head gestures for selecting items. An evaluation of a range of different audio designs showed that egocentric sounds reduced task completion time and perceived annoyance, and allowed users to walk closer to their preferred walking speed. The second is a sonically enhanced 2D gesture recognition system for use on a belt-mounted PDA. An evaluation of the system with and without audio feedback showed users' gestures were more accurate when dynamically guided by audio feedback. These novel interaction techniques demonstrate effective alternatives to visual-centric interface designs on mobile devices.",
author = "Stephen Brewster and Joanna Lumsden and Marek Bell and Malcolm Hall and Stuart Tasker",
year = "2003",
month = "5",
day = "4",
doi = "10.1145/642611.642694",
language = "English",
isbn = "1-58113-630-7",
pages = "473--480",
booktitle = "CHI '03: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems",
publisher = "ACM",
address = "New York, NY (US)",
}
