Abstract
Mobile and wearable computers present input/output problems due to limited screen space and restricted interaction techniques. When mobile, users typically focus their visual attention on navigating their environment, making visually demanding interface designs hard to operate. This paper presents two multimodal interaction techniques designed to overcome these problems and allow truly mobile, 'eyes-free' device use. The first is a 3D audio radial pie menu that uses head gestures for selecting items. An evaluation of a range of different audio designs showed that egocentric sounds reduced task completion time and perceived annoyance, and allowed users to walk closer to their preferred walking speed. The second is a sonically enhanced 2D gesture recognition system for use on a belt-mounted PDA. An evaluation of the system with and without audio feedback showed that users' gestures were more accurate when dynamically guided by audio feedback. These novel interaction techniques demonstrate effective alternatives to visually demanding interface designs on mobile devices.
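As an illustration of how head-gesture selection on a radial pie menu might be computed, the following minimal Python sketch maps a head-yaw reading to the index of an evenly spaced menu item. The function name, slice layout, and angle conventions are assumptions for illustration, not details taken from the paper.

```python
def select_pie_item(yaw_deg: float, n_items: int) -> int:
    """Map a head-yaw angle to a radial pie-menu slice index.

    Hypothetical conventions: items are evenly spaced around the
    listener, item 0 is cued straight ahead (yaw = 0 degrees), and
    indices increase clockwise with positive yaw.
    """
    slice_width = 360.0 / n_items
    # Offset by half a slice so each item's sector is centred on
    # the direction its audio cue is played from, then wrap to
    # [0, 360) before bucketing into a sector.
    yaw = (yaw_deg + slice_width / 2.0) % 360.0
    return int(yaw // slice_width)


# Example: with 8 items (45-degree sectors), a 95-degree head
# turn to the right falls in the sector centred at 90 degrees.
assert select_pie_item(95.0, 8) == 2
assert select_pie_item(-30.0, 8) == 7  # small turn left selects item 7
```

Centring each sector on its cue direction means a small head turn toward a sound is enough to select the associated item, which suits the egocentric audio presentation the abstract describes.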
| Original language | English |
|---|---|
| Pages | 473-480 |
| Number of pages | 8 |
| Publication status | Published - 28 Jul 2003 |
| Event | The CHI 2003 New Horizons Conference Proceedings: Conference on Human Factors in Computing Systems - Ft. Lauderdale, FL, United States. Duration: 5 Apr 2003 → 10 Apr 2003 |
Conference
| Conference | The CHI 2003 New Horizons Conference Proceedings: Conference on Human Factors in Computing Systems |
|---|---|
| Country/Territory | United States |
| City | Ft. Lauderdale, FL |
| Period | 5 Apr 2003 → 10 Apr 2003 |
Keywords
- Gestural interaction
- Wearable computing