Exploring a user-defined gesture vocabulary for descriptive mid-air interactions

Hessam Jahani*, Manolya Kavakli

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Gesturing provides an alternative interaction input for design that is more natural and intuitive. However, standard input devices do not fully capture natural hand motions in design. A key challenge lies in understanding how gesturing can contribute to human–computer interaction and in identifying the patterns in gestures. This paper analyzes human gestures to define a gesture vocabulary for descriptive mid-air interactions in a virtual reality environment. We conducted experiments with twenty participants describing two chairs (simple and abstract) with different levels of complexity. The paper presents a detailed analysis of gesture distribution and hand preference for each description task, and compares the proposed approach of defining a vocabulary of combined gestures (GestAlt) with previously suggested methods. The findings show that GestAlt successfully describes the gestures employed in both tasks (60% of all gestures for the simple chair and 69% for the abstract chair). The findings can be applied to the development of an intuitive mid-air interface using gesture recognition.

Original language: English
Pages (from-to): 11-22
Number of pages: 12
Journal: Cognition, Technology and Work
Issue number: 1
Early online date: 7 Nov 2017
Publication status: Published - 1 Feb 2018

Bibliographical note

Publisher Copyright:
© 2017, Springer-Verlag London Ltd., part of Springer Nature.


Keywords

  • Gesture recognition
  • Gesture vocabulary
  • Human–computer interaction
  • Interface design
  • User-centered design
  • Virtual reality


