Abstract
This research plans to investigate non-verbal social interaction in a virtual environment. Participants are required to interact with a virtual human (VH) in order to complete a puzzle using only eye-gaze. The project will focus on joint attention (JA), which can be categorised into two processes: initiation (IJA) and response (RJA). Firstly, we will investigate these categories by recording behavioural measures such as accuracy and completion times, together with the participants’ eye-tracking data, which will be used to drive the JA behaviour of the VH. The project will later progress to investigating the neural substrates of JA using electrophysiological recordings of the brain (EEG). Based on previous research [1; 5], we expect differences to manifest in the neural signatures representative of each process. Manipulations will centre on the proximity of the character and the use of distractors: objects appearing at random in the scene, eliciting overt and sometimes covert gaze responses. The final stage of the project will incorporate a closed-loop system moderated by the EEG signals, which will in turn modulate the behaviour of the VH, i.e. its proximity and/or the extent of its cooperation throughout the task.
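As a rough illustration of the closed-loop stage described above, the sketch below shows how an EEG-derived joint-attention index might be mapped onto the VH's proximity and degree of cooperation on each trial. This is a hypothetical Python sketch, not the authors' implementation: `read_ja_index` is simulated, and the `VirtualHumanState` fields, thresholds and update rule are illustrative assumptions.

```python
# Minimal sketch of a closed loop in which an EEG-derived measure gates the
# virtual human's proximity and degree of cooperation. All names, thresholds
# and the simulated signal are assumptions for illustration only.

import random
from dataclasses import dataclass


@dataclass
class VirtualHumanState:
    proximity_m: float = 1.5   # distance of the VH from the participant (metres)
    cooperation: float = 0.5   # 0 = unhelpful, 1 = fully cooperative gaze cues


def read_ja_index() -> float:
    """Stand-in for a real-time EEG feature (e.g. a normalised joint-attention
    index in [0, 1]); here it is simply simulated with random noise."""
    return random.random()


def update_vh(state: VirtualHumanState, ja_index: float) -> VirtualHumanState:
    """If the index suggests low engagement, move the VH closer and make its
    gaze cues more cooperative; if engagement is high, relax both."""
    if ja_index < 0.4:                        # low joint-attention engagement
        state.proximity_m = max(0.8, state.proximity_m - 0.1)
        state.cooperation = min(1.0, state.cooperation + 0.1)
    elif ja_index > 0.7:                      # high engagement: reduce scaffolding
        state.proximity_m = min(2.5, state.proximity_m + 0.1)
        state.cooperation = max(0.2, state.cooperation - 0.1)
    return state


if __name__ == "__main__":
    vh = VirtualHumanState()
    for trial in range(10):                   # one update per task trial
        ja = read_ja_index()
        vh = update_vh(vh, ja)
        print(f"trial {trial}: JA index {ja:.2f} -> "
              f"proximity {vh.proximity_m:.1f} m, cooperation {vh.cooperation:.1f}")
```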
| Original language | English |
|---|---|
| Title of host publication | Proceedings - 2020 IEEE Conference on Virtual Reality and 3D User Interfaces, VRW 2020 |
| Publisher | IEEE |
| Pages | 565-566 |
| Number of pages | 2 |
| ISBN (Electronic) | 978-1-7281-6532-5 |
| ISBN (Print) | 978-1-7281-6533-2 |
| DOIs | |
| Publication status | Published - 11 May 2020 |
| Event | 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Atlanta, GA, USA, 22 Mar 2020 → 26 Mar 2020 |
Conference
| Conference | 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW) |
|---|---|
| Period | 22/03/20 → 26/03/20 |
Keywords
- ASD
- EEG
- Eye-gaze
- Joint Attention
- VR