Analysis of Gaze, Head Orientation, and Joint Attention in Autism With Triadic VR Interviews

This study supports situated solo practice of gaze behavior and head orientation through a triadic (three-way) virtual reality (VR) job interview simulation. The system lets users respond to common interview questions and observe how they distribute attention between the two interviewers depending on their conversational role (speaking or listening). From the yaw and position readings of the VR headset, a machine learning-based approach analyzes head orientation relative to the interviewers in the virtual environment, achieving low angular error at low computational cost. We examine the degree to which interviewer backchannels trigger attention shifts or behavioral mirroring, and we investigate the social modulation of gaze and head orientation in autistic and non-autistic individuals. In both speaking and listening roles, autistic participants gazed at and oriented toward the two virtual interviewers less often, and they displayed less behavioral mirroring (mirroring one avatar's head turn toward the other) than non-autistic participants.
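To make the head-orientation analysis concrete, here is a minimal geometric sketch of how headset yaw and position can be mapped to the interviewer currently attended to. This is an illustration under stated assumptions, not the paper's machine learning model: it assumes yaw in degrees (0° facing the +z axis), 2D ground-plane positions for headset and avatars, and a hypothetical 15° attention threshold.

```python
import math

def bearing_to_avatar(head_pos, avatar_pos):
    """Horizontal bearing (degrees) from the headset position to an avatar."""
    dx = avatar_pos[0] - head_pos[0]
    dz = avatar_pos[1] - head_pos[1]
    return math.degrees(math.atan2(dx, dz))  # 0 deg = facing +z

def angular_offset(head_yaw, head_pos, avatar_pos):
    """Signed smallest angle between the head direction and the avatar."""
    diff = bearing_to_avatar(head_pos, avatar_pos) - head_yaw
    return (diff + 180.0) % 360.0 - 180.0  # wrap into [-180, 180)

def attended_interviewer(head_yaw, head_pos, avatars, threshold=15.0):
    """Return the interviewer the head is oriented toward, or None."""
    offsets = {name: abs(angular_offset(head_yaw, head_pos, pos))
               for name, pos in avatars.items()}
    name, offset = min(offsets.items(), key=lambda kv: kv[1])
    return name if offset <= threshold else None

# Two interviewers seated left and right of the user (illustrative layout)
avatars = {"left": (-1.0, 2.0), "right": (1.0, 2.0)}
print(attended_interviewer(-26.57, (0.0, 0.0), avatars))  # head turned left
print(attended_interviewer(0.0, (0.0, 0.0), avatars))     # looking between both
```

Frame-by-frame labels like these could then feed the kind of role-dependent attention and mirroring statistics the study reports; the actual system learns this mapping rather than thresholding it geometrically.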
Source: IEEE Transactions on Neural Systems and Rehabilitation Engineering - Category: Neuroscience Source Type: research