eScholarship
Open Access Publications from the University of California

UC San Diego Electronic Theses and Dissertations

Analysis and Practice of Nonverbal Communication and Attention in Autism with Virtual Reality Job Interviews Using Machine Learning

Abstract

Autism is a complex neurodevelopmental condition that influences how individuals act in social settings and process information. It is often characterized by differences in social communication that diverge from the norms of non-autistic society, which are typically treated as the correct way to behave. These behavioral mismatches may help explain why only a small percentage of autistic individuals are employed, despite many actively seeking work. Although some companies are familiarizing their non-autistic (NA) employees with autism, most still lack neurodiversity hiring initiatives. To assist individuals who feel obliged to adjust their communication styles to NA norms, we designed virtual reality (VR) mock job interviews and developed algorithms for more reliable behavioral tracking. We created a pipeline that accurately detects head gestures using hidden Markov models and rates conversational engagement.
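The decoding step at the core of hidden-Markov-model gesture detection can be illustrated with a minimal Viterbi decoder. The states, probabilities, and thresholded pitch-velocity observations below are invented for illustration and are not the dissertation's actual parameters.

```python
import numpy as np

def viterbi(log_pi, log_A, log_B):
    """Most-likely hidden-state path given log initial probabilities,
    log transition matrix, and per-frame emission log-likelihoods."""
    T, N = log_B.shape
    dp = np.zeros((T, N))
    back = np.zeros((T, N), dtype=int)
    dp[0] = log_pi + log_B[0]
    for t in range(1, T):
        scores = dp[t - 1][:, None] + log_A          # (prev, next)
        back[t] = np.argmax(scores, axis=0)
        dp[t] = scores[back[t], np.arange(N)] + log_B[t]
    path = [int(np.argmax(dp[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Toy example: states 0 = "still", 1 = "nodding"; the observation is a
# binary flag for whether pitch velocity exceeded a threshold this frame.
log_pi = np.log([0.8, 0.2])
log_A = np.log([[0.9, 0.1], [0.2, 0.8]])   # sticky states
emit = np.log([[0.9, 0.1], [0.2, 0.8]])    # P(flag | state)
obs = [0, 0, 1, 1, 1, 0, 0]
log_B = emit[:, obs].T                      # (T, N) frame log-likelihoods
states = viterbi(log_pi, log_A, log_B)      # [0, 0, 1, 1, 1, 0, 0]
```

The sticky transition matrix keeps the decoded gesture from flickering on single noisy frames, which is why an HMM is preferable to thresholding each frame independently.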

Then, we combined Kalman filtering with a clustering algorithm to improve the built-in eye tracking of a VR headset. Using the enhanced eye tracking, we explored how autism affects gaze behavior. This was the first VR study to investigate the importance of conversational role in two-person job interviews, extending the findings of earlier non-immersive studies to immersive VR. Users valued the tool as a self-deliverable job interview practice opportunity.
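The filtering idea can be sketched with a scalar Kalman filter applied to a noisy gaze-angle trace. The random-walk state model, noise variances, and synthetic signal below are illustrative assumptions, not the dissertation's tuned pipeline.

```python
import numpy as np

def kalman_smooth(z, q=1e-4, r=1e-2):
    """Scalar Kalman filter with a random-walk state model:
    x_t = x_{t-1} + w (process var q), z_t = x_t + v (measurement var r)."""
    x, p = z[0], 1.0
    out = [x]
    for meas in z[1:]:
        p += q                      # predict: uncertainty grows
        k = p / (p + r)             # Kalman gain
        x += k * (meas - x)         # update toward the measurement
        p *= (1 - k)
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(0)
truth = np.linspace(0.0, 1.0, 200)             # slowly drifting gaze angle
noisy = truth + rng.normal(0, 0.1, size=200)   # simulated sensor noise
smooth = kalman_smooth(noisy)
# The filtered trace tracks the truth more closely than the raw signal.
```

The gain `k` balances trusting the model (small `q`) against trusting the sensor (small `r`); a low `q/r` ratio yields heavy smoothing at the cost of some lag.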

We then trained a neural network to predict head orientations in three-person virtual job interviews using a VR headset. Our model computed head rotation angles more accurately than conventional methods such as ray casting. The results aligned with our findings from the two-person interviews: we observed how different neurotypes distribute their attention across conversational roles, and how autism and external stimulants affect joint attention tendencies.
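As a point of reference for what a head rotation angle is, a head-pose quaternion can be converted to a yaw angle in closed form. This sketch assumes a unit quaternion in a y-up convention common in VR runtimes; it is independent of the neural model described above.

```python
import math

def quat_to_yaw(w, x, y, z):
    """Yaw (rotation about the vertical y-axis, in radians) from a unit
    quaternion, assuming a y-up coordinate convention."""
    return math.atan2(2 * (w * y + x * z), 1 - 2 * (y * y + z * z))

# A 90-degree head turn about the vertical axis:
q = (math.cos(math.pi / 4), 0.0, math.sin(math.pi / 4), 0.0)
yaw_deg = math.degrees(quat_to_yaw(*q))   # 90.0
```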

We built a convolutional bidirectional long short-term memory model that accurately identifies user leans (forward or backward) based solely on a headset and its controllers. Previous studies have developed similar models for related pose estimation tasks; however, none aimed to recognize leans during a conversation, which has been shown to signal attention.

Lastly, we built a gaze behavior coaching framework that is more affordable than human coaching. Discussions with autistic individuals helped refine the coaching methodology. After coaching, participants’ gaze behaviors generally moved toward the NA medians, and the framework was well received.
