Virtual Reality Interview with Feedback Framework for Situational Practice of Gaze Among Autistic Adults
Description
The high unemployment rate among autistic individuals is due in part to mismatches between their social patterns of attention and gaze and society's normative expectations during interviews. To help mitigate this disadvantage through a solo situational practice tool, we developed a virtual reality (VR) job interview simulation together with a self-deliverable gaze behavior coaching step, using the HTC Vive Pro Eye headset. Fourteen autistic individuals used the VR job interview simulation tool; eleven of them completed the coaching step and participated in a second VR simulation session. Preliminary results were positive: participants' gaze behaviors generally approached non-autistic gaze trends, and participants perceived the tool to be useful and the provided feedback to be helpful in daily social interactions.

The dataset contains 14 folders, one per participant. Each folder holds 2 to 4 .csv files, depending on whether the participant completed one or two VR mock job interview sessions. Files named "VR-Data-X.csv" contain the unfiltered gaze data from an interview session. Files named "VR-Data-X-gaze_filtered.csv" contain the same data after gaze filtering, which changes the values in the columns "object gazed at", "forehead gaze (0=No, 1=Yes)", "eyes gaze (0=No, 1=Yes)", "mouth gaze (0=No, 1=Yes)", "total forehead gaze duration (s)", "total forehead gaze %", "total eye gaze duration (s)", "total eye gaze %", "total mouth gaze duration (s)", and "total mouth gaze %". Column titles describe what each column contains. The "gaze location (x)", "gaze location (y)", and "gaze location (z)" columns give the coordinates of the gaze point in meters, measured from the origin of the 3D virtual space along the x, y, and z axes.
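For reference, below is a minimal Python sketch of one way to load and summarize a session's files, assuming pandas and NumPy are available and that the column names match the list above. The participant folder name and the session index (the "X" in the file names) are hypothetical placeholders, since the exact folder naming scheme is not specified here.

    from pathlib import Path

    import numpy as np
    import pandas as pd

    # Hypothetical placeholders: adjust to the actual folder name and session index.
    participant_dir = Path("Participant-01")
    session = 1  # the "X" in the file names

    raw = pd.read_csv(participant_dir / f"VR-Data-{session}.csv")
    filtered = pd.read_csv(participant_dir / f"VR-Data-{session}-gaze_filtered.csv")

    # Fraction of samples gazing at each facial region, from the filtered 0/1 flag columns.
    for region, col in [("forehead", "forehead gaze (0=No, 1=Yes)"),
                        ("eyes", "eyes gaze (0=No, 1=Yes)"),
                        ("mouth", "mouth gaze (0=No, 1=Yes)")]:
        print(f"{region}: {filtered[col].mean():.1%} of samples")

    # Euclidean distance (m) of each gaze point from the origin of the virtual space,
    # computed from the per-axis coordinate columns.
    coords = filtered[["gaze location (x)", "gaze location (y)", "gaze location (z)"]]
    filtered["gaze distance (m)"] = np.sqrt((coords ** 2).sum(axis=1))
    print(filtered["gaze distance (m)"].describe())

Comparing the same columns between the raw and the gaze-filtered file of a session shows the effect of the gaze filtering step.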