Normal hearing assessment of horizontal auditory localization accuracy in a virtual visual environment with free head movement

Published: 11 July 2022 | Version 1 | DOI: 10.17632/nf9pd8vhc5.1
Contributors:
Andrea Gulli

Description

37 right-handed, sighted, normal-hearing volunteers performed an auditory localization task in a virtual visual environment. We investigated the contribution of head movements to localization along the interaural plane in the absence of motor constraints and visual cues, in a virtual scene experienced by individuals wearing a head-mounted display (HMD) while listening to stimuli coming from a circular loudspeaker array. Our results indicate that localization errors progressively contract toward a constant minimum with increasing head motion. The following variables are reported for each trial:

- "Subject": the participant's anonymous id.
- "Condition": 1 = participants could hear and see the loudspeaker array while wearing the HMD on top of their head, so that their vision was not occluded by the device; 2 = participants wore the HMD correctly, so as to be visually immersed in a virtual environment (VE) while still listening to the sounds coming from the loudspeaker array.
- "Group": 1 = performed condition 1 first, then condition 2; 2 = performed condition 2 first, then condition 1.
- "Target": the target angle, in sexagesimal degrees.
- "Signed_error": the difference between the target angle and the pointed angle.
- "Unsigned_error": the absolute difference between the target angle and the pointed angle.
- "Head_position": the distance of the head from the target at the moment the target was hit, in meters.
- "Head_rotation": the difference between the target angle and the head orientation angle at that same moment, in degrees.
- "Head_distance": the distance covered by the head during a single trial, in meters.
- "Head_translation_1" and "Head_translation_2": the velocity (meters per second) and acceleration (meters per squared second) of the head's translation along the x-axis of the frontal plane.
- "Head_rotation_1" and "Head_rotation_2": the angular velocity (degrees per second) and acceleration (degrees per squared second) of the head around the z-axis of the frontal plane.
- "Head_direction_changes" and "Head_rotation_changes": the number of direction changes in the head's translational and rotational movements, respectively.
- "Latency": the latency of a single trial, in seconds.
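As a minimal sketch of how the table can be loaded and explored, assuming the data are distributed as a single CSV file with the column names above (the file name "localization_data.csv" is a placeholder, not the actual file in the Files section):

    import pandas as pd

    # Placeholder file name; substitute the actual file from the Files section.
    df = pd.read_csv("localization_data.csv")

    # Mean and spread of the absolute localization error per condition
    # (1 = HMD lifted, vision unoccluded; 2 = visually immersed in the VE).
    print(df.groupby("Condition")["Unsigned_error"].agg(["mean", "std"]))

    # Relationship between head movement and error: per the description,
    # more head motion should coincide with smaller unsigned errors.
    print(df[["Head_distance", "Head_rotation_1", "Unsigned_error"]].corr())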

Files

Steps to reproduce

37 right-handed, sighted, normal-hearing volunteers (12 male and 25 female; mean age: 31.95 ± 8.07 years) performed an auditory localization task in a virtual visual environment. We used an Oculus Quest 2 (Facebook Reality Labs, Meta Platforms), consisting of an HMD and two handheld controllers, to present the virtual environment and to track the head position and the pointed sound source direction. The acoustic stimulus consisted of a sequence of 200 ms pink noise bursts, each surrounded by two 100 ms tapered linear ramps forming the onset and offset, respectively. Each onset was separated from the previous offset by 200 ms of silence. The sequence of digital bursts, at 16 bits with a 44.1 kHz sampling frequency, was created using Max (Cycling '74) and then sent to a MADIface USB 2.0 audio interface (RME). The stimulus was presented at a sound pressure level (SPL) of 65 ± 1 dB, measured with a calibrated sound level meter (XL2 Sound Level Meter, NTi Audio).
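A minimal sketch of the stimulus construction in Python (NumPy/SoundFile) rather than the original Max patch; the number of bursts per sequence and the assumption that the 100 ms ramps are added around (not included within) the 200 ms burst are ours:

    import numpy as np
    import soundfile as sf

    FS = 44100  # sampling frequency, Hz

    def pink_noise(n):
        # Spectral method: shape white noise with a 1/sqrt(f) magnitude
        # envelope, giving the 1/f power spectrum of pink noise.
        spectrum = np.fft.rfft(np.random.randn(n))
        freqs = np.fft.rfftfreq(n, d=1.0 / FS)
        freqs[0] = freqs[1]  # avoid division by zero at DC
        x = np.fft.irfft(spectrum / np.sqrt(freqs), n)
        return x / np.max(np.abs(x))

    def burst():
        # 100 ms linear onset ramp + 200 ms plateau + 100 ms linear offset ramp.
        ramp, plateau = int(0.1 * FS), int(0.2 * FS)
        env = np.concatenate([np.linspace(0, 1, ramp),
                              np.ones(plateau),
                              np.linspace(1, 0, ramp)])
        return pink_noise(env.size) * env

    silence = np.zeros(int(0.2 * FS))  # 200 ms between offset and next onset
    n_bursts = 5  # bursts per sequence: an assumption, not stated in the source
    sequence = np.concatenate([np.concatenate([burst(), silence])
                               for _ in range(n_bursts)])

    # Write as 16-bit PCM at 44.1 kHz, matching the described digital format.
    sf.write("stimulus.wav", sequence, FS, subtype="PCM_16")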

Institutions

Università degli Studi di Udine

Categories

Audiology, Sensation of Hearing, Virtual Reality, Sound

Licence