Experimental results of localization of a virtual wall by means of active echolocation by untrained sighted subjects

Published: 6 September 2017| Version 1 | DOI: 10.17632/rr8fpnwysc.1
David Pelegrin Garcia


We provide the raw and processed data of an experiment on "Localization of a virtual wall by means of active echolocation by untrained sighted persons". A total of 21 participants, labelled S1 to S21, each completed 24 trials. In each trial, participants started from a random orientation and had to produce oral sounds and orient themselves towards a virtual wall at 0 degrees and at a distance between 1 and 32 m, based on the auditory feedback alone. We provide the experimental data for each subject (in the zip file experiment_log.zip), the raw recordings of the participants in each trial (raw_audio.zip) and the processed clicks (processed_clicks.zip).

The experimental log package contains files of the kind "SXX.mat". These are Matlab data files that contain a summary of the experimental conditions, orientation data and time stamps for participant SXX, with XX from 1 to 21. Each of these files contains the following variables:

- Angles: cell array of 24 elements (one per trial), each holding the sequence of the participant's orientation at all time stamps
- Distances: the distance of the wall in each of the 24 trials
- SubjName: the string "SXX"; personal data has been removed to respect confidentiality
- TimeStamps: the time instants at which each of the angles was obtained

Note that the output variables "Angle" and "Response Latency" used in the article correspond to the last angle and last time stamp in each of the trials.

The raw audio recordings are a set of files "SXXRunYY.aiff". These are audio files (mono, 16 bit, sampling rate of 44.1 kHz) that correspond to the recording of the microphone signal close to the mouth, for participant XX from 1 to 21 at trial YY from 1 to 24. The processed clicks correspond to the individual clicks extracted from each raw recording, together with their acoustic properties.
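The per-subject log files can be read with standard tools. As a minimal Python sketch using SciPy's `loadmat`, assuming the variable layout described above (the helper names and the `squeeze_me` handling are illustrative, not part of the dataset):

```python
def final_response(angles, timestamps):
    """Return the article's output variables for one trial: the last
    recorded orientation ("Angle") and the last time stamp
    ("Response Latency")."""
    return angles[-1], timestamps[-1]

def load_subject(path):
    """Load one experiment-log file (e.g. "S01.mat", assumed name) and
    return a list of (Angle, Response Latency, wall distance) tuples,
    one per trial."""
    # Requires SciPy; imported here so final_response stays stdlib-only.
    from scipy.io import loadmat
    import numpy as np
    data = loadmat(path, squeeze_me=True)  # collapse MATLAB 1x1 arrays to scalars
    trials = []
    for angles, stamps, dist in zip(data["Angles"], data["TimeStamps"],
                                    data["Distances"]):
        angle, latency = final_response(np.atleast_1d(angles),
                                        np.atleast_1d(stamps))
        trials.append((float(angle), float(latency), float(dist)))
    return trials
```

With the layout described above, `load_subject("S01.mat")` would yield 24 tuples, one per trial.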
There is a file SXXRunYY.mat for each recording SXXRunYY.aiff, with the same numbering of XX and YY for participant and trial, respectively. Each file SXXRunYY.mat stores its data in a struct called "clickList", which contains the following fields:

- audio: cell array of length N (N = number of clicks), where each element contains one separate click in a 52 ms frame
- startSample: index of the first sample of each of the N clicks in the raw audio file
- Leq: Sound Pressure Level of each click, in dB Full Scale; add 85 dB to obtain the Sound Exposure Level
- Tclick: duration in samples
- Tatt: attack time in samples
- B: bandwidth in Hz
- Fmin: minimum frequency within the bandwidth of the click, in Hz
- summary: the average of the previous quantities over the current trial/participant
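As a sketch of how the calibration note above could be applied when reading a processed-clicks file in Python (the +85 dB offset comes from the dataset description; the function names, the example file name and the attribute-style field access are illustrative assumptions):

```python
def sound_exposure_level(leq_dbfs, offset_db=85.0):
    """Convert a click's Leq in dB Full Scale to Sound Exposure Level,
    applying the +85 dB offset stated in the dataset description."""
    return leq_dbfs + offset_db

def load_clicks(path):
    """Load a processed-clicks file (e.g. "S01Run01.mat", assumed name)
    and return the clickList struct, with MATLAB struct fields exposed
    as Python attributes."""
    # Requires SciPy; imported here so the converter above stays stdlib-only.
    from scipy.io import loadmat
    data = loadmat(path, squeeze_me=True, struct_as_record=False)
    return data["clickList"]
```

Under these assumptions, per-click Sound Exposure Levels would be obtained as `[sound_exposure_level(l) for l in load_clicks("S01Run01.mat").Leq]`.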



Associatie KU Leuven, University of Surrey


Acoustics, Echolocation Behavior