Sensory Modality Influence on Human Reinforcement Learning: Different Response Time but Consistent Performance - Data and Stimuli

Published: 4 December 2023 | Version 1 | DOI: 10.17632/y9pfcs6ptx.1
Contributor:
Merle Fairhurst

Description

This online study investigated human learning behaviour across the visual, auditory, and haptic sensory modalities in a probabilistic selection task performed on computers and mobile devices. We examined reaction time (as an indicator of confidence), learning speed, and task accuracy. The haptic version of the task yielded the fastest reaction times, reinforcing the notion of heightened perceptual confidence in haptics; visual stimuli elicited the slowest reaction times, and auditory responses occupied an intermediate position. Despite these differences in reaction time, all modalities showed a striking consistency in both learning speed and task accuracy.

The experiment was conducted on the online platform Gorilla. The experimental data acquired during the study are included in this dataset and organised into three parts: the first block of the training phase, all blocks of the training phase, and the evaluation phase. Auditory and visual stimuli are included in this dataset, whilst the haptic stimuli are listed as JavaScript code in the paper. For each sensory modality, eight different stimuli were created: two are used only in the practice phase, and six are used in the training and evaluation phases (details are described in the paper).
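
To illustrate one way the data might be used, here is a minimal analysis sketch, not the authors' code: it loads a hypothetical per-trial CSV export of the evaluation phase and compares mean reaction time and accuracy across the three modalities. The file name ("evaluation_phase.csv") and the column names ("modality", "reaction_time_ms", "correct") are assumptions for illustration, not the dataset's documented schema.

import pandas as pd

# Load per-trial data (hypothetical file name and column layout;
# consult the dataset files and the paper for the actual schema).
trials = pd.read_csv("evaluation_phase.csv")

# Mean reaction time and accuracy per sensory modality,
# sorted from fastest to slowest responses.
summary = trials.groupby("modality").agg(
    mean_rt_ms=("reaction_time_ms", "mean"),
    accuracy=("correct", "mean"),
    n_trials=("correct", "size"),
)
print(summary.sort_values("mean_rt_ms"))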

Categories

Sensation of Hearing, Reinforcement Learning, Learning, Haptics, Multimodality, Vision

Funding

Deutsche Forschungsgemeinschaft

Bayerisches Forschungsinstitut für Digitale Transformation
