Data for: Luminance information is required for the accurate estimation of contrast in rapidly changing visual contexts

Published: 29 January 2020 | Version 1 | DOI: 10.17632/p7xskvwktk.1
Contributors:

Description

Data corresponding to Ketkar, Sporar et al. (2020) are organized by experiment type (behavior and imaging) and by figure number in the manuscript. All data are in MATLAB format.

Filenames of the behavioral data indicate the genotype, the neutral density (ND) filter used and, in some cases (Figure 4D only), the OFF-edge specification. Each data file is a structure array with one element per fly. Fields of this structure contain an array of 5 cells, holding the data for the 5 OFF edges with different background luminances (Figure 4D is the exception, where only the darkest OFF edge was shown). Fields relevant to the analysis described in the manuscript are: instantaneous yaw velocity in rad/s ('acc_dtheta'), forward walking speed in cm/s ('acc_dfwd') and lateral displacement speed in cm/s ('acc_dlat'). Cells in these fields contain 2D matrices in which rows are instances of that particular OFF edge and columns are time points/frame numbers (sampled at 120 Hz, so each frame spans 1/120 s). Each instance is recorded such that the first 0.2 s belong to the preceding inter-stimulus interval, followed by a static background display of 0.5 s, an OFF-edge motion of 0.75 s and finally the next inter-stimulus interval. The field 'simstartsbytype' gives the frame numbers, counted from the beginning of the experiment, at which the stimulus instances of the five OFF edges started; 'stimsignsbytype' indicates whether each of these instances had leftward (-1) or rightward (1) OFF-edge motion. The yaw velocities ('acc_dtheta') are the data that were mainly processed further and plotted in the manuscript figures.

Data from in vivo two-photon calcium or voltage imaging have undergone preliminary processing that includes movement correction, ROI selection and background subtraction. They are arranged so that the top-level folder corresponds to the figure and the subfolders to the individual figure panels and genotypes. Every .mat file carries information about the stimulus as well as the imaged time series. Fields relevant to the analysis described in the manuscript are: time in milliseconds ('stimTimes'), stimulus type ('stim_type'), stimulus epoch in millisecond time ('ch3') and in frames ('fstimval'), and the recorded GCaMP6f or ASAP2f fluorescence for each ROI ('dRatio'). For further clarification, please contact the lead author, Marion Silies (msilies@uni-mainz.de).
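As an illustration only, the behavioral files can also be read outside of MATLAB, for example with SciPy in Python. The sketch below follows the field layout described above; the filename, the handling of the top-level variable name and the assumption of equal trace lengths across flies are placeholders, not part of the dataset specification.

```python
# Illustrative sketch only: read one behavioral .mat file with SciPy and
# average the yaw-velocity traces for one of the five OFF-edge backgrounds.
# Field names and timing follow the dataset description; the filename, the
# top-level variable handling and equal trace lengths are assumptions.
import numpy as np
from scipy.io import loadmat

FRAME_RATE = 120                          # Hz; each frame spans 1/120 s
PRE_ISI, STATIC, MOTION = 0.2, 0.5, 0.75  # s; epochs within each instance

mat = loadmat("example_genotype_ND.mat",  # hypothetical filename
              squeeze_me=True, struct_as_record=False)
# Take the first non-metadata variable; its actual name may differ per file.
varname = next(k for k in mat if not k.startswith("__"))
flies = np.atleast_1d(mat[varname])       # structure array: one element per fly

edge_idx = 0                              # index into the 5-cell field arrays
per_fly_mean = []
for fly in flies:
    trials = np.atleast_2d(fly.acc_dtheta[edge_idx])  # instances x time points
    per_fly_mean.append(trials.mean(axis=0))          # mean yaw trace, rad/s

mean_yaw = np.mean(per_fly_mean, axis=0)   # assumes equal trace lengths
t = np.arange(mean_yaw.size) / FRAME_RATE  # s from trace start (first 0.2 s = pre-ISI)
motion_on = PRE_ISI + STATIC               # 0.7 s: OFF-edge motion onset
print(f"trace covers {t[-1]:.2f} s; OFF-edge motion from "
      f"{motion_on:.2f} to {motion_on + MOTION:.2f} s")
```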
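Similarly, a minimal sketch for the imaging files is given below. Only the field names come from the description above; whether they are stored as top-level variables or inside a single struct, as well as the array orientations, are assumptions made for illustration.

```python
# Illustrative sketch only: read one imaging .mat file and pull the per-ROI
# fluorescence together with the stimulus fields. Only the field names come
# from the description; storing them as top-level variables and the
# ROIs-x-frames orientation of 'dRatio' are assumptions.
import numpy as np
from scipy.io import loadmat

rec = loadmat("example_recording.mat",     # hypothetical filename
              squeeze_me=True, struct_as_record=False)

stim_times = np.asarray(rec["stimTimes"])  # time base in milliseconds
stim_type  = rec["stim_type"]              # stimulus type identifier
epoch_ms   = np.asarray(rec["ch3"])        # stimulus epoch in millisecond time
epoch_frm  = np.asarray(rec["fstimval"])   # stimulus epoch in frames
d_ratio    = np.atleast_2d(rec["dRatio"])  # GCaMP6f/ASAP2f signal per ROI

mean_trace = d_ratio.mean(axis=0)          # average across ROIs
print(stim_type, mean_trace.shape, stim_times.shape,
      epoch_ms.shape, epoch_frm.shape)
```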

Files

Categories

Neuroscience, Drosophila, Physiology of Visual System

Licence