NUIG_EyeGaze01 (Labelled eye gaze dataset)

Published: 27 February 2019 | Version 1 | DOI: 10.17632/cfm4d9y7bh.1
Contributors:
Anuradha Kar, Peter Corcoran

Description

The NUIG_EyeGaze01 (Labelled eye gaze dataset) is a rich and diverse gaze dataset, built from eye gaze data collected in experiments under a wide range of operating conditions on three user platforms (desktop, laptop, tablet). Gaze data is collected under one condition at a time. The dataset includes gaze (fixation) data collected under 17 different head poses, 4 user distances, 6 platform poses and 3 display screen sizes and resolutions.

Each gaze data file is labelled with the operating condition under which it was collected and has the name format: USERNUMBER_CONDITION_PLATFORM.CSV

CONDITION:
RP - Roll plus, in degrees
PP - Pitch plus, in degrees
YP - Yaw plus, in degrees
RM - Roll minus, in degrees
PM - Pitch minus, in degrees
YM - Yaw minus, in degrees
50, 60, 70, 80 - User distances

PLATFORM:
desk - Desktop, lap - Laptop, tab - Tablet

Desktop display: 22 inch, 1680 x 1050 pixels
Laptop display: 14 inch, 1366 x 768 pixels
Tablet display: 10.1 inch, 1920 x 800 pixels
Eye tracker accuracy: 0.5 degrees (for neutral head and tracker position)

The dataset has 3 folders, "Desktop", "Laptop" and "Tablet", containing gaze data from the respective platforms.
- The Desktop folder has 2 sub-folders, user_distance and head_pose, holding data for different user distances and head poses (neutral, roll, pitch, yaw) measured with the desktop setup.
- The Tablet folder has 2 sub-folders, user_distance and tablet_pose, holding data for different user distances and tablet+tracker poses (neutral, roll, pitch, yaw) measured with the tablet setup.
- The Laptop folder has one sub-folder, user_distance, holding data for different user distances measured with the laptop setup.

All data files are in CSV format. Each file contains the following header fields:
("TIM REL", "GTX", "GTY", "XRAW", "YRAW", "GT Xmm", "GT Ymm", "Xmm", "Ymm", "YAW GT", "YAW DATA", "PITCH GT", "PITCH DATA", "GAZE GT", "GAZE ANG", "DIFF GZ", "AOI_IND", "AOI_X", "AOI_Y", "MEAN_ERR", "STD ERR")

The meanings of the header fields are as follows:
TIM REL: relative time stamp for each gaze data point (measured during data collection)
GTX, GTY: ground truth x, y positions in pixels
XRAW, YRAW: raw gaze data x, y coordinates in pixels
GT Xmm, GT Ymm: ground truth x, y positions in mm
Xmm, Ymm: gaze x, y positions in mm
YAW GT, YAW DATA: ground truth and estimated yaw angles
PITCH GT, PITCH DATA: ground truth and estimated pitch angles
GAZE GT, GAZE ANG: ground truth and estimated gaze angles
DIFF GZ: gaze angular accuracy
AOI_IND, AOI_X, AOI_Y: index of the stimuli locations and their x, y coordinates
MEAN_ERR, STD ERR: mean and standard deviation of error at the stimuli locations

For more details on the purpose of this dataset and the data collection method, please consult the paper by the authors of this dataset: Anuradha Kar, Peter Corcoran: Performance Evaluation Strategies for Eye Gaze Estimation Systems with Quantitative Metrics and Visualizations. Sensors 18(9): 3151 (2018).
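To make the naming scheme concrete, below is a minimal Python sketch (an illustration, not part of the dataset) that parses the USERNUMBER_CONDITION_PLATFORM.CSV convention and walks the platform folders. The example filenames (e.g. 01_RP20_desk.csv, 03_60_tab.csv) and the dataset root folder name are assumptions, since the exact strings are not spelled out above.

```python
import re
from pathlib import Path

# Condition and platform codes from the dataset description.
POSE_CODES = {
    "RP": "roll plus", "RM": "roll minus",
    "PP": "pitch plus", "PM": "pitch minus",
    "YP": "yaw plus", "YM": "yaw minus",
}
PLATFORMS = {"desk": "Desktop", "lap": "Laptop", "tab": "Tablet"}

# Hypothetical filename examples: "01_RP20_desk.csv" (user 01, roll +20 degrees,
# desktop) or "03_60_tab.csv" (user 03, user distance 60, tablet).
FILENAME_RE = re.compile(
    r"(?P<user>\d+)_(?P<condition>[A-Za-z]*\d+)_(?P<platform>desk|lap|tab)\.csv$",
    re.IGNORECASE,
)

def parse_gaze_filename(path):
    """Split USERNUMBER_CONDITION_PLATFORM.CSV into its three parts."""
    m = FILENAME_RE.search(Path(path).name)
    if m is None:
        raise ValueError(f"Unexpected filename: {path}")
    condition = m.group("condition")
    pose = POSE_CODES.get(condition[:2].upper())
    if pose is not None:
        label = f"{pose} {condition[2:]} degrees"   # head/platform pose condition
    else:
        label = f"user distance {condition}"        # 50, 60, 70 or 80 condition
    return {
        "user": int(m.group("user")),
        "condition": label,
        "platform": PLATFORMS[m.group("platform").lower()],
    }

# Walk the three platform folders; "NUIG_EyeGaze01" as the root folder is an assumption.
for csv_path in sorted(Path("NUIG_EyeGaze01").rglob("*.csv")):
    print(csv_path, parse_gaze_filename(csv_path))
```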

Files

Steps to reproduce

The gaze data files can be used directly for analysis by loading the values from the individual columns. For any queries, contact: a.kar2@nuigalway.ie, anuradha.kar49@gmail.com, peter.corcoran@nuigalway.ie
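As an illustration, here is a minimal sketch of such an analysis, assuming Python with pandas and a hypothetical file path; the column names are taken from the header list in the description, and DIFF GZ is assumed to be in degrees (in line with the 0.5 degree tracker accuracy).

```python
import pandas as pd

# Hypothetical example file; substitute any CSV from the Desktop/Laptop/Tablet folders.
df = pd.read_csv("Desktop/user_distance/01_60_desk.csv")

# Pixel-level gaze error: raw estimates (XRAW, YRAW) vs. ground truth (GTX, GTY).
pixel_err = ((df["XRAW"] - df["GTX"]) ** 2 + (df["YRAW"] - df["GTY"]) ** 2) ** 0.5
print("Mean pixel error:", pixel_err.mean())

# Gaze angular accuracy is provided per sample in the DIFF GZ column.
print("Mean gaze angular error:", df["DIFF GZ"].mean())
print("Std of gaze angular error:", df["DIFF GZ"].std())

# Per-stimulus summary using the AOI index and the precomputed per-AOI statistics.
print(df.groupby("AOI_IND")[["DIFF GZ", "MEAN_ERR", "STD ERR"]].mean())
```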

Institutions

National University of Ireland Galway

Categories

Computer Vision, Human-Computer System Performance Evaluation, Consumer Electronics, Machine Learning, Machine Learning Algorithm, Cognitive Science, Augmented Reality, Virtual Reality, Clustering, Eye, Cognitive Vision, Information Classification, Human Machine Interaction, Vision, Modelling, Performance Evaluation, Model Evaluation

Licence