Gaze recording - Free observation of human faces

Published: 13 Jan 2020 | Version 2 | DOI: 10.17632/3kn4jdd4kf.2
Contributor(s): Virginio Cantoni, Chiara Galdi, Michele Nappi, Marco Porta, Daniel Riccio, Luigi De Maio, Riccardo Distasi

Description of this data

(From the paper: V. Cantoni, C. Galdi, M. Nappi, M. Porta, D. Riccio, "GANT: Gaze ANalysis Technique for Human Identification". Pattern Recognition, Vol. 48, No. 4, 2015, pp. 1027-1038)

A total of 112 volunteer observers (73 males and 39 females) took part in the trials, subdivided into the following age groups: 17–18 (11 participants), 21–30 (58), 31–40 (9), 41–50 (16), 51–60 (8), 61–70 (9), and 71–80 (1). All participants reported normal or corrected-to-normal vision. Before the experiments, participants were informed that images, of an unspecified kind, would appear on the eye tracker's display in full-screen mode.

Data were acquired through the Tobii 1750 eye tracker (1280×1024 screen resolution, 50 Hz sampling frequency), using human face images as experimental stimuli. The Tobii ClearView gaze recording software was used to define stimuli and record gaze data.

The images were interleaved with blank white screens showing a cross at the center, to ensure a common starting location for stimulus exploration. The first blank screen was displayed for 5 s, the others for 3 s. The images were presented in random order.

Behind the eye tracker was a wall painted in neutral gray, and the room illumination was uniform and constant. All images had similar gray-level distributions. Each test was preceded by a calibration procedure lasting about 10 s, which consisted of following a moving circle on the screen. Participants were instructed to look at the cross whenever the blank screen was displayed, and to look freely wherever they wanted while the images were presented. Each stimulus image was shown for 10 s.

Sixteen gray-level pictures (s1–s16) containing close-up faces of eight males and eight females were employed. Four males and four females were famous people; the others were unknown to the observers. Two extra images (s17 and s18) depicting landscapes were also included.
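From the timing figures above, the approximate length of a single test can be worked out. A minimal sketch, assuming (this is not stated explicitly) that every one of the 18 stimuli is preceded by its own blank cross screen:

```python
# Back-of-the-envelope length of one test under the protocol above.
# Assumption: each of the 18 stimuli is preceded by a blank screen
# (the first shown for 5 s, the rest for 3 s); the ~10 s calibration
# is not counted here.
n_images = 18            # s1-s16 faces plus s17, s18 landscapes
first_blank_s = 5        # first blank screen
other_blank_s = 3        # remaining blank screens
stimulus_s = 10          # display time per image

# Tobii 1750 samples at 50 Hz, so each 10 s stimulus yields 500 gaze samples
samples_per_stimulus = 50 * stimulus_s

total_s = first_blank_s + (n_images - 1) * other_blank_s + n_images * stimulus_s
print(total_s, samples_per_stimulus)  # 236 s per test, 500 samples per image
```

This puts each test at roughly four minutes of recording, excluding calibration.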

A first set S1 of tests was carried out in 2012 with 88 participants. Of these, 36 were involved in a second session (with the same images), and 16 others in a third session. One hundred forty tests were thus carried out across the three sessions. Time intervals between the first and second sessions, and between the second and third, ranged from a minimum of 5 days to a maximum of 9 days.

A second set S2 of tests, with 34 participants, was carried out one year after S1, in 2013. Ten observers in this group had also been involved in S1. Three sessions were organized: 17 of the 34 observers took part in a second session (nine of whom had participated in S1), and 13 in a third session (six of whom had participated in S1). Sixty-four tests were thus carried out across the three sessions. Time intervals between the first and second sessions, and between the second and third, ranged from a minimum of one day to a maximum of 21 days.

Experiment data files

  • setS1

    The zipped file 'setS1.zip' contains a folder with the files from the first round of experiments (first year). The 'data' folder contains eye data for the three experiment sessions of the first year: 'round1', 'round2', and 'round3'. The names of the folders within 'round1', 'round2', and 'round3' are the identifiers (numbers) of the individual tests. The Excel file 'participantsS1.xlsx' provides details about the participants; in particular, for each tester it indicates which tests he or she participated in (folders in 'round1', 'round2', and 'round3'). Each test folder contains three text files: a GZD file with raw gaze samples (50 per second, matching the eye tracker's 50 Hz sampling frequency), an FXD file with fixations, and an EVD file with events. The GZD files also record the time at which each image was displayed. An Excel file contains the same data subdivided into sheets. Please refer to https://docplayer.net/8164129-User-manual-tobii-eye-tracker-clearview-analysis-software-february-2006-copyright-tobii-technology-ab.html for a description of the various kinds of recorded eye data. The following folders are also present: (1) 'pictures', with the images used in the experiments; (2) 'samplegazeplots', with the gaze plots of three tests; (3) 'samplegazereplays', with the gaze replays (videos) of three tests; and (4) 'samplehotspots', with the heatmaps of three tests.

  • setS2

    The zipped file 'setS2.zip' contains a folder with the files from the second round of experiments (second year). The 'data' folder contains eye data for the three experiment sessions: 'round1', 'round2', and 'round3'. The Excel file 'participantsS2.xlsx' provides details about participants.
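After unzipping either archive, the 'data' folder layout described above can be indexed with a short script. This is a sketch under the assumptions that the structure is 'data/round{1,2,3}/<test-id>/' and that the per-test files can be recognized by the strings GZD, FXD, and EVD in their names; the exact file names are not verified against the archives.

```python
# Sketch: index the test folders of setS1 or setS2 after unzipping.
# Assumed layout (from the description above):
#   data/round1/<test-id>/, data/round2/<test-id>/, data/round3/<test-id>/
# with GZD (gaze samples), FXD (fixations) and EVD (events) text files
# inside each test folder.
from pathlib import Path

def index_tests(data_dir):
    """Map (round name, test id) -> {'GZD': path, 'FXD': path, 'EVD': path}."""
    index = {}
    for round_dir in sorted(Path(data_dir).glob("round*")):
        for test_dir in sorted(p for p in round_dir.iterdir() if p.is_dir()):
            files = {}
            for f in test_dir.iterdir():
                for kind in ("GZD", "FXD", "EVD"):
                    if kind in f.name.upper():
                        files[kind] = f
            index[(round_dir.name, test_dir.name)] = files
    return index
```

For example, `index_tests("setS1/data")` would return one entry per test, which can then be cross-referenced with 'participantsS1.xlsx' to group tests by participant.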

Latest version

  • Version 2

    Published: 2020-01-13

    DOI: 10.17632/3kn4jdd4kf.2

    Cite this dataset

    Cantoni, Virginio; Galdi, Chiara; Nappi, Michele; Porta, Marco; Riccio, Daniel; De Maio, Luigi; Distasi, Riccardo (2020), “Gaze recording - Free observation of human faces”, Mendeley Data, v2 http://dx.doi.org/10.17632/3kn4jdd4kf.2

Previous versions

  • Version 1 (unavailable)

    2020-01-10

Institutions

Università degli Studi di Salerno, Università degli Studi di Pavia, Università degli Studi di Napoli Federico II

Categories

Eye, Signal Tracking

Licence

CC BY 4.0

The files associated with this dataset are licensed under a Creative Commons Attribution 4.0 International licence.

You can share, copy, and modify this dataset as long as you give appropriate credit, provide a link to the CC BY licence, and indicate if changes were made. You may not do so in a way that suggests the rights holder has endorsed you or your use of the dataset. Note that further permission may be required for any content within the dataset that is identified as belonging to a third party.
