Using Eye-tracking Technology to Identify Learning Styles: A Quasi-experiment

Published: 6 August 2019 | Version 1 | DOI: 10.17632/w7rb2h98jv.1
Contributor:
Zhanni Luo

Description

This dataset presents the gaze-path and fixation data generated by different types of learners while viewing picture-and-text materials with different arrangements. The learning styles in this study are based on the Felder–Silverman Learning Style Model (FSLSM), in which learners are categorized along four dimensions: Active/Reflective, Sensing/Intuitive, Visual/Verbal and Global/Sequential.

Files

Steps to reproduce

Experiment:
1. Open the file “1 Experiment Materials” and import the materials into the Tobii eye-tracking system.
2. Mark Areas of Interest (AOIs) with the tool provided by the Tobii eye-tracker. The AOI areas are defined in the document “Coding system”.
3. Open the file “2 Forms: answer sheet, ILS scoring sheet and ILS questionnaire” and print the forms.
4. Recruit participants and start the experiment.
5. Start the eye-tracking session and ask participants to fill in “Document 2 Answer Sheet” from the file “2 Forms: answer sheet, ILS scoring sheet and ILS questionnaire” while reading.
6. After the eye-tracking session, give participants “Document 3 ILS Scoring Sheet” and “Document 4 ILS Questionnaire” and ask them to complete the learning-style test.

Data analysis:
7. Export the fixation data from the Tobii eye-tracker and record the fixation data within the defined AOIs.
8. Analyze the fixation data.
9. Export the gaze-path images from the Tobii eye-tracker.
10. Predict each participant's learning style from their gaze path (in this dataset, the gaze-path images are in the file “4 Raw Data: gaze path”) and record the result.
11. Analyze the ILS survey responses collected in step 6 and record the result.
12. Compare the predictions with the ILS responses, using the ILS responses as the ground truth, and calculate the accuracy of identifying the four learning-style dimensions with eye-tracking technology.
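Step 12 can be sketched as a simple per-dimension accuracy calculation. The following is a minimal illustration, not part of the published materials: the participant IDs, pole labels, and dictionary layout are hypothetical placeholders, and the actual analysis may have been done differently.

```python
# Hypothetical sketch of step 12: compare gaze-path predictions against
# ILS responses (treated as ground truth) and compute per-dimension accuracy.
# All participant data below are invented placeholders.

FSLSM_DIMENSIONS = ["Active/Reflective", "Sensing/Intuitive",
                    "Visual/Verbal", "Global/Sequential"]

def accuracy_per_dimension(predictions, ils_results):
    """Fraction of participants whose predicted pole matches the ILS pole,
    computed separately for each FSLSM dimension."""
    accuracy = {}
    for dim in FSLSM_DIMENSIONS:
        matches = sum(1 for pid in predictions
                      if predictions[pid][dim] == ils_results[pid][dim])
        accuracy[dim] = matches / len(predictions)
    return accuracy

# Two invented participants, for illustration only.
predictions = {
    "P1": {"Active/Reflective": "Active", "Sensing/Intuitive": "Sensing",
           "Visual/Verbal": "Visual", "Global/Sequential": "Global"},
    "P2": {"Active/Reflective": "Reflective", "Sensing/Intuitive": "Sensing",
           "Visual/Verbal": "Verbal", "Global/Sequential": "Sequential"},
}
ils_results = {
    "P1": {"Active/Reflective": "Reflective", "Sensing/Intuitive": "Sensing",
           "Visual/Verbal": "Visual", "Global/Sequential": "Global"},
    "P2": {"Active/Reflective": "Reflective", "Sensing/Intuitive": "Intuitive",
           "Visual/Verbal": "Verbal", "Global/Sequential": "Sequential"},
}

acc = accuracy_per_dimension(predictions, ils_results)
for dim, value in acc.items():
    print(f"{dim}: {value:.2f}")
```

Reporting accuracy per dimension (rather than one pooled figure) matches step 12's goal of evaluating how well eye tracking identifies each of the four learning styles separately.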

Institutions

University of Canterbury

Categories

Applied Psychology, Education, Human-Machine Interface

Licence