DOI: 10.1145/3379156.3391366

The Perception Engineer’s Toolkit for Eye-Tracking data analysis

Published: 02 June 2020

Abstract

Tools for eye-tracking data analysis are currently either provided as proprietary software by eye-tracker manufacturers or published by researchers under licenses that are problematic for some use cases (e.g., GPL3). This has led to repeated re-implementation of the most basic building blocks, such as event filters, often resulting in incomplete, incomparable, and even erroneous implementations.
The Perception Engineer’s Toolkit is a collection of basic functionality for eye-tracking data analysis, dual-licensed under either CC0 or the MIT license, which allows for easy integration, modification, and extension of the codebase. It includes methods for data import from different formats, signal pre-processing, and quality checking, as well as several event detection algorithms. The processed data can be visualized as a gaze density map or reduced to key metrics of the detected eye movement events. The toolkit is written entirely in Python on top of high-performance matrix libraries and provides easy scripting access for batch-processing large amounts of data.
The code is available at https://bitbucket.org/fahrensiesicher/perceptionengineerstoolkit
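To illustrate the kind of building blocks described above, the sketch below implements a minimal velocity-threshold (I-VT) event filter and a simple gaze density map in plain NumPy/SciPy. This is a hypothetical example, not the toolkit’s actual API: the function names, parameters, and the 100 deg/s default threshold are assumptions chosen only for the illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def ivt_classify(t, x, y, velocity_threshold=100.0):
    """Label gaze samples as 'fixation' or 'saccade' via a velocity threshold.

    t: timestamps in seconds; x, y: gaze position in degrees of visual angle.
    velocity_threshold: angular velocity cut-off in deg/s (assumed default).
    """
    t, x, y = (np.asarray(a, dtype=float) for a in (t, x, y))
    # Sample-to-sample angular velocity; pad with 0 to keep the array length.
    velocity = np.concatenate(([0.0], np.hypot(np.diff(x), np.diff(y)) / np.diff(t)))
    labels = np.where(velocity > velocity_threshold, "saccade", "fixation")

    # Merge consecutive identical labels into (label, start_index, end_index) events.
    events, start = [], 0
    for i in range(1, len(labels)):
        if labels[i] != labels[start]:
            events.append((labels[start], start, i - 1))
            start = i
    events.append((labels[start], start, len(labels) - 1))
    return events

def gaze_density_map(x, y, screen_size=(1920, 1080), sigma=30):
    """Accumulate gaze samples (in pixels) into a Gaussian-smoothed density map."""
    heat, _, _ = np.histogram2d(
        y, x, bins=(screen_size[1], screen_size[0]),
        range=[[0, screen_size[1]], [0, screen_size[0]]],
    )
    return gaussian_filter(heat, sigma=sigma)
```

Because both routines operate on plain arrays, they can be called in a short loop over many recordings, in the same spirit as the batch-processing scripting access the abstract describes.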

Published In

ETRA '20 Short Papers: ACM Symposium on Eye Tracking Research and Applications
June 2020
305 pages
ISBN: 9781450371346
DOI: 10.1145/3379156

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. data analysis
  2. data processing
  3. event detection
  4. eye tracking
  5. gaze analysis
  6. signal processing

Qualifiers

  • Short-paper
  • Research
  • Refereed limited

Conference

ETRA '20

Acceptance Rates

Overall acceptance rate: 69 of 137 submissions (50%)
