Research article · ETRA Conference Proceedings · DOI: 10.1145/3314111.3319836

A novel gaze event detection metric that is not fooled by gaze-independent baselines

Published: 25 June 2019

Abstract

Eye movement classification algorithms are typically evaluated either in isolation (in terms of absolute values of some performance statistic), or in comparison to previously introduced approaches. In contrast to this, we first introduce and thoroughly evaluate a set of both random and above-chance baselines that are completely independent of the eye tracking signal recorded for each considered individual observer. Surprisingly, our baselines often show performance that is either comparable to, or even exceeds the scores of some established eye movement classification approaches, for smooth pursuit detection in particular. In these cases, it may be that (i) algorithm performance is poor, (ii) the data set is overly simplistic with little inter-subject variability of the eye movements, or, alternatively, (iii) the currently used evaluation metrics are inappropriate. Based on these observations, we discuss the level of stimulus dependency of the eye movements in four different data sets. Finally, we propose a novel measure of agreement between true and assigned eye movement events, which, unlike existing metrics, is able to reveal the expected performance gap between the baselines and dedicated algorithms.
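
To make the abstract's core idea concrete, the sketch below illustrates (i) a gaze-independent baseline that assigns per-sample labels purely from fixed dataset-level class priors, never reading the individual gaze recording, and (ii) a simple event-level F1 score that matches ground-truth and predicted events of one class by temporal overlap. This is not the authors' implementation nor their proposed agreement measure; the label codes, priors, and the 0.5 overlap threshold are assumptions made up for this illustration.

```python
# Minimal, hypothetical sketch (not the paper's code or its proposed metric).
import numpy as np

FIX, SAC, SP = 0, 1, 2  # hypothetical codes: fixation, saccade, smooth pursuit


def random_baseline(n_samples, priors=(0.7, 0.1, 0.2), seed=0):
    """Label every sample by sampling from dataset-level priors only,
    i.e. completely independent of the recorded gaze signal."""
    rng = np.random.default_rng(seed)
    return rng.choice([FIX, SAC, SP], size=n_samples, p=priors)


def to_events(labels):
    """Collapse a per-sample label sequence into (label, start, end) events."""
    events, start = [], 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[start]:
            events.append((labels[start], start, i))
            start = i
    return events


def _hit_rate(ref_events, other_events, min_overlap):
    """Fraction of ref events overlapped by some other event for at least
    min_overlap of the ref event's own duration."""
    if not ref_events:
        return 0.0
    hits = 0
    for _, rs, re_ in ref_events:
        if any(min(re_, oe) - max(rs, os_) >= min_overlap * (re_ - rs)
               for _, os_, oe in other_events):
            hits += 1
    return hits / len(ref_events)


def event_level_f1(true_labels, pred_labels, label, min_overlap=0.5):
    """Event-level F1 for one class with symmetric overlap-based matching."""
    true_ev = [e for e in to_events(true_labels) if e[0] == label]
    pred_ev = [e for e in to_events(pred_labels) if e[0] == label]
    recall = _hit_rate(true_ev, pred_ev, min_overlap)
    precision = _hit_rate(pred_ev, true_ev, min_overlap)
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0


# Synthetic example: one long smooth-pursuit episode in the "ground truth".
truth = np.array([FIX] * 200 + [SAC] * 10 + [SP] * 300 + [FIX] * 200)
baseline = random_baseline(len(truth))
print("sample-level agreement:", np.mean(truth == baseline))          # far above zero
print("event-level SP F1:     ", event_level_f1(truth, baseline, SP))  # close to zero
```

On fixation-dominated data, such a baseline can look deceptively strong under plain sample-level agreement, whereas an event-level overlap score of this kind penalises its short, misplaced events; this is the spirit, though not the exact formulation, of the agreement measure the paper proposes.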

Cited By

  • (2022) Evaluating Eye Movement Event Detection: A Review of the State of the Art. Behavior Research Methods 55(4), 1653-1714. DOI: 10.3758/s13428-021-01763-7. Online publication date: 17-Jun-2022.
  • (2022) RETRACTED ARTICLE: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior Research Methods 55(1), 364-416. DOI: 10.3758/s13428-021-01762-8. Online publication date: 6-Apr-2022.
  • (2020) Evaluating three approaches to binary event-level agreement scoring. A reply to Friedman (2020). Behavior Research Methods 53(1), 325-334. DOI: 10.3758/s13428-020-01425-0. Online publication date: 23-Jul-2020.

    Published In

    ETRA '19: Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, June 2019, 623 pages. ISBN: 9781450367097. DOI: 10.1145/3314111.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. event detection
    2. eye movement classification
    3. random baseline

    Qualifiers

    • Research-article

    Funding Sources

    • Bavarian State Ministry of Science and the Arts

    Conference

    ETRA '19

    Acceptance Rates

    Overall Acceptance Rate: 69 of 137 submissions, 50%
