DOI: 10.1145/3314111.3319841

Towards a low cost and high speed mobile eye tracker

Published: 25 June 2019

Abstract

Despite recent developments in eye tracking technology, mobile eye trackers (ETs) are still expensive devices limited to a few hundred samples per second. High-speed ETs (closer to 1 kHz) can provide improved flexibility for data filtering and more reliable event detection. To address these challenges, we present the Stroboscopic Catadioptric Eye Tracking (SCET) system, a novel approach for mobile ET based on rolling shutter cameras and stroboscopic structured infrared lighting. SCET proposes a geometric model where the cornea acts as a spherical mirror in a catadioptric system, changing the projection as it moves. Calibration methods for the geometry of the system and for gaze estimation are presented. Instead of tracking common eye features, such as the pupil center, we track multiple glints on the cornea. By carefully adjusting the camera exposure and the lighting period, we show how one image frame can be divided into several bands to increase the temporal resolution of the gaze estimates. We assess the model in a simulated environment and also describe a prototype implementation that demonstrates the feasibility of SCET, which we envision as a step further in the direction of a mobile, robust, affordable, and high-speed eye tracker.
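The band-timing idea in the abstract can be made concrete with a short back-of-the-envelope calculation: with a rolling shutter, rows are exposed sequentially, so each short infrared strobe brightens only the band of rows whose exposure window overlaps the flash, and each band carries its own set of corneal glints, i.e. its own gaze sample. The Python sketch below illustrates this relationship between frame rate, row readout time, and strobe period; the function name, parameter names, and the 90 fps / 1080-row example values are illustrative assumptions, not the parameters of the authors' prototype.

# Minimal sketch (assumed, not the paper's implementation): how many
# stroboscopic bands fit in one rolling-shutter frame, and the resulting
# effective gaze sampling rate.

def effective_gaze_rate(frame_rate_hz, rows, row_readout_us, strobe_period_us):
    """Return (bands per frame, effective gaze-sample rate in Hz).

    frame_rate_hz    : frames per second delivered by the sensor
    rows             : number of sensor rows read out per frame
    row_readout_us   : time to read one row, in microseconds
    strobe_period_us : time between consecutive IR flashes, in microseconds
    """
    # Total time spent reading the frame, row by row.
    frame_readout_us = rows * row_readout_us
    # Each flash that fits inside the readout produces one bright band,
    # and each band yields an independent gaze estimate.
    bands = int(frame_readout_us // strobe_period_us)
    return bands, bands * frame_rate_hz

if __name__ == "__main__":
    # Hypothetical example: 90 fps sensor, 1080 rows, ~10 us per row,
    # strobed every 1.2 ms -> 9 bands per frame, ~810 gaze samples/s.
    bands, rate = effective_gaze_rate(90.0, 1080, 10.0, 1200.0)
    print(f"{bands} bands/frame -> {rate:.0f} gaze samples/s")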

Cited By

  • Best low-cost methods for real-time detection of the eye and gaze tracking. i-com 23, 1 (2024), 79-94. https://doi.org/10.1515/icom-2023-0026
  • Deep Neural Networks for Low-Cost Eye Tracking. Procedia Computer Science 176 (2020), 685-694. https://doi.org/10.1016/j.procs.2020.09.041

Published In

ETRA '19: Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications
June 2019
623 pages
ISBN: 9781450367097
DOI: 10.1145/3314111

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. catadioptric system
  2. mobile eye-tracking
  3. rolling shutter
  4. stroboscopic lighting

Qualifiers

  • Research-article

Funding Sources

  • Fundação de Amparo à Pesquisa do Estado de São Paulo

Conference

ETRA '19

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%
