Gaze behaviour on interacted objects during hand interaction in virtual reality for eye tracking calibration

Published: 25 June 2019
DOI: 10.1145/3314111.3319815

Abstract

In this paper, we investigate the probability and timing of attaining gaze fixations on interacted objects during hand interaction in virtual reality, with the main purpose of enabling implicit and continuous eye-tracking re-calibration. We conducted an evaluation with 15 participants in which their gaze was recorded while they interacted with virtual objects. The data was analysed to find the factors that influence the probability of fixations at different phases of interaction and for different object types. The results indicate that 1) interacting with stationary objects may be favourable for attaining fixations compared to moving objects, 2) prolonged and precision-demanding interactions positively influence the probability of attaining fixations, 3) performing multiple interactions simultaneously can negatively impact the probability of fixations, and 4) feedback can initiate and end fixations on objects.
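The use case motivating the study is implicit re-calibration: while the user's hand is manipulating a virtual object, their gaze is likely fixated on that object, so the eye-to-object direction can serve as an opportunistic ground-truth target for correcting tracker drift. Below is a minimal Python sketch of that idea, not the authors' implementation; the class name, thresholds, and constant-offset correction model are illustrative assumptions.

    import numpy as np

    # Assumed gating thresholds (hypothetical values, not from the paper).
    ANGULAR_THRESHOLD_DEG = 3.0   # reject samples too far from the object
    MIN_FIXATION_MS = 100.0       # minimum dwell time to count as a fixation

    def angle_deg(a, b):
        """Angle in degrees between two 3D direction vectors."""
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

    class ImplicitRecalibrator:
        """Collects (measured gaze, assumed target) direction pairs during
        hand interaction and estimates a constant offset that corrects
        subsequent gaze samples."""

        def __init__(self):
            self.samples = []          # list of (gaze_dir, target_dir) pairs
            self.offset = np.zeros(3)  # current additive direction correction

        def on_gaze_sample(self, gaze_dir, eye_pos, object_pos, dwell_ms):
            # Assume the user fixates the object they are manipulating, so
            # the eye-to-object direction is treated as ground truth.
            target_dir = np.asarray(object_pos) - np.asarray(eye_pos)
            target_dir = target_dir / np.linalg.norm(target_dir)
            gaze_dir = np.asarray(gaze_dir)
            if (dwell_ms >= MIN_FIXATION_MS and
                    angle_deg(gaze_dir, target_dir) <= ANGULAR_THRESHOLD_DEG):
                self.samples.append((gaze_dir, target_dir))

        def recalibrate(self):
            # Average the target-minus-measured difference over all samples.
            if self.samples:
                self.offset = np.mean([t - g for g, t in self.samples], axis=0)

        def correct(self, gaze_dir):
            # Apply the learned offset and renormalise to a unit direction.
            corrected = np.asarray(gaze_dir) + self.offset
            return corrected / np.linalg.norm(corrected)

In use, on_gaze_sample would be fed each frame while an interaction is in progress (e.g., while a controller's grab button is held), and recalibrate would be invoked once enough samples have accumulated; the study's findings on when fixations are most probable would inform which interaction phases to sample from.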




      Published In

      ETRA '19: Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications
      June 2019
      623 pages
      ISBN:9781450367097
      DOI:10.1145/3314111

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. calibration
      2. empirical study
      3. eye tracking
      4. hand-eye coordination
      5. virtual reality

      Qualifiers

      • Research-article

      Conference

      ETRA '19

      Acceptance Rates

      Overall Acceptance Rate: 69 of 137 submissions, 50%


      Cited By

      • (2024) GazeSwitch: Automatic Eye-Head Mode Switching for Optimised Hands-Free Pointing. Proceedings of the ACM on Human-Computer Interaction 8, ETRA, 1-20. https://doi.org/10.1145/3655601. Online publication date: 28-May-2024.
      • (2024) GEARS: Generalizable Multi-Purpose Embeddings for Gaze and Hand Data in VR Interactions. Proceedings of the 32nd ACM Conference on User Modeling, Adaptation and Personalization, 279-289. https://doi.org/10.1145/3627043.3659551. Online publication date: 22-Jun-2024.
      • (2024) Eye-Hand Typing: Eye Gaze Assisted Finger Typing via Bayesian Processes in AR. IEEE Transactions on Visualization and Computer Graphics 30, 5, 2496-2506. https://doi.org/10.1109/TVCG.2024.3372106. Online publication date: May-2024.
      • (2024) Instant interaction driven adaptive gaze control interface. Scientific Reports 14, 1. https://doi.org/10.1038/s41598-024-62365-9. Online publication date: 22-May-2024.
      • (2023) A review study on eye-tracking technology usage in immersive virtual reality learning environments. Computers & Education 196, C. https://doi.org/10.1016/j.compedu.2022.104681. Online publication date: 15-Feb-2023.
      • (2022) Gaze-Hand Alignment. Proceedings of the ACM on Human-Computer Interaction 6, ETRA, 1-18. https://doi.org/10.1145/3530886. Online publication date: 13-May-2022.
      • (2022) Exploring Gaze for Assisting Freehand Selection-based Text Entry in AR. Proceedings of the ACM on Human-Computer Interaction 6, ETRA, 1-16. https://doi.org/10.1145/3530882. Online publication date: 13-May-2022.
      • (2022) Robust Unsupervised Gaze Calibration Using Conversation and Manipulation Attention Priors. ACM Transactions on Multimedia Computing, Communications, and Applications 18, 1, 1-27. https://doi.org/10.1145/3472622. Online publication date: 27-Jan-2022.
      • (2022) Weighted Pointer: Error-aware Gaze-based Interaction through Fallback Modalities. IEEE Transactions on Visualization and Computer Graphics 28, 11, 3585-3595. https://doi.org/10.1109/TVCG.2022.3203096. Online publication date: Nov-2022.
      • (2022) A study of button size for virtual hand interaction in virtual environments based on clicking performance. Multimedia Tools and Applications 82, 10, 15903-15918. https://doi.org/10.1007/s11042-022-14038-w. Online publication date: 15-Oct-2022.
