EyeSpyVR: Interactive Eye Sensing Using Off-the-Shelf, Smartphone-Based VR Headsets

Published: 05 July 2018

Abstract

    Low-cost virtual reality (VR) headsets powered by smartphones are becoming ubiquitous. Their unique position on the user's face opens interesting opportunities for interactive sensing. In this paper, we describe EyeSpyVR, a software-only eye sensing approach for smartphone-based VR that uses a phone's front-facing camera as a sensor and its display as a passive illuminator. Our proof-of-concept system, built on a commodity Apple iPhone, enables four sensing modalities typically found only in high-end or specialty VR headsets: detecting when the VR headset is worn, detecting blinks, recognizing the wearer's identity, and coarse gaze tracking. We demonstrate the utility and accuracy of EyeSpyVR in a series of studies with 70 participants, finding a worn detection accuracy of 100%, a blink detection rate of 95.3%, a family user identification accuracy of 81.4%, and a mean gaze tracking error of 10.8° when calibrated to the wearer (12.9° without calibration). These sensing abilities can be used by developers to enable new interactive features and more immersive VR experiences on existing, off-the-shelf hardware.
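    The full text describes the actual pipeline; purely as an illustration of the kind of signal the abstract alludes to (and not the authors' method), the sketch below flags a blink by thresholding the fraction of dark pixels in a grayscale eye-region crop: an open eye contributes dark iris/pupil pixels, while a closed lid is mostly skin. All function names and threshold values here are hypothetical.

```python
import numpy as np

def blink_score(eye_region: np.ndarray, dark_thresh: int = 60) -> float:
    """Fraction of pixels darker than dark_thresh in a grayscale eye crop.

    An open eye shows the dark iris/pupil, so the fraction is relatively
    high; a closed lid is mostly skin, so the fraction collapses.
    """
    return float((eye_region < dark_thresh).mean())

def is_blink(eye_region: np.ndarray, blink_thresh: float = 0.02) -> bool:
    # Flag a blink when almost no dark (iris/pupil) pixels remain.
    return blink_score(eye_region) < blink_thresh

# Synthetic demo: an "open eye" crop containing a dark pupil disc,
# versus a uniformly skin-toned "closed eye" crop.
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
open_eye = np.full((h, w), 180, dtype=np.uint8)
open_eye[(yy - 32) ** 2 + (xx - 32) ** 2 < 10 ** 2] = 30   # dark pupil
closed_eye = np.full((h, w), 180, dtype=np.uint8)          # lid only

print(is_blink(open_eye), is_blink(closed_eye))  # prints: False True
```

    A real system would first locate the eye region in the camera frame and smooth the per-frame score over time; this toy version only demonstrates why lid closure is separable from an open eye in raw pixel intensities.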

    Supplementary Material

    ahuja (ahuja.zip)
    Supplemental movie, appendix, image, and software files for EyeSpyVR: Interactive Eye Sensing Using Off-the-Shelf, Smartphone-Based VR Headsets




        Published In

        Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Volume 2, Issue 2
        June 2018
        741 pages
        EISSN:2474-9567
        DOI:10.1145/3236498
        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Publication History

        Published: 05 July 2018
        Accepted: 01 April 2018
        Revised: 01 April 2018
        Received: 01 February 2018
        Published in IMWUT Volume 2, Issue 2


        Author Tags

        1. VR
        2. blink detection
        3. eye tracking
        4. gaze tracking
        5. periocular biometrics
        6. personalized service delivery on VR
        7. user identification

        Qualifiers

        • Research-article
        • Research
        • Refereed

        Cited By
        • (2024) Recent Trends of Authentication Methods in Extended Reality: A Survey. Applied System Innovation 7, 3, Article 45. DOI: 10.3390/asi7030045. Online publication date: 28-May-2024.
        • (2024) Ecological Validity and the Evaluation of Avatar Facial Animation Noise. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 72-79. DOI: 10.1109/VRW62533.2024.00019. Online publication date: 16-Mar-2024.
        • (2024) A Systematic Review of Human Activity Recognition Based on Mobile Devices: Overview, Progress and Trends. IEEE Communications Surveys & Tutorials 26, 2, 890-929. DOI: 10.1109/COMST.2024.3357591. Online publication date: Oct-2025.
        • (2023) Immersive Experiences and XR: A Game Engine or Multimedia Streaming Problem? SMPTE Motion Imaging Journal 132, 5, 30-37. DOI: 10.5594/JMI.2023.3269752. Online publication date: Jun-2023.
        • (2023) Responsibly Strategizing with the Metaverse: Business Implications and DEI Opportunities and Challenges. SSRN Electronic Journal. DOI: 10.2139/ssrn.4430550. Online publication date: 2023.
        • (2023) IMUPoser: Full-Body Pose Estimation using IMUs in Phones, Watches, and Earbuds. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-12. DOI: 10.1145/3544548.3581392. Online publication date: 19-Apr-2023.
        • (2023) Predicting Future Eye Gaze Using Inertial Sensors. IEEE Access 11, 67482-67497. DOI: 10.1109/ACCESS.2023.3292411. Online publication date: 2023.
        • (2023) Responsibly strategizing with the metaverse. The Journal of Strategic Information Systems 32, 2. DOI: 10.1016/j.jsis.2023.101774. Online publication date: 13-Jul-2023.
        • (2023) Detection of Voluntary Eye Movement for Analysis About Eye Gaze Behaviour in Virtual Communication. HCI International 2023 Posters, 273-279. DOI: 10.1007/978-3-031-35989-7_35. Online publication date: 9-Jul-2023.
        • (2022) RGBDGaze: Gaze Tracking on Smartphones with RGB and Depth Data. Proceedings of the 2022 International Conference on Multimodal Interaction, 329-336. DOI: 10.1145/3536221.3556568. Online publication date: 7-Nov-2022.
