
Guiding gaze: expressive models of reading and face scanning

Published: 25 June 2019
DOI: 10.1145/3314111.3319848

Abstract

We evaluate subtle, emotionally driven models of eye movement animation. Two models are tested, reading and face scanning, each based on recorded gaze transition probabilities. For reading, simulated emotional mood is governed by a probability density function that varies word advancement, i.e., re-fixations, forward skips, or backward skips. For face scanning, gaze behavior depends on the task (gender or emotion discrimination) or the facial emotion portrayed. In both cases, the probability density functions are derived from empirically observed transitions that significantly alter viewing behavior, captured either during mood-induced reading or while scanning faces expressing different emotions. A perceptual study shows that viewers can distinguish between reading and face-scanning eye movements; however, they could not gauge the emotional valence of the animated eye motion. For animation, our contribution shows that simulated emotionally driven viewing behavior is either too subtle to be discerned or needs to be exaggerated to be effective.
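
To make the mechanism concrete, the sketch below illustrates the kind of model the abstract describes for reading: sampling a reader's word-to-word gaze transitions from a categorical probability distribution over advancement events. This is a minimal illustration, not the authors' implementation; the names (MOOD_TRANSITIONS, simulate_reading) and the per-mood probabilities are hypothetical placeholders, whereas the paper derives its distributions from empirically recorded gaze data.

```python
import random

# Hypothetical word-advancement probabilities per simulated mood.
# The paper derives such distributions from recorded reading data;
# the numbers below are illustrative placeholders only.
MOOD_TRANSITIONS = {
    "neutral":  {"advance": 0.70, "refixate": 0.15, "skip_forward": 0.10, "skip_backward": 0.05},
    "negative": {"advance": 0.60, "refixate": 0.22, "skip_forward": 0.06, "skip_backward": 0.12},
}

def simulate_reading(num_words, mood, seed=1):
    """Return the sequence of word indices fixated by a simulated reader."""
    rng = random.Random(seed)
    probs = MOOD_TRANSITIONS[mood]
    events = list(probs.keys())
    weights = list(probs.values())
    i, path = 0, [0]
    while i < num_words - 1:
        event = rng.choices(events, weights=weights)[0]
        if event == "advance":
            i += 1                         # fixate the next word
        elif event == "skip_forward":
            i = min(i + 2, num_words - 1)  # skip over the next word
        elif event == "skip_backward":
            i = max(i - 1, 0)              # regress to the previous word
        # "refixate" leaves i unchanged: a re-fixation of the current word
        path.append(i)
    return path

print(simulate_reading(10, "negative"))
```

The face-scanning model would be analogous under the same reading of the abstract: a first-order Markov chain over facial regions (e.g., eyes, nose, mouth) whose transition probabilities depend on the viewing task and the emotion portrayed.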




Published In

ETRA '19: Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications
June 2019, 623 pages
ISBN: 9781450367097
DOI: 10.1145/3314111

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

1. character animation
2. eye motion
3. face scanning
4. reading

Qualifiers

• Research-article

Conference

ETRA '19

Acceptance Rates

Overall Acceptance Rate: 69 of 137 submissions (50%)


Cited By

• (2024) S3: Speech, Script and Scene driven Head and Eye Animation. ACM Transactions on Graphics, 43(4), 1-12. DOI: 10.1145/3658172. Online publication date: 19-Jul-2024.
• (2023) A Unified Look at Cultural Heritage: Comparison of Aggregated Scanpaths over Architectural Artifacts. Proceedings of the ACM on Human-Computer Interaction, 7(ETRA), 1-17. DOI: 10.1145/3591138. Online publication date: 18-May-2023.
• (2023) Unconscious Frustration: Dynamically Assessing User Experience using Eye and Mouse Tracking. Proceedings of the ACM on Human-Computer Interaction, 7(ETRA), 1-17. DOI: 10.1145/3591137. Online publication date: 18-May-2023.
• (2023) Practical Perception-Based Evaluation of Gaze Prediction for Gaze Contingent Rendering. Proceedings of the ACM on Human-Computer Interaction, 7(ETRA), 1-17. DOI: 10.1145/3591134. Online publication date: 18-May-2023.
• (2023) Investigating Privacy Perceptions and Subjective Acceptance of Eye Tracking on Handheld Mobile Devices. Proceedings of the ACM on Human-Computer Interaction, 7(ETRA), 1-16. DOI: 10.1145/3591133. Online publication date: 18-May-2023.
• (2023) Exploring Gaze-assisted and Hand-based Region Selection in Augmented Reality. Proceedings of the ACM on Human-Computer Interaction, 7(ETRA), 1-19. DOI: 10.1145/3591129. Online publication date: 18-May-2023.
• (2023) DynamicRead: Exploring Robust Gaze Interaction Methods for Reading on Handheld Mobile Devices under Dynamic Conditions. Proceedings of the ACM on Human-Computer Interaction, 7(ETRA), 1-17. DOI: 10.1145/3591127. Online publication date: 18-May-2023.
• (2022) Entropy of eye movements while reading code or text. Proceedings of the Tenth International Workshop on Eye Movements in Programming, 8-14. DOI: 10.1145/3524488.3527365. Online publication date: 19-May-2022.
• (2022) Effectiveness of Periocular Biometric Recognition Under Face Mask Restrictions. Breakthroughs in Digital Biometrics and Forensics, 241-255. DOI: 10.1007/978-3-031-10706-1_11. Online publication date: 15-Oct-2022.
• (2020) Accurate Real-time 3D Gaze Tracking Using a Lightweight Eyeball Calibration. Computer Graphics Forum, 39(2), 475-485. DOI: 10.1111/cgf.13945. Online publication date: 13-Jul-2020.
