
Empowering individuals with disabilities: a real-time, cost-effective, calibration-free assistive system utilizing eye tracking

Published: 20 May 2024
Abstract

    Recent innovations in real-time eye-tracking technology enhance accessibility, offering individuals with disabilities an effective means of computer interaction. This study presents a cost-effective, calibration-free, eye-controlled system comprising two phases: scaling and feature extraction, followed by coordinate mapping. In the first phase, the MediaPipe framework extracts facial features and scaling adapts the output to the display size. The second phase computes the parameters for accurate mapping between the user's iris position and screen coordinates. MediaPipe's pre-trained models and optimized architecture improve system efficiency and real-time performance, reducing the need for extensive dataset training, while adaptive scaling and iris-detection optimizations further enhance computational efficiency and responsiveness. The system is budget-friendly, costing $100 or less, and provides a user-friendly Graphical User Interface (GUI) that meets essential daily requirements of individuals with disabilities. A total of 30 participants, both disabled and non-disabled, were recruited for system testing. The system takes an average of 0.0492127 s to process a frame, of which video acquisition, face and facial-feature tracking, and on-screen iris plotting take 0.0333667, 0.0086365, and 0.0082976 s per frame, respectively. The system achieved mean typing speeds of 21.7 and 15.8 characters per minute (CPM) for non-disabled and disabled participants, respectively, and an average pixel accuracy of 25.2 pixels for non-disabled individuals and 29.32 pixels for disabled individuals. A system usability test involving only disabled participants yielded promising results, with an average score of 90.6. The proposed eye-controlled system operates in real time, demonstrating its responsiveness and effectiveness in enhancing computer accessibility for individuals with limited mobility or disabilities.
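    The abstract describes the second phase as computing parameters that map the user's iris position to screen coordinates. As a minimal sketch of how such a calibration-free mapping could work, the function below linearly rescales a normalized iris-center position (as produced by a face-landmark detector such as MediaPipe) from its observed range of motion to display pixels. The function name, parameters, and the linear model are illustrative assumptions, not the authors' exact method:

    ```python
    def map_iris_to_screen(iris_x, iris_y,
                           x_min, x_max, y_min, y_max,
                           screen_w, screen_h):
        """Linearly map a normalized iris-center position to screen pixels.

        (iris_x, iris_y): iris center in normalized [0, 1] image coordinates.
        (x_min, x_max), (y_min, y_max): the iris's observed range of motion,
        e.g. estimated from the eye-corner landmarks (assumed known here).
        (screen_w, screen_h): display resolution in pixels.
        """
        # Normalize the iris position within its range of motion ...
        nx = (iris_x - x_min) / (x_max - x_min)
        ny = (iris_y - y_min) / (y_max - y_min)
        # ... clamp to [0, 1] so the cursor stays on screen ...
        nx = min(max(nx, 0.0), 1.0)
        ny = min(max(ny, 0.0), 1.0)
        # ... and scale to the display resolution.
        return int(nx * (screen_w - 1)), int(ny * (screen_h - 1))
    ```

    For example, an iris centered in a motion range of [0.4, 0.6] would land at the middle of a 1920×1080 display: `map_iris_to_screen(0.5, 0.5, 0.4, 0.6, 0.4, 0.6, 1920, 1080)` returns `(959, 539)`.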



    Published In

    Journal of Real-Time Image Processing  Volume 21, Issue 3
    Jun 2024
    509 pages

    Publisher

    Springer-Verlag

    Berlin, Heidelberg

    Publication History

    Published: 20 May 2024
    Accepted: 09 May 2024
    Received: 06 February 2024

    Author Tags

    1. Human–computer interaction
    2. Calibration-free
    3. Eye tracking
    4. Disabled people
    5. Low-cost interface

    Qualifiers

    • Research-article
