A systematic review of extended reality (XR) for understanding and augmenting vision loss

Justin Kasowski et al.

J Vis. 2023 May 2;23(5):5. doi: 10.1167/jov.23.5.5.
Abstract

Over the past decade, extended reality (XR) has emerged as an assistive technology not only to augment residual vision of people losing their sight but also to study the rudimentary vision restored to blind people by a visual neuroprosthesis. A defining quality of these XR technologies is their ability to update the stimulus based on the user's eye, head, or body movements. To make the best use of these emerging technologies, it is valuable and timely to understand the state of this research and identify any shortcomings that are present. Here we present a systematic literature review of 227 publications from 106 different venues assessing the potential of XR technology to further visual accessibility. In contrast to other reviews, we sample studies from multiple scientific disciplines, focus on technology that augments a person's residual vision, and require studies to feature a quantitative evaluation with appropriate end users. We summarize prominent findings from different XR research areas, show how the landscape has changed over the past decade, and identify scientific gaps in the literature. Specifically, we highlight the need for real-world validation, the broadening of end-user participation, and a more nuanced understanding of the usability of different XR-based accessibility aids.


Figures

Figure 1.
PRISMA flow diagram. Three databases (Google Scholar, IEEE Xplore, and PubMed) were searched to identify work combining XR technology with low-vision research. After removing duplicates, improperly dated studies, and studies that did not involve human subjects research, 227 articles remained for inclusion in the review.
Figure 2.
Corpus of identified articles presented chronologically from left to right. Each circle is a paper (size: number of citations), and some highly cited papers are highlighted with an inset illustration. Papers are organized vertically based on title similarity. An interactive version of the map is available at https://app.litmaps.com/shared/map/CE0C5D29-8F18-4F2D-9866-0BE1EA4AF288.
Figure 3.
The 227 articles included in this review were manually assessed and categorized by (a) whether the end users were people with low vision (defined as having some residual light perception) or people who were totally blind (no light perception), (b) whether the article used XR technology to study visual perception and behavior or proposed a new XR augmentation technology, and (c) whether the article involved BLV end users, simulations of the relevant impairment condition, or both.
Figure 4.
OpenVisSim conditions. (A) Simulated peripheral vision loss (“tunnel vision”) for a given fixation location (red cross). (B) Examples of visual changes associated with various low-vision conditions (reprinted under CC-BY from Jones et al., 2020).
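Gaze-contingent simulations of this kind re-render the scene every frame around the user's current fixation point. The sketch below illustrates the general idea in Python/NumPy; OpenVisSim itself is a Unity-based toolkit, so the function name, parameters, and mask shape here are illustrative assumptions rather than the authors' implementation.

```python
# Illustrative sketch (not OpenVisSim): gaze-contingent "tunnel vision"
# produced by attenuating an image outside a radius around the fixation point.
import numpy as np

def simulate_tunnel_vision(image, fixation_xy, radius_px, falloff_px=40):
    """Darken everything outside a circular window centered on fixation.

    image       : H x W x 3 float array in [0, 1]
    fixation_xy : (x, y) pixel coordinates of the current gaze position
    radius_px   : radius of fully preserved central vision, in pixels
    falloff_px  : width of the soft edge between preserved and lost vision
    """
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(xs - fixation_xy[0], ys - fixation_xy[1])
    # 1 inside the preserved window, rolling off smoothly to 0 outside it
    mask = np.clip((radius_px + falloff_px - dist) / falloff_px, 0.0, 1.0)
    return image * mask[..., None]

# In a real gaze-contingent setup, fixation_xy would be updated every frame
# from an eye tracker; here the image center stands in for a gaze sample.
frame = np.random.rand(480, 640, 3)
masked = simulate_tunnel_vision(frame, fixation_xy=(320, 240), radius_px=80)
```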
Figure 5.
Examples of augmented reality in a head-mounted display. (A) “RealSense” detects and highlights the traversable area in a variety of structured indoor environments (reprinted under CC-BY from Yang et al., 2016). (B) A depth camera designed for detecting people and obstacles while walking (reprinted under CC-BY from Hicks et al., 2013).
Figure 6.
Examples of augmented reality systems used to simulate prosthetic vision with sighted participants. (A) AR glasses for mimicking the prosthetic vision seen by a participant with geographic atrophy (reprinted under CC-BY from Ho et al., 2019). The front camera of the AR glasses captured the video stream, while custom software preloaded on the glasses adjusted the video quality to mimic prosthetic vision (bottom). (B) AR system to evaluate the benefit of gaze compensation on hand–eye coordination (reprinted under CC-BY from Titchener, Shivdasani, Fallon, & Petoe, 2018). Phosphenes were rendered as Gaussian blobs (top). Participants wore a simulated prosthetic vision headset that included a front-facing camera, head motion tracker, and eye tracker (bottom). (C) Simulated prosthetic vision in retinitis pigmentosa. Residual vision covers the central 10° field of view, and simulated electrode arrays provide bionic vision in the degenerated periphery (reprinted under CC-BY from Zapf, Boon, Matteucci, Lovell, & Suaning, 2015).
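As a rough illustration of how simulated prosthetic vision is commonly rendered, the sketch below draws one Gaussian blob per electrode on a regular grid. The grid size, blob width, and function names are illustrative assumptions, not parameters taken from the cited studies.

```python
# Illustrative sketch of rendering simulated phosphenes as Gaussian blobs on a
# regular electrode grid; all parameter values are arbitrary placeholders.
import numpy as np

def render_phosphenes(brightness, grid_shape, frame_shape=(240, 240), sigma=6.0):
    """Render one phosphene per electrode as an isotropic Gaussian blob.

    brightness : 1-D array of per-electrode activation levels in [0, 1]
    grid_shape : (rows, cols) layout of the simulated electrode array
    """
    h, w = frame_shape
    rows, cols = grid_shape
    ys, xs = np.mgrid[0:h, 0:w]
    frame = np.zeros(frame_shape)
    # Space phosphene centers evenly across the frame
    cy = np.linspace(h * 0.1, h * 0.9, rows)
    cx = np.linspace(w * 0.1, w * 0.9, cols)
    for i, b in enumerate(brightness):
        r, c = divmod(i, cols)
        d2 = (xs - cx[c]) ** 2 + (ys - cy[r]) ** 2
        frame += b * np.exp(-d2 / (2 * sigma ** 2))
    return np.clip(frame, 0.0, 1.0)

# Example: a 6 x 10 electrode array driven by random activation levels.
levels = np.random.rand(60)
image = render_phosphenes(levels, grid_shape=(6, 10))
```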
