1
Zhang B, Zhang R, Zhao J, Yang J, Xu S. The mechanism of human color vision and potential implanted devices for artificial color vision. Front Neurosci 2024; 18:1408087. PMID: 38962178; PMCID: PMC11221215; DOI: 10.3389/fnins.2024.1408087.
Abstract
Vision plays a major role in perceiving external stimuli and information in our daily lives. The neural mechanism of color vision is complicated, involving the coordinated functions of a variety of cells, such as retinal cells and lateral geniculate nucleus cells, as well as multiple levels of the visual cortex. In this work, we review the history of experimental and theoretical studies of this topic, from the fundamental functions of individual cells of the visual system to the coding used in the transmission of neural signals and the sophisticated brain processes at different levels. We discuss various hypotheses, models, and theories related to the color vision mechanism and present suggestions for developing novel implanted devices that may help restore color vision in visually impaired people or introduce artificial color vision to those who need it.
Affiliation(s)
- Bingao Zhang
- Key Laboratory for the Physics and Chemistry of Nanodevices, Institute of Physical Electronics, Department of Electronics, Peking University, Beijing, China
- Rong Zhang
- Key Laboratory for the Physics and Chemistry of Nanodevices, Institute of Physical Electronics, Department of Electronics, Peking University, Beijing, China
- Jingjin Zhao
- Key Laboratory for the Physics and Chemistry of Nanodevices, Institute of Physical Electronics, Department of Electronics, Peking University, Beijing, China
- Jiarui Yang
- Beijing Key Laboratory of Restoration of Damaged Ocular Nerve, Department of Ophthalmology, Peking University Third Hospital, Beijing, China
- Shengyong Xu
- Key Laboratory for the Physics and Chemistry of Nanodevices, Institute of Physical Electronics, Department of Electronics, Peking University, Beijing, China
2
Sadeghi R, Kartha A, Barry MP, Gibson P, Caspi A, Roy A, Geruschat DR, Dagnelie G. Benefits of thermal and distance-filtered imaging for wayfinding with prosthetic vision. Sci Rep 2024; 14:1313. PMID: 38225344; PMCID: PMC10789760; DOI: 10.1038/s41598-024-51798-x.
Abstract
Visual prostheses such as the Argus II provide partial vision for individuals with limited or no light perception. However, their effectiveness in daily-life situations is limited by scene complexity and variability. We investigated whether additional image-processing techniques could improve mobility performance in everyday indoor environments. A mobile system connected to the Argus II provided thermal or distance-filtered video stimulation. Four participants used the thermal camera to locate a person and the distance filter to navigate a hallway with obstacles. The thermal camera allowed participants to find a target person in 99% of trials, while unfiltered video led to confusion with other objects and a success rate of only 55% ([Formula: see text]). Similarly, the distance filter enabled participants to detect and avoid 88% of obstacles by removing background clutter, whereas unfiltered video resulted in a detection rate of only 10% ([Formula: see text]). For any given elapsed time, the success rate with filtered video was higher than with unfiltered video. After 90 s, the success rate exceeded 50% with filtered video, compared with 24% and 3% with the unfiltered camera in the first and second tasks, respectively. Despite individual variations, all participants showed significant improvement when using the thermal and distance filters compared with unfiltered video. Adding thermal and distance filters to a visual prosthesis system can thus enhance mobility performance by removing background clutter and highlighting people and warm objects with the thermal camera, or nearby obstacles with the distance filter.
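The distance filter described in this abstract can be illustrated with a minimal sketch: suppress every pixel whose measured depth exceeds a cutoff before the image is passed on to the implant. The function name, the 2 m cutoff, and the list-based frame representation are illustrative assumptions, not details taken from the paper.

```python
def distance_filter(depth_map, frame, max_distance_m=2.0, background=0):
    """Blank out pixels farther away than max_distance_m.

    depth_map and frame are same-shape 2D lists; depth_map holds distances
    in meters (None where depth is unknown).  Pixels beyond the cutoff are
    replaced with a uniform background value, removing distant clutter
    before the image is downsampled to the prosthesis's electrode grid.
    NOTE: hypothetical sketch, not the pipeline used in the study.
    """
    filtered = []
    for depth_row, pixel_row in zip(depth_map, frame):
        filtered.append([
            pixel if depth is not None and depth <= max_distance_m else background
            for depth, pixel in zip(depth_row, pixel_row)
        ])
    return filtered

# A nearby obstacle (1.2 m) is kept; a far wall (5.0 m) is blanked out.
print(distance_filter([[1.2, 5.0]], [[200, 180]]))  # [[200, 0]]
```

Unknown-depth pixels are treated as background here; a real system might instead pass them through, which is a design choice the paper does not specify.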
Affiliation(s)
- Roksana Sadeghi
- Department of Biomedical Engineering, Johns Hopkins School of Medicine, Baltimore, MD, USA
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, CA, USA
- Arathy Kartha
- Department of Biological and Vision Sciences, State University of New York College of Optometry, New York, NY, USA
- Department of Ophthalmology, Johns Hopkins School of Medicine, Baltimore, MD, USA
- Michael P Barry
- Department of Ophthalmology, Johns Hopkins School of Medicine, Baltimore, MD, USA
- Pritzker Institute for Biomedical Science and Engineering, Illinois Institute of Technology, Chicago, IL, USA
- Paul Gibson
- Advanced Medical Electronics Corporation, Maple Grove, MN, USA
- Avi Caspi
- Jerusalem College of Technology, Jerusalem, Israel
- Duane R Geruschat
- Department of Ophthalmology, Johns Hopkins School of Medicine, Baltimore, MD, USA
- Gislin Dagnelie
- Department of Ophthalmology, Johns Hopkins School of Medicine, Baltimore, MD, USA
3
Kasowski J, Johnson BA, Neydavood R, Akkaraju A, Beyeler M. A systematic review of extended reality (XR) for understanding and augmenting vision loss. J Vis 2023; 23:5. PMID: 37140911; PMCID: PMC10166121; DOI: 10.1167/jov.23.5.5.
Abstract
Over the past decade, extended reality (XR) has emerged as an assistive technology, not only to augment the residual vision of people losing their sight but also to study the rudimentary vision restored to blind people by a visual neuroprosthesis. A defining quality of these XR technologies is their ability to update the stimulus based on the user's eye, head, or body movements. To make the best use of these emerging technologies, it is valuable and timely to understand the state of this research and identify its shortcomings. Here we present a systematic literature review of 227 publications from 106 different venues assessing the potential of XR technology to further visual accessibility. In contrast to other reviews, we sample studies from multiple scientific disciplines, focus on technology that augments a person's residual vision, and require studies to feature a quantitative evaluation with appropriate end users. We summarize prominent findings from different XR research areas, show how the landscape has changed over the past decade, and identify scientific gaps in the literature. Specifically, we highlight the need for real-world validation, the broadening of end-user participation, and a more nuanced understanding of the usability of different XR-based accessibility aids.
Affiliation(s)
- Justin Kasowski
- Graduate Program in Dynamical Neuroscience, University of California, Santa Barbara, CA, USA
- Byron A Johnson
- Department of Psychological & Brain Sciences, University of California, Santa Barbara, CA, USA
- Ryan Neydavood
- Department of Psychological & Brain Sciences, University of California, Santa Barbara, CA, USA
- Anvitha Akkaraju
- Department of Psychological & Brain Sciences, University of California, Santa Barbara, CA, USA
- Michael Beyeler
- Department of Psychological & Brain Sciences, University of California, Santa Barbara, CA, USA
- Department of Computer Science, University of California, Santa Barbara, CA, USA
4
Beyeler M, Sanchez-Garcia M. Towards a Smart Bionic Eye: AI-powered artificial vision for the treatment of incurable blindness. J Neural Eng 2022; 19. PMID: 36541463; PMCID: PMC10507809; DOI: 10.1088/1741-2552/aca69d.
Abstract
Objective. How can we return a functional form of sight to people who are living with incurable blindness? Despite recent advances in the development of visual neuroprostheses, the quality of current prosthetic vision is still rudimentary and does not differ much across device technologies. Approach. Rather than aiming to represent the visual scene as naturally as possible, a Smart Bionic Eye could provide visual augmentations through artificial intelligence-based scene understanding, tailored to specific real-world tasks that are known to affect the quality of life of people who are blind, such as face recognition, outdoor navigation, and self-care. Main results. Complementary to existing research aiming to restore natural vision, we propose a patient-centered approach to incorporate deep learning-based visual augmentations into the next generation of devices. Significance. The ability of a visual prosthesis to support everyday tasks might make the difference between abandoned technology and a widely adopted next-generation neuroprosthetic device.
Affiliation(s)
- Michael Beyeler
- Department of Computer Science, University of California, Santa Barbara, CA, United States of America
- Department of Psychological & Brain Sciences, University of California, Santa Barbara, CA, United States of America
- Melani Sanchez-Garcia
- Department of Computer Science, University of California, Santa Barbara, CA, United States of America
5
Jeganathan VSE, Lin CE, Son H, Krishnagiri DS, Wei Y, Weiland JD. Integration of artificial vision with non-visual peripheral cues to guide mobility. Annu Int Conf IEEE Eng Med Biol Soc 2022; 2022:5136-5139. PMID: 36086298; DOI: 10.1109/embc48229.2022.9871117.
Abstract
Visual prostheses can improve vision for people with severe vision loss, but low image resolution and lack of peripheral vision limit their effectiveness. To address both problems, we developed a prototype advanced video-processing system with a head-worn depth camera and feature-detection capabilities. We used computer vision algorithms to detect landmarks representing a goal and to plan a path toward the goal, while removing unnecessary distractors from the video. If the landmark fell outside the visual prosthesis's field of view (20 degrees, central vision) but within the camera's field of view (70 degrees), we provided vibrational cues to the left or right temple to guide the user in pointing the camera. We evaluated an Argus II retinal prosthesis participant with significant vision loss who could not complete the task (finding a door in a large room) with either his remaining vision or his retinal prosthesis alone. His success rate improved to 57%, 37.5%, and 100%, requiring 52.3, 83.0, and 58.8 seconds to reach the door, using only vibration feedback, the retinal prosthesis with modified video, and the retinal prosthesis with modified video plus vibration feedback, respectively. This case study demonstrates a possible means of augmenting artificial vision. Clinical Relevance: Retinal prostheses can be enhanced by adding computer vision and non-visual cues.
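The vibration-cue logic described above (a 20-degree prosthesis field of view nested inside a 70-degree camera field of view) can be sketched as a simple azimuth test. The function name and return values are hypothetical illustrations, not taken from the authors' system:

```python
def vibration_cue(landmark_azimuth_deg, prosthesis_fov_deg=20.0, camera_fov_deg=70.0):
    """Decide which temple vibrator (if any) should fire.

    landmark_azimuth_deg is the horizontal angle of the detected landmark
    relative to the camera's optical axis (negative = left, positive = right,
    None = no landmark detected).  Returns "in_view" when the landmark is
    already inside the implant's window, "left"/"right" when the user should
    turn the camera, and "none" when no cue can be given.
    Hypothetical sketch of the idea, not the authors' implementation.
    """
    half_prosthesis = prosthesis_fov_deg / 2.0
    half_camera = camera_fov_deg / 2.0
    if landmark_azimuth_deg is None or abs(landmark_azimuth_deg) > half_camera:
        return "none"          # landmark not in the camera's view: no cue
    if abs(landmark_azimuth_deg) <= half_prosthesis:
        return "in_view"       # landmark already inside the 20-degree window
    return "left" if landmark_azimuth_deg < 0 else "right"

print(vibration_cue(-25.0))  # left
```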
6
Gibson PL, Hedin DS, Seifert GJ, Rydberg N, Skujins J, Boldenow P. Stereoscopic Distance Filtering Plus Thermal Imaging Glasses Design. Annu Int Conf IEEE Eng Med Biol Soc 2022; 2022:894-897. PMID: 36086353; DOI: 10.1109/embc48229.2022.9871280.
Abstract
The authors present the development of eyewear that incorporates stereoscopic and thermal imaging cameras for the purpose of highlighting objects and views of interest. Image-processing algorithms that simplify complex elements in a scene can improve the utility of blind and low-vision aids. Thermal imaging can be used to highlight important objects such as people or animals, while stereoscopic imaging can be used to filter out background imagery beyond a certain distance. These methods have been successful in providing utility to retinal prosthesis users. Calibrated distance filtering imposes strict requirements on the relative orientation of the stereoscopic cameras. A mechanical design is presented that fixes the relative camera locations on a 3D-printed titanium structure that can float in the frame, maintaining the cameras' orientation even when the eyewear is flexed during wearing. Clinical Relevance: The design presented has utility in improving perceived spatial resolution in implantable retinal prostheses, implantable visual cortical prostheses, direct-vision prostheses, and wearable low-vision aids.
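The thermal-highlighting idea described above (marking people and animals by their warmth) can be sketched as a per-pixel temperature threshold. The 30 °C cutoff, the function interface, and the list-based frame representation are illustrative assumptions, not the authors' implementation:

```python
def highlight_warm(thermal_frame, frame, min_temp_c=30.0, highlight=255):
    """Overwrite pixels whose radiometric temperature suggests a person
    or other warm object, so they stand out after the image is
    downsampled to the prosthesis's electrode grid; cooler pixels pass
    through unchanged.  thermal_frame holds per-pixel temperatures in
    degrees Celsius; frame holds the corresponding grayscale values.
    Hypothetical sketch under assumed parameters.
    """
    return [
        [highlight if temp >= min_temp_c else pixel
         for temp, pixel in zip(temp_row, pixel_row)]
        for temp_row, pixel_row in zip(thermal_frame, frame)
    ]

# A warm face (34 degrees C) is pushed to full brightness; a cool wall is not.
print(highlight_warm([[34.0, 21.0]], [[120, 90]]))  # [[255, 90]]
```

In practice the thermal and visible cameras would first need spatial registration, which the mechanical design in this entry addresses for the stereoscopic pair.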