1. Yang A, Tamkittikhun N, Hamilton-Fletcher G, Ramdhanie V, Vu T, Beheshti M, Hudson T, Vedanthan R, Riewpaiboon W, Mongkolwat P, Feng C, Rizzo JR. Evaluating the efficacy of UNav: A computer vision-based navigation aid for persons with blindness or low vision. Assist Technol 2024:1-15. PMID: 39137956; DOI: 10.1080/10400435.2024.2382113.
Abstract
UNav is a computer-vision-based localization and navigation aid that provides step-by-step route instructions to reach selected destinations without any infrastructure in both indoor and outdoor environments. Despite the initial literature highlighting UNav's potential, clinical efficacy has not yet been rigorously evaluated. Herein, we assess UNav against standard in-person travel directions (SIPTD) for persons with blindness or low vision (PBLV) in an ecologically valid environment using a non-inferiority design. Twenty BLV subjects (age = 38 ± 8.4; nine females) were recruited and asked to navigate to a variety of destinations, over short-range distances (<200 m), in unfamiliar spaces, using either UNav or SIPTD. Navigation performance was assessed with nine dependent variables to assess travel confidence, as well as spatial and temporal performances, including path efficiency, total time, and wrong turns. The results suggest that UNav is not only non-inferior to the standard-of-care in wayfinding (SIPTD) but also superior on 8 out of 9 metrics, as compared to SIPTD. This study highlights the range of benefits computer vision-based aids provide to PBLV in short-range navigation and provides key insights into how users benefit from this systematic form of computer-aided guidance, demonstrating transformative promise for educational attainment, gainful employment, and recreational participation.
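Among the nine dependent variables, the spatial and temporal measures (path efficiency, total time, wrong turns) are the most directly computable. The study does not reproduce its formulas here, so the sketch below only illustrates one common way to define path efficiency from logged (x, y) positions; the function names, coordinates, and the ratio itself are assumptions, not the authors' metric.

```python
import math

def path_length(points):
    """Total length of a route given as a list of (x, y) positions in metres."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def path_efficiency(walked, ideal):
    """Ratio of ideal route length to walked route length (1.0 = perfectly efficient).
    A common definition, assumed here for illustration; the study's exact metric may differ."""
    return path_length(ideal) / path_length(walked)

# Example: a walker who overshoots a corner is less efficient than the ideal route.
ideal = [(0, 0), (10, 0), (10, 5)]
walked = [(0, 0), (12, 0), (12, 5), (10, 5)]
print(round(path_efficiency(walked, ideal), 2))  # 0.79
```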
Affiliation(s)
- Anbang Yang: Department of Mechanical and Aerospace Engineering, NYU Tandon School of Engineering, Brooklyn, New York, USA
- Nattachart Tamkittikhun: Faculty of Information and Communication Technology, Mahidol University, Nakhon Pathom, Thailand
- Giles Hamilton-Fletcher: Department of Rehabilitation Medicine and Department of Ophthalmology, NYU Grossman School of Medicine, New York, New York, USA
- Vinay Ramdhanie: Department of Biomedical Engineering, NYU Tandon School of Engineering, Brooklyn, New York, USA
- Thu Vu: Department of Computer Science and Engineering, NYU Tandon School of Engineering, Brooklyn, New York, USA
- Mahya Beheshti: Department of Rehabilitation Medicine, NYU Grossman School of Medicine, New York, New York, USA
- Todd Hudson: Department of Rehabilitation Medicine, NYU Grossman School of Medicine, New York, New York, USA
- Rajesh Vedanthan: Department of Population Health, NYU Grossman School of Medicine, New York, New York, USA
- Wachara Riewpaiboon: Ratchasuda Institute, Faculty of Medicine Ramathibodi Hospital, Mahidol University, Nakhon Pathom, Thailand
- Pattanasak Mongkolwat: Faculty of Information and Communication Technology, Mahidol University, Nakhon Pathom, Thailand
- Chen Feng: Department of Mechanical and Aerospace Engineering, NYU Tandon School of Engineering, Brooklyn, New York, USA
- John-Ross Rizzo: Department of Mechanical and Aerospace Engineering and Department of Biomedical Engineering, NYU Tandon School of Engineering, Brooklyn, New York, USA; Department of Rehabilitation Medicine, NYU Grossman School of Medicine, New York, New York, USA; Ratchasuda Institute, Faculty of Medicine Ramathibodi Hospital, Mahidol University, Nakhon Pathom, Thailand

2. Lavric A, Beguni C, Zadobrischi E, Căilean AM, Avătămăniței SA. A Comprehensive Survey on Emerging Assistive Technologies for Visually Impaired Persons: Lighting the Path with Visible Light Communications and Artificial Intelligence Innovations. Sensors (Basel) 2024; 24:4834. PMID: 39123881; PMCID: PMC11314945; DOI: 10.3390/s24154834.
Abstract
Because severe visual impairment significantly affects human life, this article emphasizes the potential of Artificial Intelligence (AI) and Visible Light Communications (VLC) in developing future assistive technologies. To this end, the article summarizes the features of some commercial assistance solutions and discusses the characteristics of VLC and AI, emphasizing their compatibility with blind individuals' needs. Additionally, this work highlights the potential of AI for efficient early detection of eye diseases. The article also reviews existing work on integrating VLC into assistive applications for blind persons, showing the progress made and emphasizing the high potential associated with VLC use. Finally, this work provides a roadmap toward the development of an integrated AI-based VLC assistance solution for visually impaired people, pointing out its high potential and some of the steps to follow. To the best of our knowledge, this is the first comprehensive work focused on integrating AI and VLC technologies in the domain of assistance for visually impaired persons.
Affiliation(s)
- Alexandru Lavric: Department of Computers, Electronics and Automation, Stefan cel Mare University of Suceava, 720229 Suceava, Romania
- Cătălin Beguni: Department of Computers, Electronics and Automation, and Integrated Center for Research, Development and Innovation in Advanced Materials, Nanotechnologies and Distributed Systems for Fabrication and Control, Stefan cel Mare University of Suceava, 720229 Suceava, Romania
- Eduard Zadobrischi: Department of Computers, Electronics and Automation, and Integrated Center for Research, Development and Innovation in Advanced Materials, Nanotechnologies and Distributed Systems for Fabrication and Control, Stefan cel Mare University of Suceava, 720229 Suceava, Romania
- Alin-Mihai Căilean: Department of Computers, Electronics and Automation, and Integrated Center for Research, Development and Innovation in Advanced Materials, Nanotechnologies and Distributed Systems for Fabrication and Control, Stefan cel Mare University of Suceava, 720229 Suceava, Romania
- Sebastian-Andrei Avătămăniței: Integrated Center for Research, Development and Innovation in Advanced Materials, Nanotechnologies and Distributed Systems for Fabrication and Control, Stefan cel Mare University of Suceava, 720229 Suceava, Romania; East European Border Scientific and Technological Park, 725500 Siret, Romania

3. Lăpușteanu A, Morar A, Moldoveanu A, Băluțoiu MA, Moldoveanu F. A review of sonification solutions in assistive systems for visually impaired people. Disabil Rehabil Assist Technol 2024:1-16. PMID: 38469665; DOI: 10.1080/17483107.2024.2326590.
Abstract
PURPOSE: Visually impaired people (VIP) find it challenging to understand and gain awareness of their surroundings. Most activities require the use of the auditory or tactile senses. As such, assistive systems capable of helping visually impaired people understand, navigate, and form a mental representation of their environment are being studied and developed extensively. The aim of this paper is to provide insight into the characteristics, advantages, and drawbacks of different types of sonification strategies in assistive systems, and to assess their suitability for certain use cases.
MATERIALS AND METHODS: To this end, we reviewed a sizeable number of assistive solutions for VIP that provide a form of auditory feedback to the user, identified in different scientific databases (Scopus, IEEE Xplore, ACM, and Google Scholar) through direct searches and cross-referencing.
RESULTS: We classified these solutions based on the aural information they provide to the VIP: alerts, guidance, and information about their environment, be it spatial or semantic. Our intention is not to provide an exhaustive review, but to select representative implementations from recent literature that highlight the particularities of each sonification approach.
CONCLUSIONS: Anyone intent on developing an assistive solution can thus choose the desired sonification class, aware of its advantages and disadvantages, while having a fairly wide selection of articles from the representative class.
Affiliation(s)
- Andrei Lăpușteanu, Anca Morar, Alin Moldoveanu, Maria-Anca Băluțoiu, and Florica Moldoveanu: Department of Computers, Faculty of Automatic Control and Computers, National University of Science and Technology Politehnica, Bucharest, Romania

4. Mai C, Chen H, Zeng L, Li Z, Liu G, Qiao Z, Qu Y, Li L, Li L. A Smart Cane Based on 2D LiDAR and RGB-D Camera Sensor-Realizing Navigation and Obstacle Recognition. Sensors (Basel) 2024; 24:870. PMID: 38339588; PMCID: PMC10856969; DOI: 10.3390/s24030870.
Abstract
In this paper, an intelligent blind guide system based on 2D LiDAR and RGB-D camera sensing is proposed; the system is mounted on a smart cane. The intelligent guide system relies on 2D LiDAR, an RGB-D camera, an IMU, GPS, a Jetson Nano B01, an STM32, and other hardware. Its main advantage is that the distance between the smart cane and obstacles can be measured by the 2D LiDAR using the cartographer algorithm, thereby achieving simultaneous localization and mapping (SLAM). At the same time, an improved YOLOv5 algorithm quickly and effectively identifies pedestrians, vehicles, pedestrian crosswalks, traffic lights, warning posts, stone piers, tactile paving, and other objects in front of the visually impaired user. Laser SLAM and improved YOLOv5 obstacle identification tests were carried out inside a teaching building on the campus of Hainan Normal University and on a pedestrian crossing on Longkun South Road in Haikou City, Hainan Province. The results show that the system can drive the omnidirectional wheels at the bottom of the smart cane, giving the cane a self-leading guide function, like a "guide dog", which effectively guides the visually impaired user around obstacles to a predetermined destination while quickly and reliably identifying obstacles along the way. The mapping and positioning accuracy of the system's laser SLAM is 1 m ± 7 cm, and the laser SLAM runs at 25-31 FPS, enabling short-distance obstacle avoidance and navigation in both indoor and outdoor environments. The improved YOLOv5 identifies 86 types of objects. The recognition rates for pedestrian crosswalks and vehicles are 84.6% and 71.8%, respectively; the overall recognition rate for the 86 object types is 61.2%, and obstacle recognition runs at 25-26 FPS.
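The abstract describes the 2D LiDAR both as the input to cartographer-based SLAM and as the cane's obstacle range finder. A minimal sketch of the range-finding side, assuming a generic single-scan format (parallel range/angle arrays) and illustrative thresholds rather than the authors' implementation:

```python
import numpy as np

def frontal_obstacle_distance(ranges, angles, half_sector_deg=15.0, max_range=8.0):
    """Return the nearest valid LiDAR return (metres) within +/- half_sector_deg
    of straight ahead, or None if the sector is clear up to max_range.
    `ranges` and `angles` are parallel arrays from one 2D scan (angles in radians,
    0 = forward); these conventions are assumptions, not from the paper."""
    ranges = np.asarray(ranges, dtype=float)
    angles = np.asarray(angles, dtype=float)
    in_sector = np.abs(np.degrees(angles)) <= half_sector_deg
    valid = in_sector & np.isfinite(ranges) & (ranges > 0.05) & (ranges < max_range)
    return float(ranges[valid].min()) if valid.any() else None

# Example: a simulated scan with an object 1.2 m ahead and clutter to the sides.
angles = np.radians(np.arange(-90, 91))
ranges = np.full(angles.shape, 6.0)
ranges[np.abs(np.degrees(angles)) <= 5] = 1.2
print(frontal_obstacle_distance(ranges, angles))  # 1.2
```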
Affiliation(s)
- Chunming Mai: College of Physics and Electronic Engineering, Hainan Normal University, Haikou 571158, China
- Huaze Chen: College of Information Science and Technology, Hainan Normal University, Haikou 571158, China
- Lina Zeng: College of Physics and Electronic Engineering; Key Laboratory of Laser Technology and Optoelectronic Functional Materials of Hainan Province; and Hainan International Joint Research Center for Semiconductor Lasers, Hainan Normal University, Haikou 571158, China
- Zaijin Li: College of Physics and Electronic Engineering; Key Laboratory of Laser Technology and Optoelectronic Functional Materials of Hainan Province; and Hainan International Joint Research Center for Semiconductor Lasers, Hainan Normal University, Haikou 571158, China
- Guojun Liu: College of Physics and Electronic Engineering; Key Laboratory of Laser Technology and Optoelectronic Functional Materials of Hainan Province; and Hainan International Joint Research Center for Semiconductor Lasers, Hainan Normal University, Haikou 571158, China
- Zhongliang Qiao: College of Physics and Electronic Engineering; Key Laboratory of Laser Technology and Optoelectronic Functional Materials of Hainan Province; and Hainan International Joint Research Center for Semiconductor Lasers, Hainan Normal University, Haikou 571158, China
- Yi Qu: College of Physics and Electronic Engineering; Key Laboratory of Laser Technology and Optoelectronic Functional Materials of Hainan Province; and Hainan International Joint Research Center for Semiconductor Lasers, Hainan Normal University, Haikou 571158, China
- Lianhe Li: Hainan International Joint Research Center for Semiconductor Lasers, Hainan Normal University, Haikou 571158, China
- Lin Li: College of Physics and Electronic Engineering; Key Laboratory of Laser Technology and Optoelectronic Functional Materials of Hainan Province; and Hainan International Joint Research Center for Semiconductor Lasers, Hainan Normal University, Haikou 571158, China

5. Zhang X, Pan Z, Song Z, Zhang Y, Li W, Ding S. The Aerial Guide Dog: A Low-Cognitive-Load Indoor Electronic Travel Aid for Visually Impaired Individuals. Sensors (Basel) 2024; 24:297. PMID: 38203159; PMCID: PMC10781224; DOI: 10.3390/s24010297.
Abstract
Most navigation aids for visually impaired individuals require users to pay close attention and actively interpret guidance instructions or feedback, which imposes a considerable cognitive load in long-term use. To tackle this issue, this study proposes a cognitive-burden-free electronic travel aid for individuals with visual impairments. Exploiting humans' instinctive compliance with external force, we introduce the "Aerial Guide Dog", a helium-balloon aerostat drone designed for indoor guidance that delivers gentle tugs in real time for directional guidance, ensuring a seamless and intuitive guiding experience. The Aerial Guide Dog was evaluated in terms of directional guidance and path following in a pilot study, focusing on its orientation accuracy and overall navigation performance. Preliminary results show that the Aerial Guide Dog, using Ultra-Wideband (UWB) spatial positioning and Inertial Measurement Unit (IMU) angle sensors, consistently maintained minimal deviation from the target direction and the designated path, while imposing a negligible cognitive burden on users during the guidance tasks.
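The directional guidance reduces to comparing the bearing toward the next waypoint (from UWB (x, y) fixes) with the current IMU heading and tugging the user toward the difference. A minimal sketch of that geometry, assuming a simple planar coordinate convention; this is not the authors' controller:

```python
import math

def steering_error(position, heading_deg, waypoint):
    """Signed angle (degrees) from the user's current IMU heading to the bearing
    of the next waypoint computed from UWB (x, y) fixes. Positive = pull left,
    negative = pull right (a convention assumed here for illustration)."""
    dx = waypoint[0] - position[0]
    dy = waypoint[1] - position[1]
    bearing = math.degrees(math.atan2(dy, dx))               # direction to waypoint
    return (bearing - heading_deg + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)

# Example: user at the origin facing along +x, waypoint up and to the left.
print(round(steering_error((0.0, 0.0), 0.0, (2.0, 2.0)), 1))  # 45.0
```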
Affiliation(s)
- Shiyao Ding: Department of Industrial Design, Guangdong University of Technology, Guangzhou 510090, China

6. Bakouri M, Alyami N, Alassaf A, Waly M, Alqahtani T, AlMohimeed I, Alqahtani A, Samsuzzaman M, Ismail HF, Alharbi Y. Sound-Based Localization Using LSTM Networks for Visually Impaired Navigation. Sensors (Basel) 2023; 23:4033. PMID: 37112374; PMCID: PMC10145617; DOI: 10.3390/s23084033.
Abstract
In this work, we developed a prototype that adopts sound-based systems for the localization of visually impaired individuals. The system was implemented as a wireless ultrasound network that helps blind and visually impaired users navigate and maneuver autonomously. Ultrasonic-based systems use high-frequency sound waves to detect obstacles in the environment and provide location information to the user. Voice recognition and long short-term memory (LSTM) techniques were used to design the algorithms, and the Dijkstra algorithm was used to determine the shortest route between two places. Assistive hardware, including an ultrasonic sensor network, a global positioning system (GPS), and a digital compass, was used to implement the method. For indoor evaluation, three nodes were placed on the doors of different rooms inside a house (the kitchen, bathroom, and bedroom). To evaluate outdoor settings, the coordinates (latitude and longitude points) of four outdoor locations (mosque, laundry, supermarket, and home) were identified and stored in a microcomputer's memory. The results show a root mean square error of about 0.192 for the indoor setting after 45 trials. In addition, the Dijkstra algorithm determined the shortest route between two places with an accuracy of 97%.
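Since route selection relies on the Dijkstra algorithm over stored locations, a small self-contained sketch may be useful; the node names and edge weights below are invented for illustration and are not the study's map:

```python
import heapq

def dijkstra(graph, source, target):
    """Shortest path over a weighted adjacency dict {node: {neighbour: cost}}.
    Returns (total_cost, [source, ..., target])."""
    queue = [(0.0, source, [source])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == target:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, weight in graph.get(node, {}).items():
            if neighbour not in visited:
                heapq.heappush(queue, (cost + weight, neighbour, path + [neighbour]))
    return float("inf"), []

# Hypothetical node graph mixing indoor and outdoor locations (distances in metres).
graph = {
    "home":        {"kitchen": 4, "bedroom": 6, "mosque": 250},
    "kitchen":     {"home": 4, "bathroom": 3},
    "bedroom":     {"home": 6, "bathroom": 5},
    "bathroom":    {"kitchen": 3, "bedroom": 5},
    "mosque":      {"home": 250, "supermarket": 180},
    "supermarket": {"mosque": 180},
}
print(dijkstra(graph, "bedroom", "supermarket"))
# (436.0, ['bedroom', 'home', 'mosque', 'supermarket'])
```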
Affiliation(s)
- Mohsen Bakouri: Department of Medical Equipment Technology, College of Applied Medical Science, Majmaah University, Al-Majmaah 11952, Saudi Arabia; Department of Physics, College of Arts, Fezzan University, Traghen 71340, Libya
- Naif Alyami: Department of Medical Equipment Technology, College of Applied Medical Science, Majmaah University, Al-Majmaah 11952, Saudi Arabia
- Ahmad Alassaf: Department of Medical Equipment Technology, College of Applied Medical Science, Majmaah University, Al-Majmaah 11952, Saudi Arabia
- Mohamed Waly: Department of Medical Equipment Technology, College of Applied Medical Science, Majmaah University, Al-Majmaah 11952, Saudi Arabia
- Tariq Alqahtani: Department of Medical Equipment Technology, College of Applied Medical Science, Majmaah University, Al-Majmaah 11952, Saudi Arabia
- Ibrahim AlMohimeed: Department of Medical Equipment Technology, College of Applied Medical Science, Majmaah University, Al-Majmaah 11952, Saudi Arabia
- Abdulrahman Alqahtani: Department of Medical Equipment Technology, College of Applied Medical Science, Majmaah University, Al-Majmaah 11952, Saudi Arabia; Department of Biomedical Technology, College of Applied Medical Sciences in Al-Kharj, Prince Sattam Bin Abdulaziz University, Al-Kharj 11942, Saudi Arabia
- Md Samsuzzaman: Department of Computer and Communication Engineering, Faculty of Computer Science and Engineering, Patuakhali Science and Technology University, Patuakhali 6800, Bangladesh
- Husham Farouk Ismail: Department of Biomedical Equipment Technology, Inaya Medical College, Riyadh 13541, Saudi Arabia
- Yousef Alharbi: Department of Biomedical Technology, College of Applied Medical Sciences in Al-Kharj, Prince Sattam Bin Abdulaziz University, Al-Kharj 11942, Saudi Arabia

7. Wei L, Jin L, Gong R, Yang Y, Zhang X. Design of Audio-Augmented-Reality-Based O&M Orientation Training for Visually Impaired Children. Sensors (Basel) 2022; 22:9487. PMID: 36502192; PMCID: PMC9741135; DOI: 10.3390/s22239487.
Abstract
Orientation and Mobility (O&M) training is a specific program that teaches people with vision loss to orient themselves and travel safely within certain contexts. State-of-the-art research reveals that people with vision loss expect high-quality O&M training, especially at early ages, but conventional O&M training involves tedious programs and requires heavy participation from professional trainers, and there are not enough skilled trainers available. In this work, we first interpret and discuss the relevant research of recent years, and then discuss the questionnaires and interviews we conducted with visually impaired people. On the basis of this field investigation and related research, we propose the design of an audio-augmented-reality-based orientation training solution for children. Within the perceptible scene created by EasyAR's map-aware framework, we created an AR audio-source-tracing training exercise that simulates a social scene to strengthen the subjects' auditory identification. To verify the efficiency and feasibility of this scheme, we implemented an application prototype with the required hardware and software and conducted subsequent experiments with blindfolded children. We confirm the high usability of the designed approach by analyzing the results of the pilot study. Compared with other orientation training studies, the proposed method makes the whole training process flexible and entertaining. At the same time, the training does not involve excessive economic costs or require professional skills training, allowing users to train at home or on a sports ground rather than having to travel to rehabilitation sites or specialized schools. Furthermore, according to feedback from the experiments, the approach is promising with regard to gamification.

8. Balconi M, Acconito C, Angioletti L. Emotional Effects in Object Recognition by the Visually Impaired People in Grocery Shopping. Sensors (Basel) 2022; 22:8442. PMID: 36366140; PMCID: PMC9654971; DOI: 10.3390/s22218442.
Abstract
To date, the neuroscientific literature on the consumption patterns of specific categories of consumers, such as people with disability, is still scarce. This study explored the implicit emotional consumer experience of visually impaired (VI) consumers in-store. A group of VI consumers and a control group explored three different product shelves and manipulated target products during a real supermarket shopping experience. Autonomic (SCL, skin conductance level; SCR, skin conductance response; HR, heart rate; PVA, pulse volume amplitude; BVP, blood volume pulse), behavioural, and self-report data were collected in relation to three phases of the in-store shopping experience: (i) identification of a product (recognition accuracy, ACC, and reaction times, RTs); (ii) style of product purchase (predominant sense used for shelf exploration, store spatial representation, and ability to orient oneself); and (iii) the consumer experience itself, underlying the emotional experience. The VI group showed higher levels of disorientation and greater difficulty in finding products and in repeating the route independently. ACC and RTs also varied by product type. The VI group also showed significantly higher PVA values than controls. For some specific categories (pasta), PVA correlated negatively with time to recognition and positively with ease of finding products in the entire sample. In conclusion, VI consumers experience grocery shopping as stressful and frustrating, and it demands a greater cognitive investment, which is mirrored by a larger autonomic response compared with the control group. Nevertheless, the ability of VI consumers to search for and recognise a specific product is not so different from that of people without visual impairment.
Affiliation(s)
- Michela Balconi: International Research Center for Cognitive Applied Neuroscience (IrcCAN) and Research Unit in Affective and Social Neuroscience, Department of Psychology, Catholic University of the Sacred Heart, Largo Gemelli 1, 20123 Milan, Italy
- Carlotta Acconito: International Research Center for Cognitive Applied Neuroscience (IrcCAN) and Research Unit in Affective and Social Neuroscience, Department of Psychology, Catholic University of the Sacred Heart, Largo Gemelli 1, 20123 Milan, Italy
- Laura Angioletti: International Research Center for Cognitive Applied Neuroscience (IrcCAN) and Research Unit in Affective and Social Neuroscience, Department of Psychology, Catholic University of the Sacred Heart, Largo Gemelli 1, 20123 Milan, Italy

9. Messaoudi MD, Menelas BAJ, Mcheick H. Review of Navigation Assistive Tools and Technologies for the Visually Impaired. Sensors (Basel) 2022; 22:7888. PMID: 36298237; PMCID: PMC9606951; DOI: 10.3390/s22207888.
Abstract
The visually impaired suffer greatly while moving from one place to another. They face challenges in going outdoors and in protecting themselves from moving and stationary objects, and they also lack confidence due to restricted mobility. Due to the recent rapid rise in the number of visually impaired persons, the development of assistive devices has emerged as a significant research field. This review introduces several techniques that help the visually impaired with their mobility and presents the state of the art of recent assistive technologies that facilitate their everyday life. It also comprehensively analyses multiple mobility assistive technologies for indoor and outdoor environments and describes the different localization and feedback methods used by assistive tools based on recent technologies. The navigation tools used by the visually impaired are discussed in detail in subsequent sections. Finally, a detailed analysis of the various methods is carried out, together with future recommendations.

10. Design and Implementation of an Intelligent Assistive Cane for Visually Impaired People Based on an Edge-Cloud Collaboration Scheme. Electronics 2022. DOI: 10.3390/electronics11142266.
Abstract
Visually impaired people face many inconveniences in daily life, and existing assistance tools on the market suffer from problems such as high prices and limited functionality. In this work, we designed and implemented a low-cost intelligent assistive cane for visually impaired individuals based on computer vision, sensors, and an edge-cloud collaboration scheme. Obstacle detection, fall detection, and traffic light detection functions were designed and integrated to make moving around more convenient for visually impaired people. We also designed an image captioning function and an object detection function with high-speed processing capability based on the edge-cloud collaboration scheme to improve the user experience. Experiments show an aerial obstacle detection accuracy of 92.5%, a fall detection accuracy of 90%, and an average image retrieval period of 1.124 s. The system demonstrates low power consumption, strong real-time performance, adaptability to multiple scenarios, and convenience, which can ensure the safety of visually impaired people while moving and help them better perceive and understand the surrounding environment.

11.
Abstract
Guidance systems for visually impaired persons have become a popular topic in recent years. Existing guidance systems on the market typically utilize auxiliary tools and methods such as GPS, UWB, or a simple white cane that exploits the user’s single tactile or auditory sense. These guidance methodologies can be inadequate in a complex indoor environment. This paper proposes a multi-sensory guidance system for the visually impaired that can provide tactile and auditory advice using ORB-SLAM and YOLO techniques. Based on an RGB-D camera, the local obstacle avoidance system is realized at the tactile level through point cloud filtering that can inform the user via a vibrating motor. Our proposed method can generate a dense navigation map to implement global obstacle avoidance and path planning for the user through the coordinate transformation. Real-time target detection and a voice-prompt system based on YOLO are also incorporated at the auditory level. We implemented the proposed system as a smart cane. Experiments are performed using four different test scenarios. Experimental results demonstrate that the impediments in the walking path can be reliably located and classified in real-time. Our proposed system can function as a capable auxiliary to help visually impaired people navigate securely by integrating YOLO with ORB-SLAM.
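At the tactile level, the idea is to filter the RGB-D data and drive the vibration motor when something intrudes into the walking corridor. The sketch below works on a depth image rather than a full point cloud, and its region of interest and thresholds are assumptions for illustration, not the paper's filtering pipeline:

```python
import numpy as np

def obstacle_alert(depth_m, near=1.0, min_valid=0.2, roi_frac=0.4, min_pixels=200):
    """Return True if enough valid depth pixels inside a central region of
    interest are closer than `near` metres. `depth_m` is an HxW depth image in
    metres (0 or NaN = invalid); all parameters are illustrative assumptions."""
    h, w = depth_m.shape
    dh, dw = int(h * roi_frac / 2), int(w * roi_frac / 2)
    roi = depth_m[h // 2 - dh : h // 2 + dh, w // 2 - dw : w // 2 + dw]
    valid = np.isfinite(roi) & (roi > min_valid)
    close = valid & (roi < near)
    return int(close.sum()) >= min_pixels

# Example: a synthetic 480x640 depth frame with a box 0.8 m away near the centre.
frame = np.full((480, 640), 3.0)
frame[200:280, 280:360] = 0.8
print(obstacle_alert(frame))  # True -> drive the vibration motor
```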

12. Mueen A, Awedh M, Zafar B. Multi-obstacle aware smart navigation system for visually impaired people in fog connected IoT-cloud environment. Health Informatics J 2022; 28:14604582221112609. PMID: 35801559; DOI: 10.1177/14604582221112609.
Abstract
Designing smart navigation for visually impaired/blind people is a challenging task. Existing research analyses it in either an indoor or an outdoor environment and fails to focus on optimum route selection, latency minimization, and the presence of multiple obstacles. To overcome these challenges and provide precise assistance to visually impaired people, this paper proposes a smart navigation system based on both the image and sensor outputs of a smart wearable. The proposed approach involves the following processes: (i) the input query of the visually impaired user is refined by a query processor in order to provide accurate assistance. (ii) The safest route from source to destination is provided by an Environment-aware Bald Eagle Search Optimization algorithm, in which multiple routes are identified and classified into three different classes, from which the safest route is suggested to the user. (iii) Fog computing is leveraged, and the optimal fog node is selected to minimize latency; fog node selection is performed using a Nearest Grey Absolute Decision Making Algorithm based on multiple parameters. (iv) Relevant information is retrieved by computing the Euclidean distance between the reference and the database information. (v) Multi-obstacle detection is carried out by YOLOv3 Tiny, in which both static and dynamic obstacles are classified into small, medium, and large obstacles. (vi) Navigation decisions are provided by an Adaptive Asynchronous Advantage Actor-Critic (A3C) algorithm based on the fusion of image and sensor outputs. (vii) Heterogeneous data are managed by predicting and pruning faulty sensor data with a minimum-distance-based extended Kalman filter for better accuracy, and by clustering similar information with a Spatial-Temporal Optics Clustering Algorithm to reduce complexity. The proposed model is implemented in NS 3.26, and the results show that it outperforms existing works in terms of obstacle detection and task completion time.
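Step (iv), retrieval by Euclidean distance between the reference and the database information, can be illustrated in a few lines; the feature vectors and labels below are hypothetical:

```python
import numpy as np

def retrieve_nearest(query, database):
    """Return (key, distance) of the database entry whose feature vector is
    closest to `query` in Euclidean distance. `database` maps labels to
    equal-length feature vectors; purely illustrative of step (iv)."""
    keys = list(database)
    feats = np.stack([np.asarray(database[k], dtype=float) for k in keys])
    dists = np.linalg.norm(feats - np.asarray(query, dtype=float), axis=1)
    best = int(np.argmin(dists))
    return keys[best], float(dists[best])

# Hypothetical landmark descriptors.
db = {"stairs": [0.9, 0.1, 0.3], "elevator": [0.2, 0.8, 0.5], "exit": [0.4, 0.4, 0.9]}
print(retrieve_nearest([0.85, 0.15, 0.25], db))  # ('stairs', ...)
```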
Affiliation(s)
- Ahmed Mueen: Faculty of Applied Studies, King Abdulaziz University, Jeddah, Saudi Arabia
- Mohammad Awedh: Faculty of Engineering, King Abdulaziz University, Jeddah, Saudi Arabia
- Bassam Zafar: Faculty of Computer and Information Technology, King Abdulaziz University, Jeddah, Saudi Arabia

13. Kim K, Kim S, Choi A. Ultrasonic Sound Guide System with Eyeglass Device for the Visually Impaired. Sensors (Basel) 2022; 22:3077. PMID: 35459062; PMCID: PMC9030799; DOI: 10.3390/s22083077.
Abstract
The ultrasonic sound guide system is an audio broadcasting system based on inaudible ultrasonic sound that assists the indoor and outdoor navigation of the visually impaired. Transmitters are placed at points of interest to propagate a frequency-modulated voice signal in the ultrasonic range. A dual-channel receiver, carried by the visually impaired person in the form of eyeglasses, receives the ultrasonic sound and recovers the voice signal via demodulation. Because ultrasonic sound retains ordinary acoustic properties, its velocity, directivity, attenuation, and superposition give the user acoustic cues for localizing multiple transmitter positions through binaural localization. The visually impaired person hears the designated voice signal and follows its attributes to arrive at the specific location. Due to low microphone gain when addressed from the side, the time delay between the receiver channels shows high variance and high bias in the end directions. Nevertheless, the perception experiment shows better prediction accuracy in the end directions than in the center direction. The overall evaluation shows precise directional prediction for both narrow- and wide-angle situations. The ultrasonic sound guide system is a useful device for localizing places in the near field without touching braille.
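The binaural cue the receiver exploits is the arrival-time difference between its two channels. A hedged sketch of estimating that delay from a cross-correlation peak, with a synthetic signal and a sign convention assumed for illustration (this is not the paper's demodulation chain):

```python
import numpy as np

def interchannel_delay(left, right, fs):
    """Estimate the time difference of arrival (seconds) between two channels
    from the peak of their cross-correlation. With this convention a positive
    value means the left channel is delayed, i.e. the source sits to the right."""
    corr = np.correlate(left, right, mode="full")
    lag = int(np.argmax(corr)) - (len(right) - 1)  # lag of `left` relative to `right`, in samples
    return lag / fs

# Example: the same burst reaches the right channel 20 samples before the left.
fs = 48_000
burst = np.hanning(256)
right = np.concatenate([np.zeros(100), burst, np.zeros(400)])
left = np.concatenate([np.zeros(120), burst, np.zeros(380)])
print(round(interchannel_delay(left, right, fs) * 1e6, 1))  # ~416.7 microseconds
```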

14. A Survey on Recent Advances in AI and Vision-Based Methods for Helping and Guiding Visually Impaired People. Applied Sciences (Basel) 2022. DOI: 10.3390/app12052308.
Abstract
We present in this paper the state of the art and an analysis of recent research work and achievements performed in the domain of AI-based and vision-based systems for helping blind and visually impaired people (BVIP). We start by highlighting the recent and tremendous importance that AI has acquired following the use of convolutional neural networks (CNN) and their ability to solve image classification tasks efficiently. After that, we also note that VIP have high expectations about AI-based systems as a possible way to ease the perception of their environment and to improve their everyday life. Then, we set the scope of our survey: we concentrate our investigations on the use of CNN or related methods in a vision-based system for helping BVIP. We analyze the existing surveys, and we study the current work (a selection of 30 case studies) using several dimensions such as acquired data, learned models, and human–computer interfaces. We compare the different approaches, and conclude by analyzing future trends in this domain.

15. Indoor-Guided Navigation for People Who Are Blind: Crowdsourcing for Route Mapping and Assistance. Applied Sciences (Basel) 2022. DOI: 10.3390/app12010523.
Abstract
This paper presents an approach to enhance electronic traveling aids (ETAs) for people who are blind and severely visually impaired (BSVI) using indoor orientation and guided navigation, by employing social outsourcing of indoor route mapping and assistance processes. This type of approach is necessary because GPS does not work well indoors, and infrastructural investments are absent or too costly for indoor navigation. Our approach proposes the prior outsourcing of vision-based recordings of indoor routes from an online network of sighted volunteers, who gather and constantly update a web cloud database of indoor routes using specialized sensory equipment and web services. Computational intelligence-based algorithms process the sensory data and prepare them for BSVI usage. In this way, people who are BSVI can obtain ready-to-use access to the indoor routes database, a type of service that has not previously been offered in such a setting. Specialized wearable sensory ETA equipment, depth cameras, smartphones, computer vision algorithms, tactile and audio interfaces, and computational intelligence algorithms are employed for this purpose. The integration of semantic data on points of interest (such as stairs, doors, WCs, entrances/exits) and evacuation schemes could make the proposed approach even more attractive to BSVI users. The presented approach also crowdsources volunteers' real-time online help for complex navigational situations using a mobile app, a live video stream from the BSVI user's wearable camera, and digitalized maps of buildings' evacuation schemes.

16. Kahraman M, Turhan C. An intelligent indoor guidance and navigation system for the visually impaired. Assist Technol 2021; 34:478-486. PMID: 33465017; DOI: 10.1080/10400435.2021.1872738.
Abstract
Intelligent guidance in complex environments where various procedures are required for navigation is critical to achieving mobility for the visually impaired. This study presents a newly developed software prototype with a hybrid RFID/BLE infrastructure to provide intelligent navigation and guidance to the visually impaired in complex indoor environments. The system enables the users to input their purpose via a specially designed user interface, and provides intelligent guidance through a chain of destination targets which are determined according to the inherent procedures of the environment. Path optimization is performed by adaptation of the traveling salesman problem, and real-time instantaneous instructions are provided to guide the users through the predetermined destination points. For evaluation purposes, a hospital environment is constructed as an example of a complex environment and the system is tested by visually impaired participants. The results show that the intelligent purpose selection and destination evaluation mechanism modules of the system are found to be effective by all the participants.
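The abstract notes that path optimization adapts the traveling salesman problem to order the chain of destination targets. As an illustration of the idea only (the authors' adaptation is not described in the abstract), a greedy nearest-neighbour ordering over an invented cost matrix looks like this:

```python
def order_targets(cost, start=0):
    """Greedy nearest-neighbour ordering of destination targets.
    `cost[i][j]` is the walking cost between targets i and j; returns a visit
    order starting at `start`. A heuristic sketch, not the paper's optimizer."""
    n = len(cost)
    order, remaining = [start], set(range(n)) - {start}
    while remaining:
        last = order[-1]
        order.append(min(remaining, key=lambda j: cost[last][j]))
        remaining.remove(order[-1])
    return order

# Hypothetical costs between reception (0), lab (1), pharmacy (2), and exit (3)
# in a hospital-like environment.
cost = [
    [0, 40, 25, 60],
    [40, 0, 30, 20],
    [25, 30, 0, 45],
    [60, 20, 45, 0],
]
print(order_targets(cost))  # [0, 2, 1, 3]
```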
Affiliation(s)
- M Kahraman: Department of Software Engineering, Atılım University, İncek/Ankara, Turkey
- C Turhan: Department of Software Engineering, Atılım University, İncek/Ankara, Turkey

17. Simões WCSS, Machado GS, Sales AMA, de Lucena MM, Jazdi N, de Lucena VF. A Review of Technologies and Techniques for Indoor Navigation Systems for the Visually Impaired. Sensors (Basel) 2020; 20:3935. PMID: 32679720; PMCID: PMC7411868; DOI: 10.3390/s20143935.
Abstract
Technologies and techniques for localization and navigation are advancing, allowing greater precision in locating people under complex and challenging conditions. These advances have attracted growing interest from the scientific community in using indoor positioning systems (IPSs) with higher precision and fast delivery times for groups such as the visually impaired, to some extent improving their quality of life. Much research brings together works that deal with the physical and logical approaches of IPSs to give the reader a more general view of the models; these surveys, however, need to be continuously revisited to keep the literature on the described features up to date. This paper expands on the range of technologies and methodologies for assisting the visually impaired presented in previous works, providing readers and researchers with a more recent account of what has been done, along with the advantages and disadvantages of each approach, to guide reviews and discussions on these topics. Finally, we discuss a series of considerations and future trends for the construction of indoor navigation and localization systems for the visually impaired.
Affiliation(s)
- Walter C. S. S. Simões: PPGI/ICOMP (Programa de Pós-Graduação em Informática), Institute of Computing, UFAM (Federal University of Amazonas), Manaus, AM 69080-900, Brazil
- Guido S. Machado: PPGEE (Programa de Pós-Graduação em Engenharia), Technology College, UFAM (Federal University of Amazonas), Manaus, AM 69080-900, Brazil
- André M. A. Sales: PPGEE (Programa de Pós-Graduação em Engenharia), Technology College, UFAM (Federal University of Amazonas), Manaus, AM 69080-900, Brazil
- Mateus M. de Lucena: Software/Hardware Integration Lab, UFSC (Federal University of Santa Catarina), Florianópolis, SC 88040-900, Brazil
- Nasser Jazdi: Institute of Industrial Automation and Software Systems, University of Stuttgart, 70550 Stuttgart, Germany
- Vicente F. de Lucena: PPGI/ICOMP (Institute of Computing) and PPGEE (Technology College), UFAM (Federal University of Amazonas), Manaus, AM 69080-900, Brazil; CETELI, Sector North of UFAM's Main Campus, Manaus, AM 69080-900, Brazil

18. Budrionis A, Plikynas D, Daniušis P, Indrulionis A. Smartphone-based computer vision travelling aids for blind and visually impaired individuals: A systematic review. Assist Technol 2020; 34:178-194. PMID: 32207640; DOI: 10.1080/10400435.2020.1743381.
Abstract
Given the growth in the numbers of visually impaired (VI) people in low-income countries, the development of affordable electronic travel aid (ETA) systems employing devices, sensors, and apps embedded in ordinary smartphones becomes a potentially cost-effective and reasonable all-in-one solution of utmost importance for the VI. This paper offers an overview of recent ETA research prototypes that employ smartphones for assisted orientation and navigation in indoor and outdoor spaces by providing additional information about the surrounding objects. Scientific achievements in the field were systematically reviewed using PRISMA methodology. Comparative meta-analysis showed how various smartphone-based ETA prototypes could assist with better orientation, navigation, and wayfinding in indoor and outdoor environments. The analysis found limited interest among researchers in combining haptic interfaces and computer vision capabilities in smartphone-based ETAs for the blind, few attempts to employ novel state-of-the-art computer vision methods based on deep neural networks, and no evaluations of existing off-the-shelf navigation solutions. These results were contrasted with findings from a survey of blind expert users on their problems in navigating in indoor and outdoor environments. This revealed a major mismatch between user needs and academic development in the field.
Affiliation(s)
- Andrius Budrionis: Department of Business Technologies and Entrepreneurship, Vilnius Gediminas Technical University, Vilnius, Lithuania; Norwegian Centre for E-health Research, University Hospital of North Norway, Tromsø, Norway
- Darius Plikynas: Department of Business Technologies and Entrepreneurship, Vilnius Gediminas Technical University, Vilnius, Lithuania
- Povilas Daniušis: Department of Business Technologies and Entrepreneurship, Vilnius Gediminas Technical University, Vilnius, Lithuania
- Audrius Indrulionis: Department of Business Technologies and Entrepreneurship, Vilnius Gediminas Technical University, Vilnius, Lithuania
Collapse
|