1. Abidi MH, Noor Siddiquee A, Alkhalefah H, Srivastava V. A comprehensive review of navigation systems for visually impaired individuals. Heliyon 2024; 10:e31825. PMID: 38841448; PMCID: PMC11152936; DOI: 10.1016/j.heliyon.2024.e31825.
Abstract
Background: This review traces the evolution of navigation aids for visually impaired people, from traditional tools such as the white cane to contemporary electronic devices, and underlines their pivotal role in fostering safe mobility.
Objectives: The primary aim is to categorize and assess the wide range of available navigation assistance solutions. Emphasis is placed on technological advances, particularly electronic systems that employ sensors, AI, and feedback mechanisms. The review also underscores the emerging influence of smartphone-based solutions and navigation satellite systems in improving independence and quality of life for the visually impaired.
Methods: Navigation assistance solutions are grouped into four categories: Visual Imagery Systems, Non-Visual Data Systems, Map-Based Solutions, and 3D Sound Systems. The integration of diverse sensors, such as ultrasonic sensors and LiDAR, for obstacle detection and real-time feedback is scrutinized, as is the fusion of smartphone technology with sensors to deliver location-based assistance. The review also evaluates the functionality, efficacy, and cost-efficiency of navigation satellite systems.
Results: Navigation aids have evolved significantly, with modern electronic systems proving highly effective for obstacle detection and safe navigation. Smartphone-based solutions stand out for their convenience and portability, and navigation satellite systems show clear potential to enhance navigation assistance.
Conclusions: The review advocates continued innovation and technological integration in navigation tools to give visually impaired individuals greater independence and safe access to their surroundings, and stresses the importance of ongoing efforts to enhance their quality of life through future technological solutions.
Affiliation(s)
- Mustufa Haider Abidi
- Advanced Manufacturing Institute, King Saud University, Riyadh, 11421, Saudi Arabia
- King Salman Center for Disability Research, Riyadh, 11614, Saudi Arabia
- Arshad Noor Siddiquee
- King Salman Center for Disability Research, Riyadh, 11614, Saudi Arabia
- Department of Mechanical Engineering, Jamia Millia Islamia, New Delhi, 110025, India
- Hisham Alkhalefah
- Advanced Manufacturing Institute, King Saud University, Riyadh, 11421, Saudi Arabia
- King Salman Center for Disability Research, Riyadh, 11614, Saudi Arabia
- Vishwaraj Srivastava
- King Salman Center for Disability Research, Riyadh, 11614, Saudi Arabia
- National Centre for Flexible Electronics, Indian Institute of Technology-Kanpur, India
2. Gao B, Shao T, Tu H, Ma Q, Liu Z, Han T. Exploring Bimanual Haptic Feedback for Spatial Search in Virtual Reality. IEEE Transactions on Visualization and Computer Graphics 2024; 30:2422-2433. PMID: 38437136; DOI: 10.1109/tvcg.2024.3372045.
Abstract
Spatial search tasks are common and crucial in many Virtual Reality (VR) applications. Traditional methods to enhance spatial search performance often employ sensory cues such as visual, auditory, or haptic feedback. However, the design and use of bimanual haptic feedback delivered through two VR controllers for spatial search remains largely unexplored. In this work, we explored bimanual haptic feedback for spatial search tasks in VR, designing four types of feedback from different combinations of haptic properties. Two experiments evaluated its effectiveness for spatial direction guidance and search. The first experiment revealed that our proposed bimanual haptic schemes significantly improved the recognition of spatial directions, in both accuracy and speed, compared to spatial audio feedback. The second experiment found that bimanual haptic feedback performed comparably to, or even better than, a visual arrow, especially in reducing the angle of head movement and in finding targets behind the participant, which subjective feedback also supported. Based on these findings, we derived a set of design recommendations for spatial search using bimanual haptic feedback in VR.
3. Lian Y, Liu DE, Ji WZ. Survey and analysis of the current status of research in the field of outdoor navigation for the blind. Disabil Rehabil Assist Technol 2024; 19:1657-1675. PMID: 37402242; DOI: 10.1080/17483107.2023.2227224.
Abstract
PURPOSE: Given the diverse types and incomplete functionality of navigation aids for the blind, this article comprehensively reviews the current situation and research on technology for outdoor travel by blind and visually impaired people (BVIP), to provide a reference for related research on BVIP outdoor travel and blind navigation.
MATERIALS AND METHODS: We compiled articles related to blind navigation, of which 227 met the search criteria. From this initial set, 179 articles were selected and analysed from a technical point of view across five aspects of blind navigation: system equipment, data sources, guidance algorithms, optimization of related methods, and navigation maps.
RESULTS: Wearable assistive devices for the blind are the most heavily researched form, followed by handheld aids. RGB data from vision sensors are the most common source of information about the navigation environment. Object detection on image data is also particularly well represented among navigation algorithms and associated methods, indicating that computer vision has become an important line of study in blind navigation. Research on navigation maps, by contrast, is relatively scarce.
CONCLUSIONS: In the study and development of assistive equipment for BVIP, attributes such as lightness, portability, and efficiency will be prioritized. In light of the coming driverless era, research will focus on visual sensors and computer vision technologies that can aid navigation for the blind.
Affiliation(s)
- Yue Lian
- School of Civil Engineering and Mapping and Engineering, Jiangxi University of Technology, Ganzhou, Jiangxi, China
- De-Er Liu
- School of Civil Engineering and Mapping and Engineering, Jiangxi University of Technology, Ganzhou, Jiangxi, China
- Wei-Zhen Ji
- State Key Laboratory of Remote Sensing Science, Beijing Normal University, Beijing, China
4. Scalvini F, Bordeau C, Ambard M, Migniot C, Dubois J. Outdoor Navigation Assistive System Based on Robust and Real-Time Visual-Auditory Substitution Approach. Sensors (Basel) 2023; 24:166. PMID: 38203027; PMCID: PMC10781372; DOI: 10.3390/s24010166.
Abstract
Blindness affects millions of people worldwide, leading to difficulties in daily travel and a loss of independence due to a lack of spatial information. This article proposes a new navigation aid to help people with severe blindness reach their destination. Users are guided by a short 3D spatialised sound that indicates the target point to follow, combined with sonified information on potential obstacles in the vicinity. The proposed system relies on inertial sensors, GPS data, and cartographic knowledge of pedestrian paths to define the trajectory. In addition, visual cues are used to refine the trajectory with ground-floor and obstacle information, using a camera to provide 3D spatial information. The method is based on a deep learning approach, and the neural networks involved are evaluated on datasets of navigation sequences recorded from a pedestrian's point of view. The method achieves low latency and real-time processing without relying on remote connections, instead using a low-power embedded GPU and a multithreaded approach for video processing, sound generation, and acquisition. This system could significantly improve the quality of life and autonomy of blind people, allowing them to navigate reliably and efficiently in their environment.
Affiliation(s)
- Florian Scalvini
- Laboratory ImViA EA 7535, Université de Bourgogne, 21078 Dijon, France; (C.M.); (J.D.)
- Camille Bordeau
- LEAD, CNRS UMR 5022, Université de Bourgogne, 21078 Dijon, France; (C.B.); (M.A.)
- Maxime Ambard
- LEAD, CNRS UMR 5022, Université de Bourgogne, 21078 Dijon, France; (C.B.); (M.A.)
- Cyrille Migniot
- Laboratory ImViA EA 7535, Université de Bourgogne, 21078 Dijon, France; (C.M.); (J.D.)
- Julien Dubois
- Laboratory ImViA EA 7535, Université de Bourgogne, 21078 Dijon, France; (C.M.); (J.D.)
5. Wersényi G. Perception Accuracy of a Multi-Channel Tactile Feedback System for Assistive Technology. Sensors (Basel) 2022; 22:8962. PMID: 36433558; PMCID: PMC9695395; DOI: 10.3390/s22228962.
Abstract
Assistive technology uses multi-modal feedback devices, focusing on the visual, auditory, and haptic modalities. Tactile devices provide additional information via the sense of touch. Perception accuracy of vibrations depends on the spectral and temporal attributes of the signal, as well as on the body parts to which the transducers are attached. The widespread use of AR/VR devices, wearables, and gaming interfaces calls for information about the usability of feedback devices. This paper presents results of an experiment using an 8-channel tactile feedback system with vibrators placed on the wrists, arms, ankles, and forehead. Different vibration patterns were designed and presented as sinusoidal frequency bursts on 2, 4, and 8 channels. In total, 27 subjects reported their sensations on questionnaires and in informal feedback. Results indicate that 2 and 4 channels can be used simultaneously with high accuracy, and that the transducers' optimal placement (best sensitivity) is on the wrists, followed by the ankles. Arm and head positions were inferior and generally inadequate for signal presentation. For optimal performance, signal length should exceed 500 ms. Furthermore, the amplitude level and temporal pattern of the presented signals, rather than the vibration frequency, should be used to carry information.
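To make the signal-design findings above concrete, here is a minimal sketch of how sinusoidal vibration bursts of the kind described could be generated in software. The sampling rate, burst parameters, and function names are illustrative assumptions, not taken from the paper.

```python
import math

def sine_burst(freq_hz, duration_ms, amplitude=1.0, rate_hz=8000):
    """Generate one sinusoidal vibration burst as a list of samples.
    The study suggests bursts longer than ~500 ms are perceived reliably."""
    n = int(rate_hz * duration_ms / 1000)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / rate_hz)
            for i in range(n)]

def pattern(bursts, gap_ms=100, rate_hz=8000):
    """Concatenate bursts with silent gaps to form a temporal pattern,
    since amplitude and timing (not frequency) should carry the information."""
    silence = [0.0] * int(rate_hz * gap_ms / 1000)
    out = []
    for b in bursts:
        out += b + silence
    return out
```

A 500 ms burst at a fixed frequency would then satisfy the minimum-length recommendation, with information varied through amplitude and burst timing rather than through the vibration frequency itself.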
Affiliation(s)
- György Wersényi
- Department of Telecommunications, Széchenyi István University, H-9026 Gyor, Hungary
6. E-Textiles for Sports: A Systematic Review. Journal of Biomimetics, Biomaterials and Biomedical Engineering 2022. DOI: 10.4028/p-e03md3.
Abstract
This work presents a systematic review providing an overview of the possibilities for coupling, fabricating, or embedding electronics into textiles while ensuring that these products meet the requirements of a sports modality. The development of textile-based smart wearable systems for sports attracts more and more users, motivated by design, by technology, and by the expectation of increased performance. A bibliographic search was carried out in the following databases: Scopus, Web of Science, IEEE Xplore, and Science Direct. This study includes 32 articles and discusses them within a new taxonomy with three dimensions: measured variable, type of feedback, and application. Of the 23 technologies surveyed, the review showed that these wearable systems are mainly used for vital-signs monitoring and for feedback on the electrical activity of the heart, with sensors mostly placed on the chest. The technologies are usually externally attachable rather than embedded in the textile. We observed that design as a process in the development of e-textile products is still only scarcely present in these studies.
7. Hersh M. Wearable Travel Aids for Blind and Partially Sighted People: A Review with a Focus on Design Issues. Sensors (Basel) 2022; 22:5454. PMID: 35891128; PMCID: PMC9324285; DOI: 10.3390/s22145454.
Abstract
The ability to travel (independently) is very important for participation in education, work, leisure activities, and all other aspects of modern life. Blind and partially sighted people experience a number of barriers to travel, including inaccessible information and environments, and consequently require support from technology or other people to overcome them. Despite the potential of advanced technologies and the development of electronic travel aids, the long cane and the guide dog remain the most commonly used solutions. Wearable technologies are becoming increasingly popular. They have the particular advantage of keeping the hands free, thereby facilitating the use of a long cane, guide dog, or another device at the same time, and they also have the potential to change the ways in which users interact with the environment. The main contributions of this paper are a survey of the current state of the art of travel aids from a design perspective and an investigation of the following issues: (1) the important design issues in wearable travel aids and the extent to which they are taken into account in different devices; (2) the relationship, if any, between where and how travel aids are worn and their design, features, and functions; (3) the limitations of existing devices, gaps in provision, and future research directions, particularly with regard to meeting potential users' needs.
Affiliation(s)
- Marion Hersh
- Biomedical Engineering, University of Glasgow, Glasgow G12 8QQ, Scotland, UK
8. An Extended Usability and UX Evaluation of a Mobile Application for the Navigation of Individuals with Blindness and Visual Impairments Outdoors: An Evaluation Framework Based on Training. Sensors (Basel) 2022; 22:4538. PMID: 35746320; PMCID: PMC9227192; DOI: 10.3390/s22124538.
Abstract
Navigation assistive technologies have been designed to support the mobility of people who are blind or visually impaired during independent navigation by providing sensory augmentation, spatial information, and general awareness of their environment. This paper presents an extended Usability and User Experience (UX) evaluation of BlindRouteVision, an outdoor navigation smartphone application that aims to solve problems related to the pedestrian navigation of visually impaired people without the aid of guides. The proposed system consists of an Android application that interacts with an external high-accuracy GPS sensor tracking pedestrian mobility in real time, a second external device designed to be mounted on traffic lights to identify traffic-light status, and an ultrasonic sensor for detecting near-field obstacles along the user's route. During outdoor navigation, it can optionally incorporate public means of transport and offers further functions such as placing a phone call and reporting the user's current location in an emergency. We present Usability and UX findings for the proposed system from a pilot study with 30 participants of varying degrees of blindness, along with feedback for improving both the application's functionality and the process by which blind users learn its features. The study used standardized questionnaires and semi-structured interviews. The evaluation took place after participants were introduced to the system's functionality through specialized user-centered training sessions built around a training version of the application that simulates routes. The results indicate an overall positive attitude from the users.
9. Kim K, Kim S, Choi A. Ultrasonic Sound Guide System with Eyeglass Device for the Visually Impaired. Sensors (Basel) 2022; 22:3077. PMID: 35459062; PMCID: PMC9030799; DOI: 10.3390/s22083077.
Abstract
The proposed ultrasonic sound guide system is an audio broadcasting system based on inaudible ultrasonic sound that assists indoor and outdoor navigation for the visually impaired. Transmitters placed at points of interest propagate a frequency-modulated voice signal in the ultrasonic range. A dual-channel receiver, carried by the visually impaired person in the form of eyeglasses, picks up the ultrasonic sound and recovers the voice signal via demodulation. Because ultrasonic sound retains ordinary acoustic properties, its velocity, directivity, attenuation, and superposition give the user acoustic cues for localizing multiple transmitter positions through binaural localization. The user hears the designated voice signal and follows its attributes to arrive at a specific location. Due to the low microphone gain when the receiver is addressed from the side, the time delay between the receiver channels shows high variance and high bias toward the end directions; nevertheless, the perception experiment shows higher prediction accuracy in the end directions than in the center direction. Overall evaluations show precise directional prediction in both narrow- and wide-angle situations. The system is a useful device for localizing nearby places without touching braille.
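The binaural localization this abstract relies on can be illustrated with the standard far-field relation between inter-channel time delay and source azimuth. The speed of sound, microphone spacing, and function name below are illustrative assumptions, not values from the paper.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, at room temperature (assumed)

def direction_from_delay(delay_s, mic_spacing_m=0.15):
    """Estimate the azimuth (degrees, 0 = straight ahead) of a source
    from the inter-channel time delay of a two-channel receiver, using
    the far-field approximation sin(theta) = c * dt / d."""
    s = SPEED_OF_SOUND * delay_s / mic_spacing_m
    # Clamp: toward the end directions the estimate saturates, which is
    # consistent with the high variance/bias the abstract reports there.
    s = max(-1.0, min(1.0, s))
    return math.degrees(math.asin(s))
```

Because the arcsine flattens near its ends, small timing errors translate into large angular errors at the extremes, one plausible reading of the end-direction behaviour the study discusses.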
10. Outdoor Localization Using BLE RSSI and Accessible Pedestrian Signals for the Visually Impaired at Intersections. Sensors (Basel) 2022; 22:371. PMID: 35009910; PMCID: PMC8749544; DOI: 10.3390/s22010371.
Abstract
One of the major challenges for blind and visually impaired (BVI) people is safely crossing intersections on foot. Many countries now provide audible signals at crossings to help with this problem. However, these accessible pedestrian signals can confuse visually impaired people, who may not know which signal to follow when navigating multiple crossings in complex road layouts. To address this, we propose an assistive system called CAS (Crossing Assistance System), which extends the use of the BLE (Bluetooth Low Energy) RSSI (Received Signal Strength Indicator) from indoor location tracking to the outdoors, overcoming the intrinsic limitation of outdoor noise to locate the user effectively. We installed the system at a real-world intersection and collected data to demonstrate the feasibility of outdoor RSSI tracking in two studies. In the first study, our goal was to show the feasibility of outdoor RSSI localization over four zones; a k-nearest neighbors (kNN) method achieved 99.8% accuracy. In the second study, we extended the work to a more complex setup with nine zones and evaluated both kNN and a Support Vector Machine (SVM) with various RSSI features for classification. The SVM performed best using the average, standard deviation, median, and interquartile range (IQR) of the RSSI over a 5 s window, localizing people with 97.7% accuracy. We conclude by discussing how our system can impact navigation for BVI users in outdoor and indoor settings, and the implications of these findings for the design of both wearable and traffic-mounted assistive technology for blind pedestrian navigation.
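As a rough illustration of the feature-plus-classifier pipeline described above, the sketch below computes the four reported RSSI window features (average, standard deviation, median, IQR) and classifies a window with a plain kNN. The dBm values, zone labels, quartile convention, and function names are illustrative assumptions, and the study's best-performing SVM is replaced here by kNN for brevity.

```python
from statistics import mean, stdev, median
from collections import Counter

def rssi_features(window):
    """Summarize a window of RSSI samples (dBm) as the four features
    the study reports: average, standard deviation, median, and IQR."""
    s = sorted(window)
    n = len(s)
    q1, q3 = s[n // 4], s[(3 * n) // 4]  # crude quartile convention (assumed)
    return (mean(window), stdev(window), median(window), q3 - q1)

def knn_classify(train, query, k=3):
    """Plain k-nearest-neighbours over feature tuples (Euclidean distance).
    `train` is a list of (feature_tuple, zone_label) pairs."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(train, key=lambda item: dist(item[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]
```

In a deployment, `train` would hold labelled feature vectors collected at each intersection zone, and each new 5 s window of beacon readings would be summarized and classified the same way.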
11. Wearable Urban Mobility Assistive Device for Visually Impaired Pedestrians Using a Smartphone and a Tactile-Foot Interface. Sensors (Basel) 2021; 21:5274. PMID: 34450714; PMCID: PMC8398265; DOI: 10.3390/s21165274.
Abstract
This paper reports on the progress of a wearable assistive technology (AT) device designed to enhance the independent, safe, and efficient mobility of blind and visually impaired pedestrians in outdoor environments. The device exploits the smartphone's positioning and computing capabilities to locate and guide users through urban settings. The navigation instructions needed to reach a destination are encoded as vibration patterns conveyed to the user via a foot-placed tactile interface. To determine the performance of the proposed AT device, two user experiments were conducted. In the first, a group of 20 sighted volunteers was asked to recognize the feedback provided by the tactile-foot interface; recognition rates exceeded 93%. The second experiment involved two blind volunteers, who were assisted in finding target destinations along public urban pathways. The results show that the subjects successfully accomplished the task and suggest that blind and visually impaired pedestrians may find the AT device and its approach useful, friendly, fast to master, and easy to use.
12. Elgendy M, Sik-Lanyi C, Kelemen A. A Novel Marker Detection System for People with Visual Impairment Using the Improved Tiny-YOLOv3 Model. Computer Methods and Programs in Biomedicine 2021; 205:106112. PMID: 33915507; DOI: 10.1016/j.cmpb.2021.106112.
Abstract
BACKGROUND AND OBJECTIVE: Daily activities such as shopping and navigating indoors are challenging for people with visual impairment, and researchers have sought various solutions to help them navigate indoors and outdoors.
METHODS: We applied deep learning to help visually impaired people navigate indoors using markers. We propose a system that detects markers and supports indoor navigation with an improved Tiny-YOLOv3 model. A dataset was created by collecting marker images from recorded videos and augmenting them with image-processing techniques such as rotation, brightness adjustment, and blurring. After training and validation, the model's performance was tested on a held-out dataset and on real videos.
RESULTS: The contributions of this paper are: (1) a navigation system that helps people with visual impairment navigate indoors using markers; (2) an implemented and tested deep learning model that detects ArUco markers in challenging situations using Tiny-YOLOv3; (3) several modified versions of the original model, implemented and compared to improve detection accuracy. The modified Tiny-YOLOv3 model achieved 99.31% accuracy in challenging conditions, versus 96.11% for the original model.
CONCLUSION: The training and testing results show that the improved Tiny-YOLOv3 models are superior to the original model.
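A minimal, dependency-free sketch of the kind of augmentation pipeline the Methods describe (rotation, brightness, blur) on a grayscale image stored as a list of rows. The exact transforms, parameters, and function names are illustrative assumptions, not the authors' implementation.

```python
def rotate90(img):
    """Rotate a 2D grayscale image (list of rows) 90 degrees clockwise."""
    return [list(row) for row in zip(*img[::-1])]

def brighten(img, gain):
    """Scale pixel intensities by `gain`, clipping to the 0-255 range."""
    return [[min(255, max(0, int(p * gain))) for p in row] for row in img]

def box_blur(img):
    """3x3 mean filter; edge pixels are left unchanged for simplicity."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1)) // 9
    return out

def augment(img):
    """Yield simple variants of one marker image, mirroring the rotation /
    brightness / blur augmentations the study describes."""
    yield rotate90(img)
    yield brighten(img, 1.3)
    yield brighten(img, 0.7)
    yield box_blur(img)
```

Each collected marker frame would thus expand into several training samples, which is the point of the augmentation step: exposing the detector to rotated, over/under-exposed, and blurred views of the same marker.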
Affiliation(s)
- Mostafa Elgendy
- Department of Electrical Engineering and Information Systems, University of Pannonia, 8200 Veszprém, Hungary; Department of Computer Science, Faculty of Computers and Artificial Intelligence, Benha University, Benha 13511, Egypt.
- Cecilia Sik-Lanyi
- Department of Electrical Engineering and Information Systems, University of Pannonia, 8200 Veszprém, Hungary.
- Arpad Kelemen
- Department of Organizational Systems and Adult Health, University of Maryland, Baltimore, MD, 21201, USA.
13. El-taher FEZ, Taha A, Courtney J, Mckeever S. A Systematic Review of Urban Navigation Systems for Visually Impaired People. Sensors (Basel) 2021; 21:3103. PMID: 33946857; PMCID: PMC8125253; DOI: 10.3390/s21093103.
Abstract
Blind and visually impaired people (BVIP) face a range of practical difficulties when undertaking outdoor journeys as pedestrians. Over the past decade, a variety of assistive devices have been researched and developed to help BVIP navigate more safely and independently. In addition, research in overlapping domains is addressing the problem of automatic environment interpretation using computer vision and machine learning, particularly deep learning. Our aim in this article is to present a comprehensive review of research directly in, or relevant to, assistive outdoor navigation for BVIP. We break the navigation problem down into a series of phases and tasks, and use this structure for a systematic review of the research, analysing articles, methods, datasets, and current limitations by task. We also provide an overview of commercial and non-commercial navigation applications targeted at BVIP. Our review contributes to the body of knowledge by providing a comprehensive, structured analysis of work in the domain, including the state of the art, and guidance on future directions. It will support both researchers and other stakeholders in establishing an informed view of research progress.
Affiliation(s)
- Fatma El-zahraa El-taher
- School of Computer Science, Technological University Dublin, D07EWV4 Dublin, Ireland; (F.E.-z.E.-t.); (A.T.); (J.C.)
- Ayman Taha
- School of Computer Science, Technological University Dublin, D07EWV4 Dublin, Ireland; (F.E.-z.E.-t.); (A.T.); (J.C.)
- Faculty of Computers and Artificial Intelligence, Cairo University, Cairo 12613, Egypt
- Jane Courtney
- School of Computer Science, Technological University Dublin, D07EWV4 Dublin, Ireland; (F.E.-z.E.-t.); (A.T.); (J.C.)
- Susan Mckeever
- School of Computer Science, Technological University Dublin, D07EWV4 Dublin, Ireland; (F.E.-z.E.-t.); (A.T.); (J.C.)
14. A Multi-Agent System for Data Fusion Techniques Applied to the Internet of Things Enabling Physical Rehabilitation Monitoring. Applied Sciences (Basel) 2020. DOI: 10.3390/app11010331.
Abstract
More than 800 million people in the world live with chronic diseases, and many of them do not have easy access to healthcare facilities for recovery. Telerehabilitation seeks to provide a solution to this problem; in the literature, it has been treated as a form of medical assistance that draws on technologies such as the Internet of Things and virtual reality. The main objective of this work is to design a distributed platform to monitor the patient's movements and status during rehabilitation exercises, so that the information can later be processed and analyzed remotely by the doctor assigned to the patient. In this way, the doctor can follow the patient's progress, enhancing the improvement and recovery process. To achieve this, a case study was carried out using a PANGEA-based multi-agent system that coordinates the different parts of the architecture through ubiquitous computing techniques. In addition, the system gives the patient real-time feedback, making patients aware of their errors so that they can improve their performance in later executions. An evaluation with real patients achieved promising results.
15. A Sensor Fusion Based Nonholonomic Wheeled Mobile Robot for Tracking Control. Sensors (Basel) 2020; 20:7055. PMID: 33317173; PMCID: PMC7764409; DOI: 10.3390/s20247055.
Abstract
In this paper, a detailed design procedure for real-time trajectory tracking of a nonholonomic wheeled mobile robot (NWMR) is proposed. A 9-axis micro-electro-mechanical-systems (MEMS) inertial measurement unit (IMU) measures the posture of the NWMR, while the positions of the NWMR and of a hand-held device are acquired via GPS and transmitted over a radio frequency (RF) module. To avoid the gimbal lock that arises when posture is computed from Euler angles, quaternions are used to compute the posture of the NWMR. Furthermore, a Kalman filter is used to filter out GPS readout noise and estimate the position of the NWMR for tracking the target. Simulation results show that the posture error between the NWMR and the hand-held device converges to zero after 3.928 seconds of dynamic tracking. Finally, experimental results demonstrate the validity and feasibility of the proposed approach.
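The gimbal-lock issue mentioned above is avoided because quaternion updates never pass through Euler angles. A minimal first-order gyro-integration sketch follows; the update rule, step size, and function names are illustrative assumptions, not the paper's implementation.

```python
import math

def quat_mult(q, r):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def integrate_gyro(q, omega, dt):
    """First-order posture update from body angular rate omega (rad/s,
    3-vector): q <- q * (1, 0.5*dt*omega), renormalized each step, so the
    attitude stays valid for any orientation (no gimbal lock)."""
    wx, wy, wz = omega
    dq = (1.0, 0.5 * dt * wx, 0.5 * dt * wy, 0.5 * dt * wz)
    q = quat_mult(q, dq)
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)
```

Euler angles, if needed for display, can still be extracted from the final quaternion; they are simply never used as the integration state.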
16. Solar-Powered Deep Learning-Based Recognition System of Daily Used Objects and Human Faces for Assistance of the Visually Impaired. Energies 2020. DOI: 10.3390/en13226104.
Abstract
This paper introduces a novel low-cost solar-powered wearable assistive technology (AT) device that provides continuous, real-time object recognition to help visually impaired (VI) people find objects in daily life. The system consists of three major components: a miniature low-cost camera, a system-on-module (SoM) computing unit, and an ultrasonic sensor. The first is worn on the user’s eyeglasses and acquires real-time video of the nearby space. The second is worn as a belt and runs deep learning-based methods and spatial algorithms that process the video from the camera to perform object detection and recognition. The third assists in positioning the objects found in the surrounding space. The device provides audible descriptive sentences as feedback, naming the objects recognized and their position relative to the user’s gaze. Following a power consumption analysis, a wearable solar harvesting system integrated with the AT device was designed and tested to extend energy autonomy across the different operating modes and scenarios. Experimental results obtained with the developed low-cost AT device demonstrated accurate and reliable real-time object identification, with an 86% correct recognition rate and a 215 ms average image processing interval (in the high-speed SoM operating mode). The proposed system recognizes the 91 objects in the Microsoft Common Objects in Context (COCO) dataset plus several custom objects and human faces. In addition, a simple and scalable methodology for assembling image datasets and training convolutional neural networks (CNNs) is introduced to add objects to the system and extend its repertory. It is also demonstrated that comprehensive training with 100 images per targeted object achieves an 89% recognition rate, while fast training with only 12 images achieves an acceptable recognition rate of 55%.
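The audible feedback stage, which turns a recognized object plus an ultrasonic range reading into a descriptive sentence, can be sketched as follows. The direction thresholds and phrasing are illustrative assumptions, not the paper's actual output format:

```python
def describe_detection(label, bbox_center_x, image_width, distance_m):
    """Compose a spoken sentence from one detection: the object label,
    its horizontal position relative to the camera (i.e., the user's gaze),
    and the distance reported by the ultrasonic sensor."""
    rel = bbox_center_x / image_width  # 0.0 = far left, 1.0 = far right
    if rel < 0.4:
        side = "to your left"
    elif rel > 0.6:
        side = "to your right"
    else:
        side = "ahead of you"
    return f"{label} {side}, about {distance_m:.1f} meters away"

sentence = describe_detection("cup", 480, 640, 1.2)
```

A sentence like this would then be passed to a text-to-speech engine on the SoM unit.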
|
17
|
Development of Test Equipment for Pedestrian-Automatic Emergency Braking Based on C-NCAP (2018). SENSORS 2020; 20:s20216206. [PMID: 33143319 PMCID: PMC7663537 DOI: 10.3390/s20216206] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Received: 08/07/2020] [Revised: 09/22/2020] [Accepted: 09/23/2020] [Indexed: 12/04/2022]
Abstract
To evaluate the effectiveness of a pedestrian-automatic emergency braking (PAEB) system for pedestrian protection, a set of PAEB test equipment was developed in this study according to the test requirements of the China-New Car Assessment Program (C-NCAP) (2018). In the system control strategy, differential global positioning system (GPS) positioning provides the required measurement and positioning accuracy, wireless communication coordinates the PAEB test equipment with the automated driving robot (ADR), and an S-shaped-curve velocity control method governs the motion of the dummy target. Simulations and field tests were conducted according to the scenario requirements specified in C-NCAP (2018). The experimental and simulated results showed that the test equipment achieved high accuracy and precision during testing and that the dummy target moved smoothly and stably, complying with the PAEB test requirements set forth in C-NCAP (2018) and performing as designed. Subsequently, the automatic emergency braking performance of a vehicle under test (VUT) was evaluated and its star-rating score calculated. The results indicated that the test equipment developed in this study can effectively evaluate the performance of PAEB systems.
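The S-shaped-curve velocity control mentioned above can be illustrated with a logistic profile. The logistic form and the steepness constant are assumptions for illustration, since the abstract does not give the exact curve used:

```python
import math

def s_curve_velocity(t, v_max, t_total):
    """Logistic S-curve velocity profile: the dummy target ramps smoothly
    from (near) standstill to v_max over [0, t_total], avoiding the jerk
    of a step or purely trapezoidal command."""
    k = 12.0 / t_total  # assumed steepness: ~99.8% of v_max is reached by t_total
    return v_max / (1.0 + math.exp(-k * (t - t_total / 2.0)))
```

The profile is symmetric about the midpoint, where the commanded velocity is exactly half of v_max.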
|
18
|
Griffin E, Picinali L, Scase M. The effectiveness of an interactive audio-tactile map for the process of cognitive mapping and recall among people with visual impairments. Brain Behav 2020; 10:e01650. [PMID: 32445295 PMCID: PMC7375097 DOI: 10.1002/brb3.1650] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Received: 07/03/2019] [Revised: 03/19/2020] [Accepted: 03/21/2020] [Indexed: 11/11/2022] Open
Abstract
BACKGROUND People with visual impairments can experience numerous challenges navigating unfamiliar environments, and systems that operate as prenavigation tools can assist them. This mixed-methods study examined the effectiveness of an interactive audio-tactile map tool on the process of cognitive mapping and recall among people who were blind or had visual impairments. The tool was developed with the involvement of visually impaired individuals, who provided further feedback throughout this research. METHODS A mixed-methods experimental design was employed. Fourteen participants were allocated either to an experimental group exposed to an audio-tactile map or to a control group exposed to a verbally annotated tactile map. After five minutes' exposure, multiple-choice questions examined participants' recall of the spatial and navigational content. Subsequent semi-structured interviews examined their views on the study and the product. RESULTS The experimental group had significantly better overall recall than the control group and higher average scores in all four areas examined by the questions. The interviews suggested that the interactive component gave individuals the freedom to learn the map in several ways rather than restricting them to a sequential, linear approach. CONCLUSION Assistive technology can reduce the challenges faced by people with visual impairments, and the flexible learning approach offered by the audio-tactile map may be of particular value. Future researchers and assistive technology developers may wish to explore this further.
Affiliation(s)
- Edward Griffin
- School of Nursing and Midwifery, De Montfort University, Leicester, UK
- Lorenzo Picinali
- Dyson School of Design Engineering, Imperial College London, London, UK
- Mark Scase
- Division of Psychology, De Montfort University, Leicester, UK
|
19
|
Satpute SA, Canady JR, Klatzky RL, Stetten GD. FingerSight: A Vibrotactile Wearable Ring for Assistance With Locating and Reaching Objects in Peripersonal Space. IEEE TRANSACTIONS ON HAPTICS 2020; 13:325-333. [PMID: 31603801 DOI: 10.1109/toh.2019.2945561] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Indexed: 05/22/2023]
Abstract
This paper describes a prototype guidance system, "FingerSight," to help people without vision locate and reach objects in peripersonal space. It consists of four evenly spaced tactors embedded in a ring worn on the index finger, with a small camera mounted on top. Computer-vision analysis of the camera image controls vibrotactile feedback, guiding users to move their hand toward nearby targets. Two experiments tested the functionality of the prototype system. The first found that participants could discriminate between five different vibrotactile stimuli (each of the four tactors individually, and all simultaneously) with a mean accuracy of 88.8% after initial training. In the second experiment, blindfolded participants were instructed to move the hand wearing the device to one of four locations within arm's reach while their hand trajectories were tracked. The tactors were controlled using two different strategies: (1) repeatedly signal the axis with the largest error, or (2) signal both axes in alternation. Participants produced essentially straight-line trajectories toward the target under both strategies, but the temporal parameters (rate of approach, duration) showed an advantage for correcting both axes in sequence.
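The two tactor control strategies compared in the second experiment can be sketched as a selection function. The tactor naming and sign conventions below are illustrative assumptions:

```python
def select_tactor(error_x, error_y, strategy, step):
    """Pick which of four ring tactors to fire, given the 2D hand-to-target
    error from the vision system.
    strategy "largest":   repeatedly signal the axis with the larger absolute error.
    strategy "alternate": signal the two axes in alternation (by step parity)."""
    if strategy == "largest":
        axis = "x" if abs(error_x) >= abs(error_y) else "y"
    else:  # "alternate"
        axis = "x" if step % 2 == 0 else "y"
    if axis == "x":
        return "right" if error_x > 0 else "left"
    return "up" if error_y > 0 else "down"
```

Under "largest" the same axis keeps firing until its error shrinks below the other's; under "alternate" both axes are corrected in sequence regardless of magnitude, which is the variant the paper found temporally advantageous.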
|
20
|
Identification of Markers in Challenging Conditions for People with Visual Impairment Using Convolutional Neural Network. APPLIED SCIENCES-BASEL 2019. [DOI: 10.3390/app9235110] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.8] [Indexed: 11/16/2022]
Abstract
People with visual impairment face many difficulties in their daily activities, and several studies have sought smart solutions based on mobile devices to help them perform everyday tasks. This paper focuses on assistive technology for marker-based indoor navigation. The essential steps of a typical navigation system are identifying the current location, finding the shortest path to the destination, and navigating safely to the destination using navigation feedback. In this research, the authors propose a system to help people with visual impairment navigate indoors using markers, in which the identification step is recast as a classification problem solved with convolutional neural networks. The main contributions of this paper are: (1) a system to help people with visual impairment navigate indoors using markers; (2) a comparison of QR codes with ArUco markers showing that ArUco markers work better; (3) a convolutional neural network, implemented and simplified to detect candidate markers in challenging conditions and improve response time; and (4) a comparison of the proposed model with another model showing that it achieves better training and testing accuracy.
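Recasting marker identification as classification amounts to matching an observed bitmap against a known dictionary. A dependency-free sketch using nearest Hamming distance follows; the toy 4x4 patterns are illustrative, not real ArUco codes, and the matcher stands in for the paper's CNN classifier:

```python
import numpy as np

# A toy marker "dictionary" of 4x4 binary patterns (illustrative only).
MARKERS = {
    0: np.array([[1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1]]),
    1: np.array([[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 1, 1], [0, 0, 1, 1]]),
    2: np.array([[1, 1, 1, 1], [1, 0, 0, 1], [1, 0, 0, 1], [1, 1, 1, 1]]),
}

def identify_marker(observed):
    """Classify a (possibly noisy) marker bitmap by the smallest Hamming
    distance to any dictionary pattern; returns (marker_id, distance)."""
    best_id, best_dist = None, None
    for marker_id, pattern in MARKERS.items():
        dist = int(np.sum(observed != pattern))
        if best_dist is None or dist < best_dist:
            best_id, best_dist = marker_id, dist
    return best_id, best_dist
```

A marker corrupted by a single flipped bit (one "challenging condition") is still assigned the correct identity, because the true pattern remains the nearest neighbor.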
|
21
|
Zhang X, Zhang H, Zhang L, Zhu Y, Hu F. Double-Diamond Model-Based Orientation Guidance in Wearable Human-Machine Navigation Systems for Blind and Visually Impaired People. SENSORS 2019; 19:s19214670. [PMID: 31661798 PMCID: PMC6864851 DOI: 10.3390/s19214670] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Received: 08/31/2019] [Revised: 10/06/2019] [Accepted: 10/24/2019] [Indexed: 11/17/2022]
Abstract
This paper presents the analysis and design of a new wearable orientation guidance device for modern travel aid systems for blind and visually impaired people. The four-stage double-diamond design model was applied throughout the design process to achieve human-centric innovation and to ensure technical feasibility and economic viability. Consequently, a sliding tactile feedback wristband was designed and prototyped. Furthermore, a Bezier curve-based adaptive path planner is proposed to guarantee collision-free planned motion. Proof-of-concept experiments were conducted in both virtual and real-world scenarios. The evaluation results confirmed the efficiency and feasibility of the design and suggest its considerable potential for spatial perception rehabilitation.
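A Bezier curve-based planner builds smooth paths from a handful of control points; evaluating a cubic segment in Bernstein form is the core primitive. This is a generic sketch of that primitive, not the paper's adaptive planner:

```python
def bezier_point(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1].
    p0 and p3 are the endpoints; p1 and p2 are control points that
    shape the curve without (in general) lying on it."""
    u = 1.0 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return (x, y)

# Sample a short arc from (0, 0) to (1, 0) bulging toward y = 1.
path = [bezier_point((0, 0), (0, 1), (1, 1), (1, 0), i / 10) for i in range(11)]
```

An adaptive planner would adjust the interior control points (for example, to pull the curve away from detected obstacles) while the endpoints stay fixed at the current and goal positions.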
Affiliation(s)
- Xiaochen Zhang
- Department of Industrial Design, Guangdong University of Technology, Guangzhou 510006, China.
- Hui Zhang
- Department of Industrial Design, Guangdong University of Technology, Guangzhou 510006, China.
- Linyue Zhang
- School of Communication and Design, Sun Yat-Sen University, Guangzhou 510275, China.
- Yi Zhu
- Department of Industrial Design, Guangdong University of Technology, Guangzhou 510006, China.
- School of Industrial Design, Georgia Institute of Technology, Atlanta, GA 30332, USA.
- Fei Hu
- Department of Industrial Design, Guangdong University of Technology, Guangzhou 510006, China.
|
22
|
Silva MC, Amorim VJP, Ribeiro SP, Oliveira RAR. Field Research Cooperative Wearable Systems: Challenges in Requirements, Design and Validation. SENSORS (BASEL, SWITZERLAND) 2019; 19:E4417. [PMID: 31614802 PMCID: PMC6832741 DOI: 10.3390/s19204417] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Received: 08/30/2019] [Revised: 10/06/2019] [Accepted: 10/08/2019] [Indexed: 12/13/2022]
Abstract
The widespread availability of wearable devices is turning them into cooperative systems. Communication and distribution technologies such as the Internet of Things, wireless body area networks, and local wireless networks provide the means to develop multi-device platforms. Nevertheless, the field research environment has a specific set of features that makes adopting this technology more difficult. In this text, we review the main aspects of field research gear and wearable devices, aiming to understand how to build cooperative systems based on wearable devices for the field research context. For a better understanding, we develop a case study in which we propose a cooperative system architecture and discuss its validation. To this end, we give an overview of a previous device architecture and study an integration proposal.
Affiliation(s)
- Mateus C Silva
- Department of Computing, Federal University of Ouro Preto, Morro do Cruzeiro Campus, Ouro Preto 35400-000, Brazil.
- Vicente J P Amorim
- Department of Computing, Federal University of Ouro Preto, Morro do Cruzeiro Campus, Ouro Preto 35400-000, Brazil.
- Sérvio P Ribeiro
- Department of Biology, Federal University of Ouro Preto, Morro do Cruzeiro Campus, Ouro Preto 35400-000, Brazil.
- Ricardo A R Oliveira
- Department of Computing, Federal University of Ouro Preto, Morro do Cruzeiro Campus, Ouro Preto 35400-000, Brazil.
|
23
|
An Industrial Micro-Defect Diagnosis System via Intelligent Segmentation Region. SENSORS 2019; 19:s19112636. [PMID: 31212594 PMCID: PMC6603651 DOI: 10.3390/s19112636] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Received: 05/08/2019] [Revised: 06/05/2019] [Accepted: 06/06/2019] [Indexed: 11/25/2022]
Abstract
In machine vision defect detection for micro workpieces, it is very important that the neural network produce complete masks for the segmented regions. When recognizing small workpieces, fatal defects often lie in borderline areas that are difficult to demarcate, and non-maximum suppression (NMS) based on intersection over union (IoU) loses crucial texture information, especially in cluttered and occluded detection areas. In this paper, simple linear iterative clustering (SLIC) is used to augment the mask and to calibrate the mask score. We propose an SLIC head for object instance segmentation in proposal regions (Mask R-CNN) containing a network block that learns the quality of the predicted masks. We found that the limited-region parallel K-means mechanism in the SLIC head improved the confidence of the mask score for our workpiece. A continuous fine-tuning mechanism was used to steadily improve model robustness on a large-scale production line. We built a detection system comprising an optical fiber locator, a telecentric lens system, matrix stereoscopic lighting, a rotating platform, and a neural network with an SLIC head. The accuracy of defect detection is effectively improved for micro workpieces with clutter and borderline areas.
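The limited-region K-means at the heart of SLIC clusters pixels in a joint color-and-position space. A minimal sketch of that idea follows; it searches all centers globally rather than using SLIC's restricted 2S x 2S neighborhoods, and the compactness weighting is an assumption:

```python
import numpy as np

def slic_like_kmeans(image, centers_xy, n_iter=5, compactness=1.0):
    """K-means over (intensity, x, y) features for a grayscale image.
    compactness scales the spatial coordinates: higher values yield more
    compact, grid-like superpixels; lower values follow intensity edges.
    centers_xy is a list of (x, y) seed positions; returns a label map."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    feats = np.stack([image.ravel(),
                      compactness * xs.ravel(),
                      compactness * ys.ravel()], axis=1).astype(float)
    centers = np.array([[image[y, x], compactness * x, compactness * y]
                        for (x, y) in centers_xy], dtype=float)
    for _ in range(n_iter):
        # Assign every pixel to its nearest center in the joint feature space.
        d = ((feats[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # Recompute each center as the mean of its assigned pixels.
        for k in range(len(centers)):
            if np.any(labels == k):
                centers[k] = feats[labels == k].mean(axis=0)
    return labels.reshape(h, w)

# Two-region test image: left half intensity 0, right half intensity 10.
img = np.zeros((4, 8))
img[:, 4:] = 10.0
labels = slic_like_kmeans(img, [(1, 1), (6, 2)], compactness=0.1)
```

With a low compactness, intensity dominates the distance, so the clustering recovers the two homogeneous regions; real SLIC additionally limits each pixel's search to nearby centers, which is what makes it fast and parallelizable.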
|
24
|
Active and Passive Haptic Perception of Shape: Passive Haptics Can Support Navigation. ELECTRONICS 2019. [DOI: 10.3390/electronics8030355] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.0] [Indexed: 11/17/2022]
Abstract
Real-time haptic interactions occur under two exploration modes: active and passive. In this paper, we present a series of experiments that evaluate the main perceptual characteristics of both exploration modes. In particular, we focus on haptic shape recognition, as it is a fundamental task in many applications of haptic environments. The results of four experiments conducted with a group of 10 volunteer subjects show that the differences in motor activity between active and passive haptics ease the perception of surfaces in the former case and the perception of pathways in the latter. In addition, the guiding nature of passive haptics makes the pathway direction easy to recognize. This last observation could find application in more challenging tasks such as navigation in space.
|