Sensors, Vol. 21, Pages 1536: A Wearable Navigation Device for Visually Impaired People Based on the Real-Time Semantic Visual SLAM System

Sensors, doi: 10.3390/s21041536
Authors: Chen, Liu, Kojima, Huang, Arai

Wearable auxiliary devices for visually impaired people are a highly attractive research topic. Although many proposed wearable navigation devices can assist visually impaired people with obstacle avoidance and navigation, these devices cannot feed back detailed information about the obstacles or help the visually impaired understand their environment. In this paper, we propose a wearable navigation device for the visually impaired that integrates a semantic visual SLAM (Simultaneous Localization And Mapping) system with a newly launched, powerful mobile computing platform. The system uses a structured-light Image-Depth (RGB-D) camera as the sensor and the mobile computing platform as the control center. We also focused on combining SLAM with the extraction of semantic information from the environment, so that the computing platform understands the surrounding environment in real time and can feed this information back to the visually impaired user in the form of a voice broadcast. Finally, we tested the performance of the proposed semantic visual SLAM system on this device. The results indicate that the system can run in real time on a wearable navigation device with sufficient accuracy.
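The abstract describes the device's pipeline only at a high level (RGB-D sensing, SLAM-based localization, semantic labeling, voice feedback), so the following is a minimal, hypothetical Python sketch of such a processing loop, not the authors' implementation. Every name here (grab_rgbd_frame, SlamTracker, detect_objects, announce) is an illustrative placeholder standing in for the camera driver, SLAM front end, semantic module, and text-to-speech back end.

import time
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class RGBDFrame:
    """One synchronized color + depth frame from the structured-light camera."""
    color: bytes                  # packed RGB pixels (placeholder)
    depth_m: List[List[float]]    # per-pixel depth in meters

@dataclass
class Detection:
    """A semantically labeled object with an estimated distance to the user."""
    label: str
    distance_m: float

class SlamTracker:
    """Placeholder for the visual SLAM front end (pose tracking and mapping)."""
    def track(self, frame: RGBDFrame) -> Tuple[float, float, float]:
        # A real tracker would estimate a 6-DoF camera pose; this stub returns x, y, yaw.
        return (0.0, 0.0, 0.0)

def grab_rgbd_frame() -> RGBDFrame:
    """Placeholder for the RGB-D camera driver."""
    return RGBDFrame(color=b"", depth_m=[[2.0]])

def detect_objects(frame: RGBDFrame) -> List[Detection]:
    """Placeholder for the semantic module (e.g., a lightweight object detector)."""
    return [Detection(label="chair",
                      distance_m=min(min(row) for row in frame.depth_m))]

def announce(message: str) -> None:
    """Placeholder for the voice-broadcast back end (text-to-speech)."""
    print(f"[TTS] {message}")

def navigation_loop(max_range_m: float = 3.0, iterations: int = 3) -> None:
    """Core loop: track the user's pose, label the scene, warn about nearby obstacles."""
    tracker = SlamTracker()
    for _ in range(iterations):
        frame = grab_rgbd_frame()
        pose = tracker.track(frame)            # localize the user in the map
        for det in detect_objects(frame):      # extract semantic information
            if det.distance_m <= max_range_m:  # only report objects within range
                announce(f"{det.label} ahead, about {det.distance_m:.1f} meters")
        time.sleep(0.03)                       # roughly 30 Hz frame budget

if __name__ == "__main__":
    navigation_loop()

On the actual device, the SLAM tracker and the semantic module would run on the mobile computing platform, and the announce step would drive a speaker or headset rather than standard output.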