{"title":"视障人士在受限空间中导航的传感器融合","authors":"C. S. Silva, P. Wimalaratne","doi":"10.1109/ICIAFS.2016.7946537","DOIUrl":null,"url":null,"abstract":"This work presents a multi-sensor fusion approach for an electronic navigation aid for the blind and visually impaired persons. This approach proposes to intelligently fuse the surrounding information senses via ultrasonic sensors and vision sensors. The intelligent component of the prototype serves in several facets including object detection and recognition. Extended Kalman filter is used to fuse the data emerging from homogeneous sensors and rule-based fusion is used to fuse the data from heterogeneous sensors. Feedback is provided via tactile and audio feedback. Critical obstacles of blind navigation like staircases are recognized by the Hough line detection in image processing. Rotations, which occur due to the body movements in the camera, correct using the fusion of data obtain by the inertial measurement unit which is connected to the camera. The results of the evaluations proved that the use of fusion of multiple homogeneous sensors improve the detection of a particular obstacle and fusion of vision and ultrasonic sensors improve the object detection identification of the obstacles. The current status of the work and the future developments are presented in this paper.","PeriodicalId":237290,"journal":{"name":"2016 IEEE International Conference on Information and Automation for Sustainability (ICIAfS)","volume":"27 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"12","resultStr":"{\"title\":\"Sensor fusion for visually impaired navigation in constrained spaces\",\"authors\":\"C. S. Silva, P. 
Wimalaratne\",\"doi\":\"10.1109/ICIAFS.2016.7946537\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This work presents a multi-sensor fusion approach for an electronic navigation aid for the blind and visually impaired persons. This approach proposes to intelligently fuse the surrounding information senses via ultrasonic sensors and vision sensors. The intelligent component of the prototype serves in several facets including object detection and recognition. Extended Kalman filter is used to fuse the data emerging from homogeneous sensors and rule-based fusion is used to fuse the data from heterogeneous sensors. Feedback is provided via tactile and audio feedback. Critical obstacles of blind navigation like staircases are recognized by the Hough line detection in image processing. Rotations, which occur due to the body movements in the camera, correct using the fusion of data obtain by the inertial measurement unit which is connected to the camera. The results of the evaluations proved that the use of fusion of multiple homogeneous sensors improve the detection of a particular obstacle and fusion of vision and ultrasonic sensors improve the object detection identification of the obstacles. 
The current status of the work and the future developments are presented in this paper.\",\"PeriodicalId\":237290,\"journal\":{\"name\":\"2016 IEEE International Conference on Information and Automation for Sustainability (ICIAfS)\",\"volume\":\"27 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"12\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2016 IEEE International Conference on Information and Automation for Sustainability (ICIAfS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICIAFS.2016.7946537\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 IEEE International Conference on Information and Automation for Sustainability (ICIAfS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICIAFS.2016.7946537","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Sensor fusion for visually impaired navigation in constrained spaces
This work presents a multi-sensor fusion approach for an electronic navigation aid for blind and visually impaired persons. The approach intelligently fuses information about the surroundings sensed via ultrasonic and vision sensors. The intelligent component of the prototype serves several functions, including object detection and recognition. An extended Kalman filter is used to fuse data from homogeneous sensors, while rule-based fusion is used to fuse data from heterogeneous sensors. Feedback is provided through tactile and audio channels. Critical obstacles in blind navigation, such as staircases, are recognized using Hough line detection in image processing. Camera rotations caused by body movements are corrected by fusing data obtained from an inertial measurement unit attached to the camera. The results of the evaluations show that fusing multiple homogeneous sensors improves the detection of a particular obstacle, and that fusing vision and ultrasonic sensors improves obstacle detection and identification. The current status of the work and future developments are presented in this paper.
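The homogeneous-sensor fusion described above can be illustrated with a minimal scalar Kalman-style update that combines range readings from two ultrasonic sensors observing the same obstacle. This is an illustrative sketch only: the paper uses an extended Kalman filter over its own state model, and the function names, measurement variances, and static-obstacle assumption here are not taken from the paper.

```python
# Minimal sketch: fusing two homogeneous (ultrasonic) range readings of the
# same static obstacle with a scalar Kalman measurement update.
# All parameter values below are illustrative assumptions.

def kalman_update(estimate, variance, measurement, meas_variance):
    """One scalar Kalman measurement update for a static state."""
    gain = variance / (variance + meas_variance)        # Kalman gain
    new_estimate = estimate + gain * (measurement - estimate)
    new_variance = (1.0 - gain) * variance
    return new_estimate, new_variance

def fuse_ultrasonic(readings, variances, prior=0.0, prior_var=1e6):
    """Sequentially fuse several range readings of the same obstacle.

    Starts from a near-uninformative prior, so the first reading
    dominates and later readings refine the estimate.
    """
    est, var = prior, prior_var
    for z, r in zip(readings, variances):
        est, var = kalman_update(est, var, z, r)
    return est, var

if __name__ == "__main__":
    # Two ultrasonic sensors report 1.25 m and 1.15 m with equal variance;
    # the fused estimate lands between them with reduced uncertainty.
    est, var = fuse_ultrasonic([1.25, 1.15], [0.04, 0.04])
    print(round(est, 3), round(var, 4))  # → 1.2 0.02
```

Note that the fused variance (0.02) is lower than either individual sensor's variance (0.04), which is the sense in which fusing multiple homogeneous sensors improves detection of a particular obstacle.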