{"title":"为视障人士设计的基于情境的室内寻路系统","authors":"Eunjeong Ko, Jinsun Ju, Eun Yi Kim","doi":"10.1145/2049536.2049545","DOIUrl":null,"url":null,"abstract":"This paper presents an indoor wayfinding system to help the visually impaired finding their way to a given destination in an unfamiliar environment. The main novelty is the use of the user's situation as the basis for designing color codes to explain the environmental information and for developing the wayfinding system to detect and recognize such color codes. Actually, people would require different information according to their situations. Therefore, situation-based color codes are designed, including location-specific codes and guide codes. These color codes are affixed in certain locations to provide information to the visually impaired, and their location and meaning are then recognized using the proposed wayfinding system. Consisting of three steps, the proposed wayfinding system first recognizes the current situation using a vocabulary tree that is built on the shape properties of images taken of various situations. Next, it detects and recognizes the necessary codes according to the current situation, based on color and edge information. Finally, it provides the user with environmental information and their path through an auditory interface. 
To assess the validity of the proposed wayfinding system, we have conducted field test with four visually impaired, then the results showed that they can find the optimal path in real-time with an accuracy of 95%.","PeriodicalId":351090,"journal":{"name":"The proceedings of the 13th international ACM SIGACCESS conference on Computers and accessibility","volume":"58 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-10-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"11","resultStr":"{\"title\":\"Situation-based indoor wayfinding system for the visually impaired\",\"authors\":\"Eunjeong Ko, Jinsun Ju, Eun Yi Kim\",\"doi\":\"10.1145/2049536.2049545\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper presents an indoor wayfinding system to help the visually impaired finding their way to a given destination in an unfamiliar environment. The main novelty is the use of the user's situation as the basis for designing color codes to explain the environmental information and for developing the wayfinding system to detect and recognize such color codes. Actually, people would require different information according to their situations. Therefore, situation-based color codes are designed, including location-specific codes and guide codes. These color codes are affixed in certain locations to provide information to the visually impaired, and their location and meaning are then recognized using the proposed wayfinding system. Consisting of three steps, the proposed wayfinding system first recognizes the current situation using a vocabulary tree that is built on the shape properties of images taken of various situations. Next, it detects and recognizes the necessary codes according to the current situation, based on color and edge information. Finally, it provides the user with environmental information and their path through an auditory interface. 
To assess the validity of the proposed wayfinding system, we have conducted field test with four visually impaired, then the results showed that they can find the optimal path in real-time with an accuracy of 95%.\",\"PeriodicalId\":351090,\"journal\":{\"name\":\"The proceedings of the 13th international ACM SIGACCESS conference on Computers and accessibility\",\"volume\":\"58 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2011-10-24\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"11\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"The proceedings of the 13th international ACM SIGACCESS conference on Computers and accessibility\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/2049536.2049545\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"The proceedings of the 13th international ACM SIGACCESS conference on Computers and accessibility","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2049536.2049545","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Situation-based indoor wayfinding system for the visually impaired
This paper presents an indoor wayfinding system to help visually impaired people find their way to a given destination in an unfamiliar environment. The main novelty is the use of the user's situation as the basis both for designing color codes that convey environmental information and for developing the wayfinding system that detects and recognizes those codes. Because people require different information depending on their situation, situation-based color codes are designed, comprising location-specific codes and guide codes. These color codes are affixed at certain locations to provide information to the visually impaired, and their location and meaning are then recognized by the proposed wayfinding system. The system consists of three steps. First, it recognizes the current situation using a vocabulary tree built on the shape properties of images taken in various situations. Next, it detects and recognizes the codes relevant to that situation, based on color and edge information. Finally, it conveys the environmental information and the user's path through an auditory interface. To assess the validity of the proposed system, we conducted a field test with four visually impaired participants; the results showed that they could find the optimal path in real time with an accuracy of 95%.
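The abstract does not specify the actual palette or code vocabulary used in the paper, but the color-recognition step it describes can be sketched as nearest-palette-color classification followed by a lookup of the band sequence. The palette colors, code sequences, and their meanings below are illustrative assumptions, not the paper's actual codes:

```python
import numpy as np

# Assumed reference palette (RGB); the paper's real code colors are not given here.
PALETTE = {
    "red":    (255, 0, 0),
    "green":  (0, 255, 0),
    "blue":   (0, 0, 255),
    "yellow": (255, 255, 0),
}

# Hypothetical mapping from color-band sequences to messages, mimicking the
# paper's distinction between location-specific codes and guide codes.
CODE_MEANINGS = {
    ("red", "blue"):     "elevator",    # assumed location-specific code
    ("green", "yellow"): "turn right",  # assumed guide code
}

def nearest_color(pixel):
    """Classify an RGB pixel as the closest palette color (Euclidean distance)."""
    names = list(PALETTE)
    refs = np.array([PALETTE[n] for n in names], dtype=float)
    dists = np.linalg.norm(refs - np.asarray(pixel, dtype=float), axis=1)
    return names[int(np.argmin(dists))]

def decode_color_code(band_pixels):
    """Decode a sequence of sampled band colors into an environment message."""
    key = tuple(nearest_color(p) for p in band_pixels)
    return CODE_MEANINGS.get(key, "unknown code")
```

For example, two sampled bands that are approximately red then blue would decode to the assumed "elevator" message; in the actual system this text would be spoken through the auditory interface.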