Development of Vision-Based Navigation for a Robotic Wheelchair

M. Bailey, A. Chanler, B. Maxwell, M. Micire, K. Tsui, H. Yanco

2007 IEEE 10th International Conference on Rehabilitation Robotics, June 13, 2007. DOI: 10.1109/ICORR.2007.4428538. Cited by 45.

Abstract: Our environment is replete with visual cues intended to guide human navigation. For example, there are building directories at entrances and room numbers next to doors. By developing a robot wheelchair system that can interpret these cues, we will create a more robust and more usable system. This paper describes the design and development of our robot wheelchair system, called Wheeley, and its vision-based navigation system. The robot wheelchair system uses stereo vision to build maps of the environment through which it travels; this map can then be annotated with information gleaned from signs. We also describe the planned integration of an assistive robot arm to help with pushing elevator buttons and opening door handles.