Automatic construction of polygonal maps from point cloud data
Pub Date: 2010-07-26 | DOI: 10.1109/SSRR.2010.5981571
T. Wiemann, A. Nuchter, K. Lingemann, Stefan Stiene, J. Hertzberg
This paper presents a novel approach to creating polygonal maps from 3D point cloud data. The resulting map is augmented with an interpretation of the scene. Our procedure produces accurate maps of indoor environments quickly and reliably. These maps have been used successfully by different robots with varying sensor configurations for reliable self-localization.
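The abstract does not detail the reconstruction step. As a rough, hypothetical illustration of how planar structures can be pulled out of an indoor point cloud before polygonization, the sketch below uses greedy RANSAC plane segmentation in Python; the thresholds, function names, and synthetic data are assumptions, not the authors' implementation.

# Hypothetical sketch (not the authors' algorithm): greedily extract planar
# patches from a 3D point cloud with RANSAC; each patch could later be
# polygonized into a map face.  Thresholds and the synthetic data are assumptions.
import numpy as np

def fit_plane(p0, p1, p2):
    # Unit normal and offset of the plane through three points (None if degenerate).
    n = np.cross(p1 - p0, p2 - p0)
    length = np.linalg.norm(n)
    if length < 1e-9:
        return None
    n /= length
    return n, -np.dot(n, p0)

def ransac_plane(points, iters=200, tol=0.02, rng=np.random.default_rng(0)):
    # Return the plane model with the most inliers (point-plane distance < tol).
    best_model, best_inliers = None, np.zeros(len(points), dtype=bool)
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        model = fit_plane(*sample)
        if model is None:
            continue
        n, d = model
        inliers = np.abs(points @ n + d) < tol
        if inliers.sum() > best_inliers.sum():
            best_model, best_inliers = model, inliers
    return best_model, best_inliers

def segment_planes(points, min_support=500, max_planes=10):
    # Peel off dominant planes one by one until support becomes too small.
    planes, remaining = [], points
    for _ in range(max_planes):
        if len(remaining) < min_support:
            break
        model, inliers = ransac_plane(remaining)
        if model is None or inliers.sum() < min_support:
            break
        planes.append((model, remaining[inliers]))
        remaining = remaining[~inliers]
    return planes

# Example on a synthetic cloud of two perpendicular walls plus sensor noise.
rng = np.random.default_rng(1)
wall1 = np.column_stack([rng.uniform(0, 5, 2000), np.zeros(2000), rng.uniform(0, 3, 2000)])
wall2 = np.column_stack([np.zeros(2000), rng.uniform(0, 5, 2000), rng.uniform(0, 3, 2000)])
cloud = np.vstack([wall1, wall2]) + rng.normal(0, 0.005, (4000, 3))
for (n, d), patch in segment_planes(cloud):
    print("plane normal", np.round(n, 2), "support", len(patch))
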
{"title":"Automatic construction of polygonal maps from point cloud data","authors":"T. Wiemann, A. Nuchter, K. Lingemann, Stefan Stiene, J. Hertzberg","doi":"10.1109/SSRR.2010.5981571","DOIUrl":"https://doi.org/10.1109/SSRR.2010.5981571","url":null,"abstract":"This paper presents a novel approach to create polygonal maps from 3D point cloud data. The gained map is augmented with an interpretation of the scene. Our procedure produces accurate maps of indoor environments fast and reliably. These maps are successfully used by different robots with varying sensor configurations for reliable self localization.","PeriodicalId":371261,"journal":{"name":"2010 IEEE Safety Security and Rescue Robotics","volume":"94 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-07-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121444807","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Findings from NSF-JST-NIST Workshop on Rescue Robotics
Pub Date: 2010-07-26 | DOI: 10.1109/SSRR.2010.5981570
R. Murphy
This paper summarizes the findings and observations from the NSF-JST-NIST Workshop on Rescue Robotics held at Texas A&M, March 8-11, 2010. The 50 workshop participants represented sixteen universities in the USA, Japan, and China. Over a dozen land, marine, and aerial vehicles were tested using the Response Robot Evaluation Exercise #6 or in more scenario-oriented testing at Disaster City®. The workshop produced nine recommendations for standards for unmanned vehicles and proposed four topics for human-robot interaction evaluation. Beyond generating tangible contributions to the ASTM Standards meeting and to individual research programs, the workshop illustrated the benefits of interacting with more responders, robots, and other researchers than had previously been available to the community.
{"title":"Findings from NSF-JST-NIST Workshop on Rescue Robotics","authors":"R. Murphy","doi":"10.1109/SSRR.2010.5981570","DOIUrl":"https://doi.org/10.1109/SSRR.2010.5981570","url":null,"abstract":"This paper summarizes the findings and observations from the NSF-JST-NIST Workshop on Rescue Robotics held at Texas A&M, March 8-11, 2010. The 50 workshop participants represented sixteen universities in the USA, Japan, and China. Over a dozen land, marine, and aerial vehicles were tested using the Response Robot Evaluation Exercise #6 or in more scenario-oriented testing at Disaster City®. The workshop produced nine recommendations for standards for unmanned vehicles as well as proposed four topics for human-robot interaction evaluation. While the workshop generated tangible contributions to the ASTM Standards meeting and individual research programs, the workshop illustrated the benefits of interacting with more responders, robots, and other researchers than has been previously available to the community.","PeriodicalId":371261,"journal":{"name":"2010 IEEE Safety Security and Rescue Robotics","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-07-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131129897","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Cooperate localization of a Wireless Sensor Network (WSN) aided by a mobile robot
Pub Date: 2010-07-26 | DOI: 10.1109/SSRR.2010.5981558
Hai Dan, Zhang Hui, Xiao Junhao, Zheng Zhiqiang
Localization is an essential problem in Wireless Sensor Network (WSN) applications. We introduce a mobile robot that localizes the sensor nodes cooperatively. Specifically, we assume that the sensors provide range measurements among themselves. The localization strategy combines centralized and distributed processing, reflecting the characteristics of the robot and the WSN. The algorithm efficiently fuses the system's multiple measurements to form conservative covariance estimates and properly avoids overconfidence. The algorithm is described in a Bayesian framework and shown to be effective in simulation.
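The abstract notes that the fusion yields conservative covariance estimates to avoid overconfidence, but it does not name the fusion rule. Covariance Intersection is one standard technique with exactly this property; the sketch below is a minimal illustration under that assumption and is not claimed to be the authors' algorithm.

# Hypothetical sketch: Covariance Intersection (CI), one standard way to fuse two
# estimates with unknown cross-correlation while keeping the fused covariance
# conservative.  The abstract does not state that CI is used; this is an assumption.
import numpy as np
from scipy.optimize import minimize_scalar

def covariance_intersection(x_a, P_a, x_b, P_b):
    # Fuse (x_a, P_a) and (x_b, P_b); the result stays consistent for any
    # (unknown) correlation between the two inputs.
    Pa_inv, Pb_inv = np.linalg.inv(P_a), np.linalg.inv(P_b)

    def fused_trace(w):
        return np.trace(np.linalg.inv(w * Pa_inv + (1.0 - w) * Pb_inv))

    # Pick the weight that minimizes the trace of the fused covariance.
    w = minimize_scalar(fused_trace, bounds=(0.0, 1.0), method="bounded").x
    P = np.linalg.inv(w * Pa_inv + (1.0 - w) * Pb_inv)
    x = P @ (w * Pa_inv @ x_a + (1.0 - w) * Pb_inv @ x_b)
    return x, P

# Example: fuse a robot-derived node position with the node's own range-based estimate.
x_robot, P_robot = np.array([2.0, 3.1]), np.diag([0.20, 0.40])
x_node,  P_node  = np.array([2.3, 2.9]), np.diag([0.50, 0.15])
x_fused, P_fused = covariance_intersection(x_robot, P_robot, x_node, P_node)
print("fused position:", np.round(x_fused, 2))
print("fused covariance:\n", np.round(P_fused, 3))
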
{"title":"Cooperate localization of a Wireless Sensor Network (WSN) aided by a mobile robot","authors":"Hai Dan, Zhang Hui, Xiao Junhao, Zheng Zhiqiang","doi":"10.1109/SSRR.2010.5981558","DOIUrl":"https://doi.org/10.1109/SSRR.2010.5981558","url":null,"abstract":"Localization is an essential problem for application of Wireless Sensor Network (WSN). We introduce a mobile robot to localize the sensor nodes in cooperative way. Specifically, we consider sensors would provide range measurements between themselves. The localization strategy was set to be a mixed form of centralized and distributed manner considering the character of robot and WSN. The algorithm can efficiently fuse multiple measurements of the system to form conservative covariance estimates and avoid over-confident problem properly. The algorithm was described in a Bayes framework and proved to be efficient in simulation result.","PeriodicalId":371261,"journal":{"name":"2010 IEEE Safety Security and Rescue Robotics","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-07-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127318704","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Rescue robots at the Collapse of the municipal archive of Cologne City: A field report
Pub Date: 2010-07-26 | DOI: 10.1109/SSRR.2010.5981550
Thorsten Linder, V. Tretyakov, Sebastian Blumenthal, P. Molitor, D. Holz, Robin R. Murphy, S. Tadokoro, H. Surmann
This paper presents a field report and summarizes the problems encountered when deploying rescue robots after the collapse of the Historical Archive of the City of Cologne. Two robots were on site, ready to be deployed: a shoe-box-sized tracked mobile robot (VGTV Xtreme) and a caterpillar-like system (Active Scope Camera). Due to the special type of collapse and design limitations of the robots, neither robotic system could be deployed: they either could not reach or fit into the voids, or could not be controlled from a safe distance. The problems faced have been analyzed and are described in this paper.
{"title":"Rescue robots at the Collapse of the municipal archive of Cologne City: A field report","authors":"Thorsten Linder, V. Tretyakov, Sebastian Blumenthal, P. Molitor, D. Holz, Robin R. Murphy, S. Tadokoro, H. Surmann","doi":"10.1109/SSRR.2010.5981550","DOIUrl":"https://doi.org/10.1109/SSRR.2010.5981550","url":null,"abstract":"This paper presents a field report and summarizes the problems of the appliance of rescue robots during the Collapse of the Historical Archive of the City of Cologne. Two robots where on the field, ready to be applied: A shoe-box size tracked mobile robot (VGTV Xtreme) and a caterpillar like system (Active Scope Camera). Due to the special type of collapse and design limitations of the robots, both robotic systems could not be applied. Either they could not reach/fit into voids or could not be controlled from a safe distance. The problems faced have been analyzed and are described in this paper.","PeriodicalId":371261,"journal":{"name":"2010 IEEE Safety Security and Rescue Robotics","volume":"81 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-07-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125692902","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Sea demining with autonomous marine surface vehicles
Pub Date: 2010-07-26 | DOI: 10.1109/SSRR.2010.5981553
Fernando J. Pereda, Héctor Garcia de Marina, J. Jiménez, J. Girón-Sierra
A sea demining method using autonomous marine surface vehicles (AMSV) is introduced. The method involves the development of several copies of these vehicles and of procedures for area scanning and coverage. The demining is performed by field influence, towing a submerged “fish”. The study is carried out with both simulations and scale experiments.
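The abstract mentions procedures for area scanning and coverage without specifying the pattern. A boustrophedon ("lawnmower") sweep is a common choice for such surveys; the hypothetical sketch below generates waypoints for one, with the swath width and area bounds chosen arbitrarily rather than taken from the paper.

# Hypothetical sketch: a boustrophedon ("lawnmower") waypoint generator, one
# common way to plan area-coverage scanning for surface vehicles.  Swath width
# and bounds are assumptions, not the authors' parameters.
def lawnmower_waypoints(x_min, x_max, y_min, y_max, swath):
    # Return a back-and-forth list of (x, y) waypoints covering the rectangle.
    waypoints = []
    y, going_right = y_min, True
    while y <= y_max:
        xs = (x_min, x_max) if going_right else (x_max, x_min)
        waypoints.append((xs[0], y))
        waypoints.append((xs[1], y))
        going_right = not going_right
        y += swath
    return waypoints

# Example: cover a 100 m x 60 m area with a 10 m effective swath.
for wp in lawnmower_waypoints(0, 100, 0, 60, 10):
    print(wp)
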
{"title":"Sea demining with autonomous marine surface vehicles","authors":"Fernando J. Pereda, Héctor Garcia de Marina, J. Jiménez, J. Girón-Sierra","doi":"10.1109/SSRR.2010.5981553","DOIUrl":"https://doi.org/10.1109/SSRR.2010.5981553","url":null,"abstract":"A sea demining method using autonomous marine surface vehicles (AMSV) is introduced. The method involves the development of copies of these vehicles, and the procedures for area scanning and coverage. The demining is made by field influence, towing a submerged “fish”. This study is made with both simulations and scale experiments.","PeriodicalId":371261,"journal":{"name":"2010 IEEE Safety Security and Rescue Robotics","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-07-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121735820","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Survivor search with autonomous UGVs using multimodal overt attention
Pub Date: 2010-07-26 | DOI: 10.1109/SSRR.2010.5981566
M. Zaheer Aziz, B. Mertsching
Rescue robotics is gaining increasing attention as an application of human-assistance machines. Robots (or unmanned ground vehicles) searching for victims in disaster sites require a high level of reliability and robustness. This paper proposes a mobile vision system able to search for victims by integrating video and thermal camera inputs. A time-efficient scanning strategy is designed for the pan-tilt camera head, together with a biologically inspired robust search mechanism, to improve the performance of the overall system. Results show that the proposed methodology enhances robustness in autonomous survivor detection.
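The paper's biologically inspired attention mechanism is not detailed in the abstract. As a hedged illustration of how video and thermal cues might be combined into a single attention map, the sketch below blends a thermal "body temperature" cue with a crude visual contrast cue; the weights, temperature range, and contrast measure are assumptions.

# Hypothetical sketch of multimodal saliency fusion: combine a thermal cue with a
# simple visual contrast cue into one attention map.  This is not the authors'
# biologically inspired model; all parameters below are assumptions.
import numpy as np

def thermal_saliency(thermal, t_low=30.0, t_high=40.0):
    # Map temperatures (deg C) linearly to [0, 1] between t_low and t_high,
    # so human-body temperatures score high.
    return np.clip((thermal - t_low) / (t_high - t_low), 0.0, 1.0)

def visual_saliency(gray, win=7):
    # Crude center-surround contrast: absolute difference from a local box mean.
    gray = gray.astype(float)
    pad = win // 2
    padded = np.pad(gray, pad, mode="edge")
    local_mean = np.zeros_like(gray)
    for dy in range(win):
        for dx in range(win):
            local_mean += padded[dy:dy + gray.shape[0], dx:dx + gray.shape[1]]
    local_mean /= win * win
    contrast = np.abs(gray - local_mean)
    return contrast / (contrast.max() + 1e-9)

def fuse(thermal, gray, w_thermal=0.6):
    # Weighted combination of the two cues; candidate victim regions are the peaks.
    return w_thermal * thermal_saliency(thermal) + (1 - w_thermal) * visual_saliency(gray)

# Example on synthetic 64x64 frames containing one warm, bright blob.
t = np.full((64, 64), 20.0); t[20:30, 20:30] = 36.0
g = np.full((64, 64), 50.0); g[20:30, 20:30] = 200.0
attention = fuse(t, g)
print("most salient pixel:", np.unravel_index(np.argmax(attention), attention.shape))
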
{"title":"Survivor search with autonomous UGVs using multimodal overt attention","authors":"M. Zaheer Aziz, B. Mertsching","doi":"10.1109/SSRR.2010.5981566","DOIUrl":"https://doi.org/10.1109/SSRR.2010.5981566","url":null,"abstract":"Rescue robotics is gaining increasing attention as an application of human assistance machines. Robots (or unmanned ground vehicles) for searching victims in disaster sites require a high level of reliability and robustness. This paper proposes a mobile vision system able to search for victims using integration of video and thermal camera inputs. A time efficient scanning strategy is designed for the pan-tilt camera head with a biologically inspired robust search mechanism to improve the performance of the overall system. Results show success of the proposed methodology in enhancing robustness in autonomous survivor detection.","PeriodicalId":371261,"journal":{"name":"2010 IEEE Safety Security and Rescue Robotics","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-07-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125141096","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Cooperative reconfiguration between two specific configurations for a shape-shifting robot
Pub Date: 2010-07-26 | DOI: 10.1109/SSRR.2010.5981572
Bin Li, Shugen Ma, Tonglin Liu, Minghui Wang
A shape-shifting mobile robot named AMOEBA-I has diverse configurations. Cooperative reconfiguration is presented to improve the robot's reconfigurability and adaptability in unstructured environments. The cooperative reconfiguration method is analyzed theoretically: a corresponding mathematical model is established, and the kinematic relations among the three modules during cooperative reconfiguration are determined. The transformation is implemented between two specific configurations. An evaluation of AMOEBA-I's cooperative reconfiguration performance is then proposed. The features of the cooperative reconfiguration methods are compared through theoretical analysis and experiments. Experimental results demonstrate the validity of the cooperative reconfiguration methods on various types of ground.
{"title":"Cooperative reconfiguration between two specific configurations for a shape-shifting robot","authors":"Bin Li, Shugen Ma, Tonglin Liu, Minghui Wang","doi":"10.1109/SSRR.2010.5981572","DOIUrl":"https://doi.org/10.1109/SSRR.2010.5981572","url":null,"abstract":"A shape-shifting mobile robot named AMOEBA-I has diverse configurations. Cooperative reconfiguration is presented to improve the robot's reconfigurable ability and adaptability in unstructured environments. Cooperative reconfiguration method is analyzed theoretically. A mathematical model is established correspondingly and the kinematical relations among the three modules during the cooperative reconfiguration are determined. The transformation is implemented between two specific configurations. Then, an evaluation is proposed for AMOEBA-I's cooperative reconfiguration performance. The feature of cooperative reconfiguration methods is compared through the theoretical analysis and experiments. Experimental results prove the validity of the cooperative reconfiguration methods on various grounds.","PeriodicalId":371261,"journal":{"name":"2010 IEEE Safety Security and Rescue Robotics","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-07-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128722575","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Real-time 3D mapping of rough terrain: A field report from Disaster City
Pub Date: 2010-07-26 | DOI: 10.1109/SSRR.2010.5981567
J. Pellenz, D. Lang, F. Neuhaus, D. Paulus
Mobile systems for mapping and terrain classification are often tested on datasets of intact environments only; the behavior of the algorithms in unstructured environments is mostly unknown. In safety, security and rescue environments, robots have to handle much rougher terrain. Therefore, there is a need for 3D test data that also covers disaster scenarios. During the Response Robot Evaluation Exercise in March 2010 in Disaster City, College Station, Texas (USA), a comprehensive dataset was recorded containing the data of a 3D laser range finder, a GPS receiver, an IMU and a color camera. We tested our algorithms (for terrain classification and 3D mapping) on the dataset, and we will make the data available to give other researchers the chance to do the same. We believe that this data, captured at a well-known location, provides a valuable dataset for the USAR robotics community and increases the chances of obtaining comparable results.
{"title":"Real-time 3D mapping of rough terrain: A field report from Disaster City","authors":"J. Pellenz, D. Lang, F. Neuhaus, D. Paulus","doi":"10.1109/SSRR.2010.5981567","DOIUrl":"https://doi.org/10.1109/SSRR.2010.5981567","url":null,"abstract":"Mobile systems for mapping and terrain classification are often tested on datasets of intact environments only. The behavior of the algorithms in unstructured environments is mostly unknown. In safety, security and rescue environments, the robots have to handle much rougher terrain. Therefore, there is a need for 3D test data that also contains disaster scenarios. During the Response Robot Evaluation Exercise in March 2010 in Disaster City, College Station, Texas (USA), a comprehensive dataset was recorded containing the data of a 3D laser range finder, a GPS receiver, an IMU and a color camera. We tested our algorithms (for terrain classification and 3D mapping) with the dataset, and will make the data available to give other researchers the chance to do the same. We believe that this captured data of this well known location provides a valuable dataset for the USAR robotics community, increasing chances of getting more comparable results.","PeriodicalId":371261,"journal":{"name":"2010 IEEE Safety Security and Rescue Robotics","volume":"118 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-07-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123469738","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Teleoperation system using past image records considering moving objects
Pub Date: 2010-07-26 | DOI: 10.1109/SSRR.2010.5981555
N. Sato, Takahiro Inagaki, F. Matsuno
In this paper, a teleoperation system using past image records (SPIR) that accounts for moving objects is proposed. SPIR virtually generates a scene viewed from a backward-tracking viewpoint by overlaying a CG model of the robot at its corresponding position on a background image that was captured earlier by the camera mounted on the robot. However, moving objects cannot be displayed at their correct positions, because the existing SPIR uses past static images as the background. In this study, to reflect moving objects, we improve the system in three ways. The first is the detection of moving objects, for which we incorporate a particle filter algorithm into the system. The second is the elimination of moving objects from the background image. The third is the overlaying of moving objects at their correct positions on the background image. Furthermore, we conduct a verification experiment on the proposed system.
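The abstract states that a particle filter is used for detecting moving objects but gives no details. The sketch below shows a generic bootstrap particle filter tracking one object's image position; the state model, noise levels, and likelihood are assumptions rather than the authors' implementation.

# Hypothetical sketch (not the authors' implementation): a bootstrap particle
# filter tracking one moving object's image position (u, v).  The motion model,
# noise levels and Gaussian likelihood are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, motion_std=3.0):
    # Random-walk motion model: particles diffuse with Gaussian noise (pixels).
    return particles + rng.normal(0.0, motion_std, particles.shape)

def update(particles, measurement, meas_std=5.0):
    # Weight each particle by a Gaussian likelihood of the detected (u, v) position.
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = np.exp(-0.5 * d2 / meas_std ** 2) + 1e-12
    return weights / weights.sum()

def resample(particles, weights):
    # Multinomial resampling proportional to the weights.
    return particles[rng.choice(len(particles), size=len(particles), p=weights)]

# Example: track an object drifting diagonally across the image.
true_pos = np.array([50.0, 50.0])
particles = true_pos + rng.normal(0.0, 30.0, size=(500, 2))
for _ in range(20):
    true_pos += np.array([5.0, 3.0])                 # object motion
    detection = true_pos + rng.normal(0.0, 4.0, 2)   # noisy detection of the object
    particles = predict(particles)
    weights = update(particles, detection)
    particles = resample(particles, weights)
print("estimate:", np.round(particles.mean(axis=0), 1), "true:", np.round(true_pos, 1))
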
{"title":"Teleoperation system using past image records considering moving objects","authors":"N. Sato, Takahiro Inagaki, F. Matsuno","doi":"10.1109/SSRR.2010.5981555","DOIUrl":"https://doi.org/10.1109/SSRR.2010.5981555","url":null,"abstract":"In this paper, teleoperation system using past image records (SPIR) considering moving objects is proposed. SPIR virtually generates the scene looked from backward-tracking viewpoint by overlaying the CG model of the robot at the corresponding position on the background image which is got from the camera mounted on the robot at past time. However, moving objects cannot display correct positions, because the existing SPIR uses past static images as a background. In this study, to reflect moving objects, we improve three methods in the system. The first method is detection of moving objects. We install the particle filter algorithm in the system. The second method is elimination of moving objects on the background image. The third is overlaying moving objects at the correct position on the background image. Furthermore we have a verification experiment for the proposed system.","PeriodicalId":371261,"journal":{"name":"2010 IEEE Safety Security and Rescue Robotics","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-07-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116488239","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
RecoNode: A reconfigurable node for heterogeneous multi-robot search and rescue
Pub Date: 2010-07-26 | DOI: 10.1109/SSRR.2010.5981569
R. Voyles, Sam Povilus, R. Mangharam, Kang Li
Search and rescue robots can benefit from small size as it facilitates access to voids and movement in cramped quarters. Yet, small robots cannot be the entire answer for urban search and rescue because small size limits the size of actuators, sensor payloads, computational capacity and battery capacity. Nonetheless, we are attempting to alleviate these limitations by developing the hardware and software infrastructure for heterogeneous, wireless sensor/actuator/control networks that is well-suited to miniature search and rescue robots, as well as a host of other relevant applications. These networks of application-specific sensors, actuators and intelligence will be assembled from a backbone that includes scalable computing, a flexible I/O bus, and multi-hop data networking. But two things will ultimately give our wireless infrastructure its novelty: a dual-baseband communications layer and the embedded virtual machine. The dual-baseband communications layer augments the standard data communication layer with a secondary, sub-millisecond synchronization layer to permit high-fidelity, deterministic, distributed control across the network. The determinism of this dual-baseband communications layer, in turn, enables the creation of the embedded virtual machine, which is a programming construct that abstracts away the physical sensor/actuator/control nodes. With this infrastructure, programming will not be done at the node level, as in conventional wireless sensor networks. Instead, programming will be done at the task level with port-based objects distributed across physical resources as necessary at either compile-time or run-time. At compile-time, the system can assist in the specification of the physical network, while at run-time the system can react to changes in configuration, such as nodes exhausting their batteries or losing connectivity. This paper describes progress to date on developing this scalable infrastructure, specifically the RecoNode high-performance, dynamically-reconfigurable computational node for the Terminator-Bot crawling robot and the FireFly mid-performance node, as well as their real-time software.
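The port-based-object idea described above can be made concrete with a small sketch: tasks declare named input and output ports, and a scheduler wires them together, so the same task graph could in principle be mapped onto different physical nodes. The class names and port names below are illustrative assumptions, not the authors' embedded-virtual-machine runtime.

# Hypothetical sketch of the port-based-object idea: tasks read named input
# ports, write named output ports, and a tiny scheduler wires them together.
# This is an illustration only, not the authors' EVM runtime.
class PortBasedObject:
    def __init__(self, name, inputs, outputs, step):
        # 'step' maps a dict of input-port values to a dict of output-port values.
        self.name, self.inputs, self.outputs, self.step = name, inputs, outputs, step

class Blackboard:
    # Shared port table standing in for the network transport between physical nodes.
    def __init__(self):
        self.ports = {}

    def run_cycle(self, tasks):
        # Run each task whose input ports are all available, in list order.
        for task in tasks:
            if all(p in self.ports for p in task.inputs):
                self.ports.update(task.step({p: self.ports[p] for p in task.inputs}))

# Example: a range-sensor task feeding a simple stop/go controller task.
sensor = PortBasedObject("range_sensor", [], ["range_m"],
                         lambda _inputs: {"range_m": 0.4})
controller = PortBasedObject("motion_ctrl", ["range_m"], ["wheel_cmd"],
                             lambda inp: {"wheel_cmd": 0.0 if inp["range_m"] < 0.5 else 1.0})

board = Blackboard()
board.run_cycle([sensor, controller])
print(board.ports)   # {'range_m': 0.4, 'wheel_cmd': 0.0}
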
{"title":"RecoNode: A reconfigurable node for heterogeneous multi-robot search and rescue","authors":"R. Voyles, Sam Povilus, R. Mangharam, Kang Li","doi":"10.1109/SSRR.2010.5981569","DOIUrl":"https://doi.org/10.1109/SSRR.2010.5981569","url":null,"abstract":"Search and rescue robots can benefit from small size as it facilitates access to voids and movement in cramped quarters. Yet, small robots cannot be the entire answer for urban search and rescue because small size limits the size of actuators, sensor payloads, computational capacity and battery capacity. Nonetheless, we are attempting to alleviate these limitations by developing the hardware and software infrastructure for heterogeneous, wireless sensor/actuator/control networks that is well-suited to miniature search and rescue robots, as well as a host of other relevant applications. These networks of application-specific sensors, actuators and intelligence will be assembled from a backbone that includes scalable computing, a flexible I/O bus, and multi-hop data networking. But two things will ultimately give our wireless infrastructure its novelty: a dual-baseband communications layer and the embedded virtual machine. The dual-baseband communications layer augments the standard data communication layer with a secondary, sub-millisecond synchronization layer to permit high-fidelity, deterministic, distributed control across the network. The determinism of this dual-baseband communications layer, in turn, enables the creation of the embedded virtual machine, which is a programming construct that abstracts away the physical sensor/actuator/control nodes. With this infrastructure, programming will not be done at the node level, as in conventional wireless sensor networks. Instead, programming will be done at the task level with port-based objects distributed across physical resources as necessary at either compile-time or run-time. At compile-time, the system can assist in the specification of the physical network, while at run-time the system can react to changes in configuration, such as nodes exhausting their batteries or losing connectivity. This paper describes progress to-date on developing this scalable infrastructure, specifically the RecoNode high-performance, dynamically-reconfigurable computational node for the Terminator-Bot crawling robot and the FireFly mid-performance node, as well as their real-time software.","PeriodicalId":371261,"journal":{"name":"2010 IEEE Safety Security and Rescue Robotics","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-07-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114821610","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}