Qingxiaoyang Zhu, Vittorio Perera, Mirko Wächter, T. Asfour, M. Veloso
{"title":"Autonomous narration of humanoid robot kitchen task experience","authors":"Qingxiaoyang Zhu, Vittorio Perera, Mirko Wächter, T. Asfour, M. Veloso","doi":"10.1109/HUMANOIDS.2017.8246903","DOIUrl":null,"url":null,"abstract":"The progress in humanoid robotics research has led to robots that are able to perform complex tasks with a certain level of autonomy by integrating perception, action, planning, and learning capabilities. However, robot capabilities are still limited in regard to how they externalize their internal state and world state, i.e. their sensorimotor experience, and how they explain which tasks they performed and how they performed these tasks. In other words, their capability in conveying information to the user in a way similar to what humans do is limited. To this end, we present a verbalization system that generates natural language explanations of the robot's past navigation and manipulation experience. We propose a threelayered model to represent robot experience which doubles as a retrievable episodic memory. Through the memory system, the robot can select a matching experience given a user query. In order to generate flexible narrations, we use verbalization parameters to capture user preferences. We show that our verbalization algorithm is capable of producing appropriate results based on these verbalization parameters. The proposed verbalization system is able to generate explanations for navigation as well as grasping and manipulation tasks. 
The resulting system is evaluated in a pick-and-place kitchen scenario.","PeriodicalId":143992,"journal":{"name":"2017 IEEE-RAS 17th International Conference on Humanoid Robotics (Humanoids)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2017-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"11","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 IEEE-RAS 17th International Conference on Humanoid Robotics (Humanoids)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/HUMANOIDS.2017.8246903","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 11
Abstract
The progress in humanoid robotics research has led to robots that are able to perform complex tasks with a certain level of autonomy by integrating perception, action, planning, and learning capabilities. However, robot capabilities are still limited in how they externalize their internal state and world state, i.e. their sensorimotor experience, and in how they explain which tasks they performed and how they performed them. In other words, their capability to convey information to the user in a human-like way is limited. To this end, we present a verbalization system that generates natural language explanations of the robot's past navigation and manipulation experience. We propose a three-layered model to represent robot experience, which doubles as a retrievable episodic memory. Through the memory system, the robot can select a matching experience given a user query. In order to generate flexible narrations, we use verbalization parameters to capture user preferences. We show that our verbalization algorithm is capable of producing appropriate results based on these verbalization parameters. The proposed verbalization system is able to generate explanations for navigation as well as grasping and manipulation tasks. The resulting system is evaluated in a pick-and-place kitchen scenario.
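The abstract describes two mechanisms: a retrievable episodic memory from which the robot selects an experience matching a user query, and verbalization parameters that adjust the narration to user preferences. A minimal sketch of that idea is shown below; all names (`Episode`, `EpisodicMemory`, the `detail` parameter) are illustrative assumptions, not the paper's actual data structures or parameter set.

```python
from dataclasses import dataclass

# Hypothetical sketch only: a flat episode store standing in for the
# paper's three-layered experience model, plus one verbalization
# parameter ("detail") standing in for the paper's parameter set.

@dataclass
class Episode:
    task: str        # task label, e.g. "pick-and-place"
    steps: list      # low-level actions as recorded during execution
    summary: str     # high-level description of the episode

class EpisodicMemory:
    def __init__(self):
        self.episodes = []

    def record(self, episode):
        self.episodes.append(episode)

    def query(self, keyword):
        # Select episodes whose task label matches the user query.
        return [e for e in self.episodes if keyword in e.task]

def verbalize(episode, detail="summary"):
    # `detail` acts as a verbalization parameter: a terse summary
    # versus a narration that includes the recorded action steps.
    if detail == "summary":
        return episode.summary
    return episode.summary + " Steps: " + "; ".join(episode.steps)

memory = EpisodicMemory()
memory.record(Episode(
    task="pick-and-place",
    steps=["navigate to the counter", "grasp the cup",
           "place the cup on the table"],
    summary="I moved a cup from the counter to the table.",
))

match = memory.query("pick")[0]
print(verbalize(match))                      # terse narration
print(verbalize(match, detail="detailed"))   # narration with steps
```

Querying by keyword and switching the `detail` flag mirrors, in miniature, how a user question plus preference parameters select and shape the narration.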