Kelin Li, Shubham M Wagh, Nitish Sharma, Saksham Bhadani, Wei Chen, Chang Liu, Petar Kormushev
{"title":"触觉-ACT:通过沉浸式虚拟现实技术将人类直觉与顺应性机器人操纵结合起来","authors":"Kelin Li, Shubham M Wagh, Nitish Sharma, Saksham Bhadani, Wei Chen, Chang Liu, Petar Kormushev","doi":"arxiv-2409.11925","DOIUrl":null,"url":null,"abstract":"Robotic manipulation is essential for the widespread adoption of robots in\nindustrial and home settings and has long been a focus within the robotics\ncommunity. Advances in artificial intelligence have introduced promising\nlearning-based methods to address this challenge, with imitation learning\nemerging as particularly effective. However, efficiently acquiring high-quality\ndemonstrations remains a challenge. In this work, we introduce an immersive\nVR-based teleoperation setup designed to collect demonstrations from a remote\nhuman user. We also propose an imitation learning framework called Haptic\nAction Chunking with Transformers (Haptic-ACT). To evaluate the platform, we\nconducted a pick-and-place task and collected 50 demonstration episodes.\nResults indicate that the immersive VR platform significantly reduces\ndemonstrator fingertip forces compared to systems without haptic feedback,\nenabling more delicate manipulation. Additionally, evaluations of the\nHaptic-ACT framework in both the MuJoCo simulator and on a real robot\ndemonstrate its effectiveness in teaching robots more compliant manipulation\ncompared to the original ACT. 
Additional materials are available at\nhttps://sites.google.com/view/hapticact.","PeriodicalId":501031,"journal":{"name":"arXiv - CS - Robotics","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Haptic-ACT: Bridging Human Intuition with Compliant Robotic Manipulation via Immersive VR\",\"authors\":\"Kelin Li, Shubham M Wagh, Nitish Sharma, Saksham Bhadani, Wei Chen, Chang Liu, Petar Kormushev\",\"doi\":\"arxiv-2409.11925\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Robotic manipulation is essential for the widespread adoption of robots in\\nindustrial and home settings and has long been a focus within the robotics\\ncommunity. Advances in artificial intelligence have introduced promising\\nlearning-based methods to address this challenge, with imitation learning\\nemerging as particularly effective. However, efficiently acquiring high-quality\\ndemonstrations remains a challenge. In this work, we introduce an immersive\\nVR-based teleoperation setup designed to collect demonstrations from a remote\\nhuman user. We also propose an imitation learning framework called Haptic\\nAction Chunking with Transformers (Haptic-ACT). To evaluate the platform, we\\nconducted a pick-and-place task and collected 50 demonstration episodes.\\nResults indicate that the immersive VR platform significantly reduces\\ndemonstrator fingertip forces compared to systems without haptic feedback,\\nenabling more delicate manipulation. Additionally, evaluations of the\\nHaptic-ACT framework in both the MuJoCo simulator and on a real robot\\ndemonstrate its effectiveness in teaching robots more compliant manipulation\\ncompared to the original ACT. 
Additional materials are available at\\nhttps://sites.google.com/view/hapticact.\",\"PeriodicalId\":501031,\"journal\":{\"name\":\"arXiv - CS - Robotics\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - CS - Robotics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.11925\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Robotics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.11925","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Haptic-ACT: Bridging Human Intuition with Compliant Robotic Manipulation via Immersive VR
Robotic manipulation is essential for the widespread adoption of robots in
industrial and home settings and has long been a focus within the robotics
community. Advances in artificial intelligence have introduced promising
learning-based methods to address this challenge, with imitation learning
emerging as particularly effective. However, efficiently acquiring high-quality
demonstrations remains a challenge. In this work, we introduce an immersive
VR-based teleoperation setup designed to collect demonstrations from a remote
human user. We also propose an imitation learning framework called Haptic
Action Chunking with Transformers (Haptic-ACT). To evaluate the platform, we
conducted a pick-and-place task and collected 50 demonstration episodes.
Results indicate that the immersive VR platform significantly reduces
demonstrator fingertip forces compared to systems without haptic feedback,
enabling more delicate manipulation. Additionally, evaluations of the
Haptic-ACT framework, both in the MuJoCo simulator and on a real robot,
demonstrate its effectiveness in teaching robots more compliant manipulation
compared to the original ACT. Additional materials are available at
https://sites.google.com/view/hapticact.
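The "action chunking" at the core of ACT-style policies can be sketched as follows. This is a hypothetical illustration of the general idea, not the authors' implementation: the policy predicts a chunk of k future actions at each step, and overlapping chunks are blended with exponentially decaying weights ("temporal ensembling") before execution. The function name `temporal_ensemble` and the decay rate `m` are assumptions for the sketch.

```python
# Sketch of ACT-style action chunking with temporal ensembling
# (illustrative only; not the Haptic-ACT authors' code).
import numpy as np

def temporal_ensemble(chunks, t, k, m=0.1):
    """Blend the actions that previously predicted chunks assign to
    timestep t, weighting each by exp(-m * age of the prediction).

    chunks: list of (start_step, chunk) pairs, where chunk is a (k, d)
            array of predicted actions beginning at start_step.
    """
    preds, weights = [], []
    for start, chunk in chunks:
        offset = t - start          # position of timestep t inside this chunk
        if 0 <= offset < k:         # chunk covers timestep t
            preds.append(chunk[offset])
            weights.append(np.exp(-m * (t - start)))  # older => smaller weight
    w = np.array(weights) / np.sum(weights)
    return np.sum(np.array(preds) * w[:, None], axis=0)

# Toy usage: two overlapping 4-step chunks of 2-D actions.
k = 4
chunks = [(0, np.ones((k, 2))), (1, 3 * np.ones((k, 2)))]
a_t = temporal_ensemble(chunks, t=2, k=k)  # a blend of the two predictions
```

Executing a smoothed blend of overlapping predictions, rather than each fresh chunk directly, is what gives chunked policies their characteristic smooth, less jerky motion.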