E-BTS: Event-Based Tactile Sensor for Haptic Teleoperation in Augmented Reality
Authors: Dinmukhammed Mukashev; Saltanat Seitzhan; Jabrail Chumakov; Soibkhon Khajikhanov; Madina Yergibay; Nurlan Zhaniyar; Rustam Chibar; Ayan Mazhitov; Matteo Rubagotti; Zhanat Kappassov
DOI: 10.1109/TRO.2024.3502215
Journal: IEEE Transactions on Robotics, vol. 41, pp. 450-463
Publication date: 2024-11-19 (Journal Article)
Impact Factor: 10.5; JCR: Q1 (Robotics); CAS Region 1 (Computer Science)
URL: https://ieeexplore.ieee.org/document/10758224/
Citations: 0
Abstract
The prompt and robust detection of tactile information is a relevant and challenging problem, and considerable research effort is therefore being put into innovative transduction methods for tactile sensors. In this article, we investigate the possibility of using event-based cameras to sense contact forces applied to objects by a robot end effector. The proposed optical tactile sensor incorporates a soft hemispherical pad made of silicone rubber with imprinted markers and a pulse-width modulation (PWM) light source that emits optical pulses, allowing robust detection of the markers to track deformations of the pad. To test the effectiveness of our sensor, experiments were carried out by attaching it to a teleoperated robot arm so that the arm could be finely controlled when out of the user's field of view, as accurately as if the user could see it. In the experiments, an augmented reality display and a haptic device were used to convey the force detected by the event-based tactile sensor back to the human operator. The experiments included a practical application of a soft tissue puncturing tool, and psychophysical test results from ten participants were recorded to validate the efficacy and usability of the system.
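The abstract's sensing pipeline, at a high level, is: accumulate camera events into an image, locate the imprinted markers, and map their displacement under pad deformation to a contact force. The sketch below illustrates that idea in minimal form; it is not the authors' implementation, and the event format, marker-detection scheme, and linear stiffness constant are illustrative assumptions.

```python
import numpy as np


def accumulate_events(events, shape):
    """Accumulate a batch of (x, y) event coordinates into a 2-D count image.

    Assumes a simplified event stream: an iterable of pixel coordinates,
    ignoring timestamps and polarity.
    """
    frame = np.zeros(shape, dtype=np.int32)
    for x, y in events:
        frame[y, x] += 1
    return frame


def marker_centroid(frame, threshold=1):
    """Estimate one marker's position as the count-weighted centroid of active pixels.

    A real sensor would segment and track many markers; a single centroid
    is enough to show the displacement idea.
    """
    ys, xs = np.nonzero(frame >= threshold)
    w = frame[ys, xs].astype(float)
    return np.array([xs @ w, ys @ w]) / w.sum()


def estimate_normal_force(displacement_px, stiffness=0.05):
    """Map marker displacement (pixels) to force (N) via an assumed linear
    stiffness; the constant 0.05 N/px is a placeholder, not a calibrated value."""
    return stiffness * np.linalg.norm(displacement_px)
```

For example, if the marker centroid shifts by three pixels between the rest frame and the contact frame, the assumed stiffness yields an estimated force of 0.15 N. In the actual sensor, this mapping would come from calibration against a reference force sensor rather than a hand-picked constant.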
Journal introduction:
The IEEE Transactions on Robotics (T-RO) is dedicated to publishing fundamental papers covering all facets of robotics, drawing on interdisciplinary approaches from computer science, control systems, electrical engineering, mathematics, mechanical engineering, and beyond. From industrial applications to service and personal assistants, surgical operations to space, underwater, and remote exploration, robots and intelligent machines play pivotal roles across various domains, including entertainment, safety, search and rescue, military applications, agriculture, and intelligent vehicles.
Special emphasis is placed on intelligent machines and systems designed for unstructured environments, where a significant portion of the environment remains unknown and beyond direct sensing or control.