Vision-based tactile sensor design using physically based rendering
Arpit Agarwal, Achu Wilson, Timothy Man, Edward Adelson, Ioannis Gkioulekas, Wenzhen Yuan
Communications Engineering 4(1): 21, published 2025-02-14
DOI: 10.1038/s44172-025-00350-4
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11828998/pdf/
Abstract
High-resolution tactile sensors help robots perform fine-grained perception and manipulation tasks, but designing such sensors is challenging. The designs rely on the compact integration of multiple optical elements, and the relationship between element arrangement and sensor accuracy is difficult to establish by trial and error. In this work, we introduce a digital design framework for vision-based tactile sensors built on a physically accurate light simulator. The framework modularizes the design process, parameterizes the sensor components, and includes an evaluation metric that quantifies a sensor's performance. Using this metric, we quantify the effects of sensor shape, illumination settings, and sensing-surface material on tactile sensor performance. The proposed optical simulation framework replicates the tactile image of a real vision-based tactile sensor prototype without any prior sensor-specific data. Using our approach, we substantially improve the design of a fingertip GelSight sensor; the improved design performs approximately five times better than the previous state-of-the-art human-expert design at real-world robotic tactile embossed-text detection. Our simulation approach can be used with any vision-based tactile sensor to produce a physically accurate tactile image. Overall, our approach enables the automatic design of sensorized soft robots and opens the door to closed-loop co-optimization of controllers and sensors for dexterous manipulation.
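The workflow the abstract describes is essentially parameterize, render, score. The sketch below is a minimal, hypothetical Python illustration of that loop and is not the authors' code: SensorDesign, render_tactile_image, and design_score are invented names, the parameters (gel curvature, LED elevation angles, surface albedo) are stand-ins for the shape, illumination, and material variables mentioned in the abstract, and the toy renderer substitutes for the physically accurate light simulator used in the paper.

```python
# Hypothetical sketch of a parameterize-render-score design loop for a
# vision-based tactile sensor. All names and units are illustrative.
from dataclasses import dataclass
import numpy as np


@dataclass
class SensorDesign:
    gel_curvature_mm: float      # sensing-surface shape
    led_elevations_deg: tuple    # illumination setting (LED elevation angles)
    surface_albedo: float        # sensing-surface material reflectance, 0..1


def render_tactile_image(design: SensorDesign, indent_depth_mm: float) -> np.ndarray:
    """Toy stand-in for a physically based render of the sensor's camera image.

    The real framework would path-trace the gel, coating, LEDs, and camera;
    here a Gaussian bump shaded by mean LED incidence is used so the sketch runs.
    """
    h, w = 64, 64
    yy, xx = np.mgrid[0:h, 0:w]
    spread = 50.0 + 10.0 * design.gel_curvature_mm
    bump = np.exp(-((xx - w / 2) ** 2 + (yy - h / 2) ** 2) / spread)
    # Lambertian-style shading term from the LED elevation angles.
    shading = np.mean([np.sin(np.deg2rad(a)) for a in design.led_elevations_deg])
    return design.surface_albedo * shading * indent_depth_mm * bump


def design_score(design: SensorDesign) -> float:
    """Simple evaluation metric: mean image change caused by a 1 mm indentation."""
    baseline = render_tactile_image(design, indent_depth_mm=0.0)
    pressed = render_tactile_image(design, indent_depth_mm=1.0)
    return float(np.abs(pressed - baseline).mean())


if __name__ == "__main__":
    # Compare two candidate designs and keep the one whose tactile image
    # responds most strongly to contact (higher score is better).
    candidates = [
        SensorDesign(gel_curvature_mm=0.0, led_elevations_deg=(20, 20, 20), surface_albedo=0.6),
        SensorDesign(gel_curvature_mm=2.0, led_elevations_deg=(45, 45, 45), surface_albedo=0.9),
    ]
    best = max(candidates, key=design_score)
    print("best candidate:", best)
```

In the paper's framework, the scoring step would use the physically based renderer and the proposed evaluation metric rather than the toy contrast measure above; the point of the sketch is only the structure of the search over parameterized designs.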