Authors: Xiaoyue Cao, Chunfang Liu, Xiaoli Li
DOI: 10.1109/ICCSS53909.2021.9721952
Published in: 2021 8th International Conference on Information, Cybernetics, and Computational Social Systems (ICCSS)
Publication date: 2021-12-10
Texture Recognition and Three-Dimensional Force Measurement Using Vision-based Tactile Sensor
Although robotic grippers are now widely used in industry, most still lack the tactile perception needed for dexterous manipulation, such as grasping an unknown object with appropriate force. Hence, to give grippers access to multiple types of tactile information, we combine the gripper with a dual-modal vision-based tactile sensor in our experiment. Unlike existing texture recognition experiments, we build our own texture dataset of 12 kinds of samples using the novel tactile transducer. We then compare K-Nearest Neighbor (KNN) with a Residual Network (ResNet): the experimental results show that the accuracy of KNN is only 66.11%, while the accuracy of ResNet, based on a deep convolutional neural network, reaches 100.00%. In addition, to detect the contact force, we exploit the nonlinear modeling capability of a BP (back-propagation) neural network to establish the mapping between the two-dimensional displacement image of the markers and the three-dimensional (3D) force vector. Experiments demonstrate that the sensor predicts the force within a 4% margin of error.
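The force-measurement idea above, training a BP (back-propagation) network to map 2D marker displacements to a 3D force vector, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the marker count, feature layout, network sizes, and the synthetic displacement-to-force mapping are all assumptions made for the sketch.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for the sensor data: each sample flattens the 2D
# (dx, dy) displacements of 10 tracked markers into a 20-dim feature vector.
X = rng.normal(size=(500, 20))

# Hypothetical ground-truth mapping to a 3D force (Fx, Fy, Fz), with a mild
# nonlinearity that the network must learn; the real sensor's mapping would
# come from calibration data collected against a force/torque reference.
W = rng.normal(size=(20, 3))
y = np.tanh(X @ W)

# A BP neural network is a small multilayer perceptron trained with
# gradient descent; MLPRegressor handles the 3-output regression directly.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(X, y)

pred = model.predict(X[:5])  # each row is a predicted 3D force vector
print(pred.shape)
```

In practice the input would be displacement fields extracted from the tactile camera images (e.g. by tracking marker centroids between frames), and accuracy would be reported on held-out calibration samples rather than on the training set.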