Texture Discrimination using a Soft Biomimetic Finger for Prosthetic Applications
D. Balamurugan, Andrei Nakagawa Silva, Harrison H. Nguyen, J. Low, Christopher Shallal, Luke E. Osborn, A. Soares, R. C. Yeow, N. Thakor
2019 IEEE 16th International Conference on Rehabilitation Robotics (ICORR), June 2019. DOI: 10.1109/ICORR.2019.8779442
Abstract
Soft robotic fingers have shown great potential for use in prostheses due to their inherently compliant, light, and dexterous nature. Recent advances in sensor technology for soft robotic systems showcase their ability to perceive and respond to static cues. However, most soft fingers intended for prosthetic applications are not equipped with sensors capable of perceiving texture the way humans can. In this work, we present a dexterous, soft, biomimetic finger capable of discriminating textures. We fabricated a soft finger with two individually controllable degrees of freedom and a tactile sensor embedded at the fingertip. As texture plates were palpated, the output of the tactile sensor was converted into spikes, mimicking the behavior of a biological mechanoreceptor. We explored the spatial properties of the textures captured in the spiking patterns by generating spatial event plots and analyzing the similarity between the spike trains generated for each texture. Unique features representative of the different textures were then extracted from the spikes and input to a classifier. The textures were successfully classified with an accuracy of 94% when palpating at a rate of 42 mm/s. This work demonstrates the potential of providing amputees with a soft finger with sensing capabilities, which could help discriminate between different objects and surfaces through palpation during activities of daily living (ADL).
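To make the pipeline described in the abstract concrete, the sketch below illustrates the general idea of spike-based texture classification: an analog tactile trace is converted into spike times with a neuron model, simple spike-train features are extracted, and a classifier is trained on them. This is not the authors' implementation; the leaky integrate-and-fire encoder, the firing-rate/inter-spike-interval features, the SVM classifier, and the synthetic palpation traces are all illustrative assumptions standing in for the paper's sensor data and feature set.

```python
# Minimal sketch (assumptions, not the paper's method): LIF spike encoding of a
# tactile trace, simple spike-train features, and an SVM texture classifier.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def lif_encode(signal, dt=1e-3, tau=0.02, threshold=1.0, gain=50.0):
    """Convert an analog sensor trace into spike times with a leaky
    integrate-and-fire neuron (illustrative stand-in for the paper's
    mechanoreceptor-like encoding)."""
    v, spikes = 0.0, []
    for i, x in enumerate(signal):
        v += dt * (-v / tau + gain * x)   # leaky integration of the input
        if v >= threshold:                # fire and reset
            spikes.append(i * dt)
            v = 0.0
    return np.asarray(spikes)

def spike_features(spike_times, duration):
    """Illustrative features: firing rate plus inter-spike-interval mean/std."""
    if len(spike_times) < 2:
        return np.array([len(spike_times) / duration, 0.0, 0.0])
    isi = np.diff(spike_times)
    return np.array([len(spike_times) / duration, isi.mean(), isi.std()])

# Synthetic stand-in for palpation trials: each "texture" modulates the sensor
# with a different spatial frequency, mimicking sliding over texture plates.
rng = np.random.default_rng(0)
dt, duration = 1e-3, 2.0
t = np.arange(0.0, duration, dt)
X, y = [], []
for label, freq in enumerate([5.0, 12.0, 25.0]):      # three hypothetical textures
    for _ in range(30):                                # 30 trials per texture
        trace = 0.5 + 0.5 * np.sin(2 * np.pi * freq * t) \
                + 0.1 * rng.standard_normal(t.size)
        X.append(spike_features(lif_encode(trace, dt=dt), duration))
        y.append(label)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, np.array(X), np.array(y), cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

In the paper, the spike trains are additionally analyzed via spatial event plots and spike-train similarity before classification; the sketch above only captures the encode-extract-classify backbone of such a system.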