Pub Date : 2024-09-27 DOI: 10.1109/TOH.2024.3449411
Peiliang Wu, Haozhe Zhang, Yao Li, Wenbai Chen, Guowei Gao
Current issues with neuromorphic visual-tactile perception include the limited representational power of the training network and inadequate cross-modal fusion. To address these two issues, we propose a dual network, the visual-tactile spiking graph neural network (VT-SGN), that combines graph neural networks and spiking neural networks to jointly exploit neuromorphic visual and tactile source data. First, the neuromorphic visual-tactile data are expanded spatiotemporally to create a taxel-based tactile graph in the spatial domain, enabling full exploitation of the irregular spatial structure of tactile information. Subsequently, a method for converting images into graph structures is proposed, allowing the visual stream to be trained with a graph neural network and graph-level visual features to be extracted for fusion with the tactile data. Finally, the data are expanded into the time domain using a spiking neural network to train the model with backpropagation. This framework exploits the structural differences between sample instances in the spatial dimension to improve the representational power of spiking neurons, while preserving the biodynamic mechanism of the spiking neural network. Additionally, it resolves the morphological disparity between the two modalities and further leverages the complementary information between vision and touch. To demonstrate that our approach improves the learning of neuromorphic perceptual information, we conducted comprehensive comparative experiments on three datasets, validating the benefits of the proposed VT-SGN framework against state-of-the-art studies.
VT-SGN: Spiking Graph Neural Network for Neuromorphic Visual-Tactile Fusion. IEEE Transactions on Haptics.
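As a purely illustrative sketch (not the authors' code), the taxel-based graph construction the abstract describes can be approximated by connecting taxels that lie within a chosen distance of one another; the taxel layout and connection radius below are assumptions for demonstration.

```python
import math

def build_taxel_graph(coords, radius):
    """Return undirected edges (i, j) between taxels closer than `radius`."""
    edges = []
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            # Connect taxels whose Euclidean distance is within the radius.
            if math.dist(coords[i], coords[j]) <= radius:
                edges.append((i, j))
    return edges

# A 2x2 patch of taxels: every pair lies within radius 1.5,
# so the graph is fully connected (6 edges).
patch = [(0, 0), (1, 0), (0, 1), (1, 1)]
print(len(build_taxel_graph(patch, radius=1.5)))  # 6
```

An irregular (non-grid) taxel layout works the same way, which is the point of using a graph rather than an image-like tensor for tactile data.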
Pub Date : 2024-09-03 DOI: 10.1109/TOH.2024.3454179
Ge Shi, Jialei Shi, Azadeh Shariati, Kamyar Motaghedolhagh, Shervanthi Homer-Vanniasinkam, Helge Wurdemann
Numerous studies have indicated that a closed-loop haptic feedback system offering various mechano-tactile stimuli patterns with different actuation methods can improve the performance and grasp control of prosthetic hands. Purely mechanically driven feedback approaches for generating such stimuli patterns, however, have not been explored. In this paper, a multi-cavity fluidic haptic feedback system is introduced, with details of its design, fabrication, and validation. The system detects directional physical touch at the fingertip sensor; the direction of the force is reflected as a pressure deviation across the cavities of the multi-cavity fingertip sensor. The feedback actuator generates distinct mechano-tactile stimuli patterns according to the pressure deviation from the fingertip sensor, so users can identify the force direction from the stimuli patterns. The haptic feedback system is validated through two experiments. The first characterises the system and establishes the relationship between the fingertip sensor and feedback actuator. The second, a human interaction test, confirms the system's capability to detect directional forces and generate corresponding tactile stimuli in the feedback actuator. The outcomes corroborate the idea that participants are generally capable of discerning changes in angle.
Design and Characterisation of Multi-cavity, Fluidic Haptic Feedback System for Mechano-tactile Feedback.
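A hypothetical sketch (not the paper's method) of how a contact direction could be recovered from per-cavity pressure deviations, assuming three cavities arranged at equal angles around the fingertip; the cavity angles and readings are invented for illustration.

```python
import math

def contact_angle(deviations, cavity_angles_deg):
    """Estimate contact direction (degrees) by vector-summing each cavity's
    unit direction weighted by its pressure deviation."""
    x = sum(d * math.cos(math.radians(a))
            for d, a in zip(deviations, cavity_angles_deg))
    y = sum(d * math.sin(math.radians(a))
            for d, a in zip(deviations, cavity_angles_deg))
    return math.degrees(math.atan2(y, x)) % 360

# Three cavities at 0, 120, and 240 degrees; a press toward the first
# cavity raises its pressure most, so the estimate lands near 0 degrees.
print(round(contact_angle([2.0, 0.5, 0.5], [0, 120, 240])))  # 0
```

The feedback actuator could then select a stimulus pattern from the estimated angle; how the real system maps deviations to patterns is described only qualitatively in the abstract.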
Pub Date : 2024-09-03 DOI: 10.1109/TOH.2024.3452102
Takuya Jodai, Lynette A Jones, Masahiko Terao, Hsin-Ni Ho
In cutaneous displays that present both tactile and thermal signals, it is important to understand the temporal requirements for presenting these signals so that they are perceived as synchronous. Such synchrony is important for providing realistic touch experiences in applications involving object recognition and social touch interactions. In the present experiment, we determined the temporal window within which tactile and warm thermal stimuli are perceived to occur at the same time. A Simultaneity Judgment Task was used in which pairs of tactile and thermal stimuli were presented on the hand at varying stimulus onset asynchronies, and participants judged whether the stimuli were simultaneous. The results indicated that the average simultaneity window width was 1041 ms. The average point of subjective simultaneity (PSS) was -569 ms, indicating that participants perceived simultaneity best when the warm thermal stimulus preceded the tactile stimulus by 569 ms. These findings indicate that thermal and tactile stimuli do not need to be displayed simultaneously to be perceived as synchronous, and therefore the timing of such stimuli can be adjusted to maximize the likelihood that both will be perceived.
Perceiving Synchrony: Determining Thermal-tactile Simultaneity Windows.
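PSS and window-width estimates of the kind reported above can be illustrated with a toy analysis; this is not the study's pipeline, and the SOA values (negative means the thermal stimulus leads) and response proportions below are invented.

```python
def pss_and_window(soas_ms, p_simultaneous, threshold=0.5):
    """PSS = proportion-weighted mean SOA; window width = span of SOAs
    whose 'simultaneous' response rate meets the threshold."""
    total = sum(p_simultaneous)
    pss = sum(s * p for s, p in zip(soas_ms, p_simultaneous)) / total
    above = [s for s, p in zip(soas_ms, p_simultaneous) if p >= threshold]
    width = max(above) - min(above) if above else 0.0
    return pss, width

soas = [-1200, -900, -600, -300, 0, 300]        # thermal-leading SOAs (ms)
props = [0.2, 0.6, 0.9, 0.7, 0.4, 0.1]          # proportion judged simultaneous
pss, width = pss_and_window(soas, props)
print(width)  # 600
```

With these toy numbers the PSS comes out negative (around -500 ms), mirroring the finding that simultaneity is best perceived when the warm stimulus leads.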
Pub Date : 2024-09-03 DOI: 10.1109/TOH.2024.3453894
Ruben Martin-Rodriguez, Alexandre L Ratschat, Laura Marchal-Crespo, Yasemin Vardar
Haptic rendering of weight plays an essential role in naturalistic object interaction in virtual environments. While kinesthetic devices have traditionally been used for this aim by applying forces on the limbs, tactile interfaces acting on the skin have recently offered potential solutions to enhance or substitute kinesthetic ones. Here, we aim to provide an in-depth overview and comparison of existing tactile weight rendering approaches. We categorized these approaches by type of stimulation into asymmetric vibration and skin stretch, further subdivided according to the working mechanism of the devices. Then, we compared these approaches using various criteria, including the physical, mechanical, and perceptual characteristics of the reported devices. We found that asymmetric vibration devices have the smallest form factor, while skin stretch devices relying on the motion of flat surfaces, belts, or tactors present numerous mechanical and perceptual advantages for scenarios requiring more accurate weight rendering. Finally, we discussed the rationale for the proposed categorization together with limitations and opportunities for future research.
Tactile Weight Rendering: A Review for Researchers and Developers.
Pub Date : 2024-08-15 DOI: 10.1109/TOH.2024.3441670
Easa AliAbbasi, Muhammad Muzammil, Omer Sirin, Philippe Lefevre, Orjan Grottem Martinsen, Cagatay Basdogan
We investigate the effect of finger moisture on the tactile perception of electroadhesion with 10 participants. Participants with moist fingers exhibited markedly higher threshold levels. Our electrical impedance measurements show a substantial reduction in impedance magnitude when sweat is present at the finger-touchscreen interface, indicating increased conductivity. Supporting this, our mechanical friction measurements show that the relative increase in electrostatic force due to electroadhesion is lower for a moist finger.
Effect of Finger Moisture on Tactile Perception of Electroadhesion.
Pub Date : 2024-08-15 DOI: 10.1109/TOH.2024.3444491
Thomas Daunizeau, Sinan Haliyo, Vincent Hayward
Humans rely on multimodal perception to form representations of the world. This implies that environmental stimuli must remain consistent and predictable throughout their journey to our sensory organs. When it comes to vision, electromagnetic waves are minimally affected when passing through air or glass treated for chromatic aberrations. Similar conclusions can be drawn for hearing and acoustic waves. However, tools that propagate elastic waves to our cutaneous afferents tend to color tactual perception due to parasitic mechanical attributes such as resonances and inertia. These issues are often overlooked, despite their critical importance for haptic devices that aim to faithfully render or record tactile interactions. Here, we investigate how to optimize this mechanical transmission with sandwich structures made from rigid, lightweight carbon fiber sheets arranged around a 3D-printed lattice core. Through a comprehensive parametric evaluation, we demonstrate how this design paradigm provides superior haptic transparency, regardless of the lattice types. Drawing an analogy with topology optimization, our solution approaches a foreseeable technological limit. It offers a practical way to create high-fidelity haptic interfaces, opening new avenues for research on tool-mediated interactions.
Optimized Sandwich and Topological Structures for Enhanced Haptic Transparency.
Pub Date : 2024-08-02 DOI: 10.1109/TOH.2024.3437766
Zane A Zook, Odnan Galvan, Ozioma Ozor-Ilo, Emre Selcuk, Marcia K O'Malley
Wearable haptic devices provide touch feedback to users for applications including virtual reality, prosthetics, and navigation. When these devices are designed for experimental validation in research settings, they are often highly specialized and customized to the specific application being studied. As such, it can be difficult to replicate device hardware due to the high costs of customized components and the complexity of their design and construction. In this work, we present Snaptics, a simple and modular platform designed for rapid prototyping of fully wearable multi-sensory haptic devices using 3D-printed modules and inexpensive off-the-shelf components accessible to the average hobbyist. We demonstrate the versatility of the modular system and the salience of haptic cues produced by wearables constructed with Snaptics modules in two human-subject experiments. First, we report on the identification accuracy of multi-sensory haptic cues delivered by a Snaptics device. Second, we compare the effectiveness of the Snaptics Vibrotactile Bracelet to the Syntacts Bracelet, a high-fidelity wearable vibration feedback bracelet, in assisting participants with a virtual reality sorting task. Results indicate that participant performance in perceiving cue sets and completing tasks with low-cost Snaptics devices was comparable to that with similar research-grade haptic wearables.
Validation of Snaptics: A Modular Approach to Low-Cost Wearable Multi-Sensory Haptics.
Pub Date : 2024-08-02 DOI: 10.1109/TOH.2024.3436827
Alessia S Ivani, Manuel G Catalano, Giorgio Grioli, Matteo Bianchi, Yon Visell, Antonio Bicchi
Tactile feedback is essential for the functionality and embodiment of upper-limb prostheses, yet its practical implementation presents challenges. Users must adapt to non-physiological signals, increasing cognitive load. However, some prosthetic devices transmit tactile information through socket vibrations, even to untrained individuals. Our experiments validated this observation, demonstrating users' surprising ability to identify contacted fingers with a purely passive, cosmetic hand. Further experiments with advanced soft articulated hands revealed decreased performance in tactile information relayed by socket vibrations as hand complexity increased. To understand the underlying mechanisms, we conducted numerical and mechanical vibration tests on four prostheses of varying complexity. Additionally, a machine-learning classifier identified the contacted finger based on measured socket signals. Quantitative results confirmed that rigid hands facilitated contact discrimination, achieving 83% accuracy in distinguishing index finger contacts from others. While human discrimination decreased with advanced hands, machine learning surpassed human performance. These findings suggest that rigid prostheses provide natural vibration transmission, potentially reducing the need for the tactile feedback devices that advanced hands may require.
Tactile Perception in Upper Limb Prostheses: Mechanical Characterization, Human Experiments, and Computational Findings.
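The finger-identification step could, in principle, be as simple as a nearest-centroid rule over vibration features; the following is a hypothetical sketch with invented toy data, not the paper's classifier.

```python
def fit_centroids(samples):
    """samples: {finger_label: [feature_vectors]} -> {label: centroid}."""
    centroids = {}
    for label, vecs in samples.items():
        n = len(vecs)
        centroids[label] = [sum(v[k] for v in vecs) / n
                            for k in range(len(vecs[0]))]
    return centroids

def classify(centroids, vec):
    """Return the label whose centroid is closest in squared distance."""
    return min(centroids,
               key=lambda lbl: sum((a - b) ** 2
                                   for a, b in zip(centroids[lbl], vec)))

# Toy socket-vibration features (e.g. band energies), invented for illustration.
train = {
    "index": [[1.0, 0.1], [0.9, 0.2]],
    "thumb": [[0.1, 1.0], [0.2, 0.8]],
}
model = fit_centroids(train)
print(classify(model, [0.95, 0.15]))  # index
```

The actual study likely uses richer features and models; the point is only that socket signals carry enough structure for automatic discrimination.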
Pub Date : 2024-07-30 DOI: 10.1109/TOH.2024.3435588
Alessia Silvia Ivani, Federica Barontini, Manuel G Catalano, Giorgio Grioli, Matteo Bianchi, Antonio Bicchi
This study presents the characterization and validation of the VIBES, a wearable vibrotactile device, embedded in a prosthetic socket, that conveys high-frequency tactile information. A psychophysical characterization involving ten able-bodied participants is performed to compute the Just Noticeable Difference (JND) for discriminating vibrotactile cues delivered to the skin at two forearm positions, with the goal of optimising vibrotactile actuator placement to maximise perceptual response. Furthermore, system performance is validated and tested with ten able-bodied participants and one prosthesis user across three tasks. More specifically, in the Active Texture Identification, Slippage, and Fragile Object experiments, we investigate whether the VIBES enhances users' roughness discrimination, manual usability, and dexterity. Finally, we test the effect of the vibrotactile system on prosthetic embodiment in a Rubber Hand Illusion (RHI) task. Results show the system's effectiveness in conveying contact and texture cues, making it a potential tool to restore sensory feedback and enhance embodiment in prosthetic users.
{"title":"Characterization, Experimental Validation and Pilot User Study of the Vibro-Inertial Bionic Enhancement System (VIBES).","authors":"Alessia Silvia Ivani, Federica Barontini, Manuel G Catalano, Giorgio Grioli, Matteo Bianchi, Antonio Bicchi","doi":"10.1109/TOH.2024.3435588","DOIUrl":"10.1109/TOH.2024.3435588","url":null,"abstract":"<p><p>This study presents the characterization and validation of the VIBES, a wearable vibrotactile device that provides high-frequency tactile information embedded in a prosthetic socket. A psychophysical characterization involving ten able-bodied participants is performed to compute the Just Noticeable Difference (JND) related to the discrimination of vibrotactile cues delivered on the skin in two forearm positions, with the goal of optimising vibrotactile actuator position to maximise perceptual response. Furthermore, system performance is validated and tested both with ten able-bodied participants and one prosthesis user considering three tasks. More specifically, in the Active Texture Identification, Slippage and Fragile Object Experiments, we investigate if the VIBES could enhance users' roughness discrimination and manual usability and dexterity. Finally, we test the effect of the vibrotactile system on prosthetic embodiment in a Rubber Hand Illusion (RHI) task. 
Results show the system's effectiveness in conveying contact and texture cues, making it a potential tool to restore sensory feedback and enhance the embodiment in prosthetic users.</p>","PeriodicalId":13215,"journal":{"name":"IEEE Transactions on Haptics","volume":"PP ","pages":""},"PeriodicalIF":2.4,"publicationDate":"2024-07-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141855404","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
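The JND characterization described above is a standard psychophysical measurement: the discrimination threshold is typically read off the psychometric function, e.g. as half the interval between its 25% and 75% points, and normalised by the reference intensity to give a Weber fraction. The sketch below is illustrative only (the function name, data values, and threshold convention are assumptions, not the authors' analysis pipeline), and assumes the response proportions increase monotonically with stimulus level:

```python
import numpy as np

def estimate_jnd(levels, p_stronger, reference):
    """Estimate a discrimination JND from 2AFC-style data.

    levels     : comparison stimulus amplitudes (increasing)
    p_stronger : proportion of trials judged stronger than the reference,
                 assumed monotonically increasing with level
    reference  : reference stimulus amplitude

    Returns (jnd, weber_fraction), taking the JND as half the interval
    between the 25% and 75% points of the psychometric function,
    recovered here by simple linear interpolation.
    """
    levels = np.asarray(levels, dtype=float)
    p = np.asarray(p_stronger, dtype=float)
    # np.interp interpolates x at the requested y, given increasing p
    x25 = np.interp(0.25, p, levels)
    x75 = np.interp(0.75, p, levels)
    jnd = (x75 - x25) / 2.0
    return jnd, jnd / reference

# Hypothetical data: reference = 100, five comparison levels
jnd, weber = estimate_jnd([80, 90, 100, 110, 120],
                          [0.05, 0.25, 0.50, 0.75, 0.95],
                          reference=100.0)
# -> jnd = 10.0, weber = 0.1
```

Real psychophysical data are usually fit with a sigmoid (e.g. a cumulative Gaussian) rather than interpolated, but the threshold definition is the same.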
Pub Date : 2024-07-29 DOI: 10.1109/TOH.2024.3434975
Emma Treadway, Kristian Journet, Andrew Deering, Cora Lewis, Noelle Poquiz
Virtual damping is often employed to improve stability in virtual environments, but it has previously been found to bias the perception of stiffness, with its effects differing depending on whether it is introduced locally within a wall/object or globally in both the wall and in freespace. Since many potential applications of haptic rendering involve not only comparisons between two environments, but also the ability to recognize rendered environments as belonging to different categories, it is important to understand the perceptual impacts of freespace and wall damping on stiffness classification ability. This study explores the effects of varying levels of freespace and wall damping on users' ability to classify virtual walls by their stiffness. Results indicate that freespace damping improves wall classification if the walls are damped, but impairs classification of undamped walls. These findings suggest that, in situations where users are expected to recognize and classify various stiffnesses, freespace damping can narrow or widen the gap in extended rate-hardness between softer and stiffer walls.
{"title":"Effects of Wall and Freespace Damping Levels on Virtual Wall Stiffness Classification.","authors":"Emma Treadway, Kristian Journet, Andrew Deering, Cora Lewis, Noelle Poquiz","doi":"10.1109/TOH.2024.3434975","DOIUrl":"https://doi.org/10.1109/TOH.2024.3434975","url":null,"abstract":"<p><p>Virtual damping is often employed to improve stability in virtual environments, but it has previously been found to bias perception of stiffness, with its effects differing when it is introduced locally within a wall/object or globally in both the wall and in freespace. Since many potential applications of haptic rendering involve not only comparisons between two environments, but also the ability to recognize rendered environments as belonging to different categories, it is important to understand the perceptual impacts of freespace and wall damping on stiffness classification ability. This study explores the effects of varying levels of freespace and wall damping on users' ability to classify virtual walls by their stiffness. Results indicate that freespace damping improves wall classification if the walls are damped, but will impair classification of undamped walls. 
These findings suggest that, in situations where users are expected to recognize and classify various stiffnesses, freespace damping can be a factor in narrowing or widening gaps in extended rate-hardness between softer and stiffer walls.</p>","PeriodicalId":13215,"journal":{"name":"IEEE Transactions on Haptics","volume":"PP ","pages":""},"PeriodicalIF":2.4,"publicationDate":"2024-07-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141792353","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
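The local-versus-global damping distinction above maps directly onto the standard one-axis virtual wall force law: a spring term active only during penetration, a wall-damping term active only inside the wall, and a freespace-damping term active everywhere. The sketch below is a minimal illustration under those assumptions; the function and parameter names are hypothetical and do not reproduce the authors' rendering code:

```python
def wall_force(x, v, k, b_wall=0.0, b_free=0.0, wall_pos=0.0):
    """One-axis virtual wall with local and global damping.

    x        : probe position (the wall occupies x < wall_pos)
    v        : probe velocity (positive = moving out of the wall)
    k        : wall stiffness
    b_wall   : damping applied only while inside the wall (local)
    b_free   : damping applied everywhere, freespace included (global)

    Returns the force commanded to the haptic device.
    """
    f = -b_free * v                        # global damping acts even in freespace
    if x < wall_pos:                       # probe is penetrating the wall
        penetration = wall_pos - x
        f += k * penetration - b_wall * v  # spring plus local wall damping
    return f

# In freespace (x > wall_pos) only the global damper resists motion:
f_free = wall_force(0.01, 0.1, k=1000.0, b_free=2.0)   # -> -0.2
# Inside the wall, spring and both dampers contribute:
f_wall = wall_force(-0.01, -0.1, k=1000.0, b_wall=5.0) # -> 10.5
```

With this formulation, the initial force rate per unit penetration velocity at contact grows with both k and the active damping terms, which is one way to see why adding damping shifts the extended rate-hardness of a rendered wall.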