Touchformer: A Transformer-Based Two-Tower Architecture for Tactile Temporal Signal Classification

IEEE Transactions on Haptics | Impact Factor 2.4 | CAS Tier 3 (Computer Science) | Q2 Computer Science, Cybernetics | Publication date: 2023-12-25 | DOI: 10.1109/TOH.2023.3346956
Chongyu Liu, Hong Liu, Hu Chen, Wenchao Du, Hongyu Yang
{"title":"Touchformer: A Transformer-Based Two-Tower Architecture for Tactile Temporal Signal Classification","authors":"Chongyu Liu;Hong Liu;Hu Chen;Wenchao Du;Hongyu Yang","doi":"10.1109/TOH.2023.3346956","DOIUrl":null,"url":null,"abstract":"Haptic temporal signal recognition plays an important supporting role in robot perception. This paper investigates how to improve classification performance on multiple types of haptic temporal signal datasets using a Transformer model structure. By analyzing the feature representation of haptic temporal signals, a Transformer-based two-tower structural model, called Touchformer, is proposed to extract temporal and spatial features separately and integrate them using a self-attention mechanism for classification. To address the characteristics of small sample datasets, data augmentation is employed to improve the stability of the dataset. Adaptations to the overall architecture of the model and the training and optimization procedures are made to improve the recognition performance and robustness of the model. Experimental comparisons on three publicly available datasets demonstrate that the Touchformer model significantly outperforms the benchmark model, indicating our approach's effectiveness and providing a new solution for robot perception.","PeriodicalId":13215,"journal":{"name":"IEEE Transactions on Haptics","volume":"17 3","pages":"396-404"},"PeriodicalIF":2.4000,"publicationDate":"2023-12-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Haptics","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10373890/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, CYBERNETICS","Score":null,"Total":0}
Citations: 0

Abstract

Haptic temporal signal recognition plays an important supporting role in robot perception. This paper investigates how to improve classification performance on multiple types of haptic temporal signal datasets using a Transformer architecture. By analyzing the feature representation of haptic temporal signals, a Transformer-based two-tower model, called Touchformer, is proposed to extract temporal and spatial features separately and integrate them with a self-attention mechanism for classification. To address the characteristics of small-sample datasets, data augmentation is employed to improve dataset stability. The overall model architecture and the training and optimization procedures are adapted to improve the recognition performance and robustness of the model. Experimental comparisons on three publicly available datasets demonstrate that Touchformer significantly outperforms the benchmark model, confirming the effectiveness of our approach and providing a new solution for robot perception.
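As a rough illustration of the two-tower idea described in the abstract, the sketch below builds one Transformer encoder that attends along the time axis, a second that attends across sensor channels, and a self-attention layer that fuses the two token streams before classification. This is a minimal sketch under assumed dimensions; all module names, sizes, and the specific fusion and pooling choices are illustrative and are not the authors' Touchformer implementation.

```python
# Minimal two-tower Transformer sketch for tactile temporal signals.
# All names, dimensions, and the fusion scheme are illustrative
# assumptions, NOT the authors' Touchformer implementation.
import torch
import torch.nn as nn


class TwoTowerTactileClassifier(nn.Module):
    def __init__(self, num_sensors=16, seq_len=128, d_model=64,
                 n_heads=4, n_layers=2, num_classes=10):
        super().__init__()
        # Temporal tower: each time step (a vector of sensor readings)
        # is a token, so attention runs along the time axis.
        self.temporal_proj = nn.Linear(num_sensors, d_model)
        self.temporal_tower = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True),
            num_layers=n_layers)
        # Spatial tower: each sensor channel (its full time series)
        # is a token, so attention runs across sensor positions.
        self.spatial_proj = nn.Linear(seq_len, d_model)
        self.spatial_tower = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True),
            num_layers=n_layers)
        # Fusion: self-attention over the concatenated token sequences
        # from both towers, then mean pooling and a linear classifier.
        self.fusion = nn.TransformerEncoderLayer(d_model, n_heads,
                                                 batch_first=True)
        self.classifier = nn.Linear(d_model, num_classes)

    def forward(self, x):
        # x: (batch, seq_len, num_sensors)
        t_tokens = self.temporal_tower(self.temporal_proj(x))                # (B, T, d)
        s_tokens = self.spatial_tower(self.spatial_proj(x.transpose(1, 2)))  # (B, S, d)
        fused = self.fusion(torch.cat([t_tokens, s_tokens], dim=1))          # (B, T+S, d)
        return self.classifier(fused.mean(dim=1))                            # (B, classes)


if __name__ == "__main__":
    # Example: classify a batch of 8 random tactile sequences.
    model = TwoTowerTactileClassifier()
    logits = model(torch.randn(8, 128, 16))
    print(logits.shape)  # torch.Size([8, 10])
```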
Source Journal

IEEE Transactions on Haptics (Computer Science, Cybernetics)
CiteScore: 5.90
Self-citation rate: 13.80%
Articles per year: 109
Review time: >12 weeks
Journal description: IEEE Transactions on Haptics (ToH) is a scholarly archival journal that addresses the science, technology, and applications associated with information acquisition and object manipulation through touch. Haptic interactions relevant to this journal include all aspects of manual exploration and manipulation of objects by humans, machines, and interactions between the two, performed in real, virtual, teleoperated, or networked environments. Research areas of relevance to this publication include, but are not limited to, the following topics: human haptic and multi-sensory perception and action; aspects of motor control that explicitly pertain to human haptics; haptic interactions via passive or active tools and machines; devices that sense, enable, or create haptic interactions locally or at a distance; haptic rendering and its association with graphic and auditory rendering in virtual reality; algorithms, controls, and dynamics of haptic devices, users, and interactions between the two; human-machine performance and safety with haptic feedback; haptics in the context of human-computer interactions; systems and networks using haptic devices and interactions, including multi-modal feedback; and applications of the above, for example in education, rehabilitation, medicine, computer-aided design, skills training, computer games, driver controls, simulation, and visualization.