An MS-TCN based spatiotemporal model with three-axis tactile for enhancing flexible printed circuit assembly

Zengxin Kang, Jing Cui, Yijie Wang, Zhikai Hu, Zhongyi Chu
Journal: Robotic Intelligence and Automation
DOI: 10.1108/ria-10-2023-0136
Published: 2024-07-09 (Journal Article)

Abstract

Purpose

Current flexible printed circuit (FPC) assembly relies heavily on manual labor, limiting capacity and increasing costs. The small size of FPCs makes automation challenging, as terminals can be visually occluded. The purpose of this study is to use 3D tactile sensing to mimic human manual mating skills, enabling the sensing of offsets between FPC terminals (FPC-t) and FPC mating slots (FPC-s) under visual occlusion.

Design/methodology/approach

The proposed model has three stages: spatial encoding, offset estimation and action strategy. The spatial encoder maps sparse 3D tactile data into a compact 1D feature that captures valid spatial assembly information, enabling temporal processing. To compensate for low sensor resolution, consecutive spatial features are fed into a multistage temporal convolutional network (MS-TCN), which estimates alignment offsets. The robot then performs alignment or mating actions based on the estimated offsets.

Findings

Experiments are conducted on a Redmi Note 4 smartphone assembly platform. Compared to other models, the proposed approach achieves superior offset estimation. Within a limited number of trials, it successfully assembles FPCs under visual occlusion using three-axis tactile sensing.

Originality/value

A spatial encoder is designed to encode three-axis tactile data into feature maps, overcoming MS-TCN's inability to directly process such input. Modifying the output to estimate assembly offsets with related motion semantics overcomes MS-TCN's original segmentation-point output, which cannot meet assembly-monitoring needs. Training and testing the improved MS-TCN on an FPC data set demonstrated accurate monitoring of the full assembly process. An assembly platform verified performance on automated FPC assembly.
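The abstract describes the pipeline only at a high level (spatial encoding of three-axis tactile frames into a 1D feature, multistage dilated temporal convolutions, and an offset-based action decision), without layer counts, taxel resolution or offset dimensionality. The minimal numpy sketch below is therefore illustrative only: the 16×16 taxel grid, pooling factor, layer widths, two-stage refinement and 2D (dx, dy) offset are all assumptions, and the weights are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)

def spatial_encode(tactile, pool=4):
    """Map one (3, H, W) three-axis tactile frame to a compact 1D feature.

    Max-pooling each axis map is a stand-in for the paper's spatial encoder.
    """
    c, h, w = tactile.shape
    pooled = tactile.reshape(c, h // pool, pool, w // pool, pool).max(axis=(2, 4))
    return pooled.reshape(-1)  # shape: (3 * (H//pool) * (W//pool),)

def dilated_causal_conv(x, w, b, dilation):
    """Causal dilated 1D convolution. x: (T, F_in); w: (K, F_in, F_out)."""
    k = w.shape[0]
    pad = (k - 1) * dilation
    xp = np.pad(x, ((pad, 0), (0, 0)))  # left-pad so output stays causal
    out = np.zeros((x.shape[0], w.shape[2]))
    for t in range(x.shape[0]):
        for i in range(k):
            out[t] += xp[t + pad - i * dilation] @ w[k - 1 - i]
    return out + b

def tcn_stage(feats, n_layers=3, hidden=16):
    """One stage of residual dilated conv layers, in the spirit of MS-TCN."""
    x = feats @ (rng.standard_normal((feats.shape[1], hidden)) * 0.1)  # 1x1 in-proj
    for layer in range(n_layers):
        w = rng.standard_normal((3, hidden, hidden)) * 0.1
        x = x + np.maximum(dilated_causal_conv(x, w, np.zeros(hidden), 2 ** layer), 0)
    return x

def estimate_offset(feats, n_stages=2):
    """Multistage refinement; a 1x1 head regresses a 2D (dx, dy) offset."""
    x = feats
    for _ in range(n_stages):  # each stage refines the previous stage's output
        x = tcn_stage(x)
    head = rng.standard_normal((x.shape[1], 2)) * 0.1
    return x[-1] @ head  # offset estimate at the most recent frame

# Eight consecutive tactile frames -> temporal feature sequence -> offset.
frames = [rng.standard_normal((3, 16, 16)) for _ in range(8)]
feats = np.stack([spatial_encode(f) for f in frames])  # (8, 48)
dx, dy = estimate_offset(feats)
```

In the paper's action-strategy stage, the robot would compare such an estimated offset against a tolerance and either re-align or proceed to mate; that decision logic is not specified in the abstract and is omitted here.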