Fabric pilling image segmentation by embedding dual-attention mechanism U-Net network

IF 1.6 · JCR Q2 (Materials Science, Textiles) · CAS Zone 4 (Engineering & Technology) · Textile Research Journal · Pub Date: 2024-05-03 · DOI: 10.1177/00405175241246735
Yu Yan, Yanjun Tan, Pengfu Gao, Qiuyu Yu, Yuntao Deng
{"title":"Fabric pilling image segmentation by embedding dual-attention mechanism U-Net network","authors":"Yu Yan, Yanjun Tan, Pengfu Gao, Qiuyu Yu, Yuntao Deng","doi":"10.1177/00405175241246735","DOIUrl":null,"url":null,"abstract":"The initial step in fabric pilling rating is to segment and localize the pilling region, which is a crucial and challenging task. This paper presents a fabric puckering image segmentation method that is integrated into a U-Net network with a dual-attention mechanism. We have enhanced the fully convolutional neural network (U-Net) model by incorporating the dual-attention mechanism. This modification has resulted in a powerful feature extraction capability, enabling the objective and accurate segmentation of the fabric puckering region. We refer to this improved model as the dual-attention U-Net. The network model for fabric pilling feature extraction adopts the improved VGG16 model architecture as its encoding part. The model parameters are initialized with VGG16 pre-training weights to accelerate convergence speed. Second, the feature fusion between the corresponding layers of the encoding part and the decoding part was fed into the dual-attention mechanism module to strengthen the weight values of the fabric pilling region adaptively, which made the model more focused on the fabric pilling target region; Third, the dual-attention U-Net model was trained by data augmentation and migration learning strategies to prevent overfitting; Finally, the performance of the model was evaluated with the collected fabric pilling data set. The results of the experiments indicate that the claimed dual-attention U-Net model is superior to the typical U-Net model in terms of accuracy and precision. The dual-attention U-Net model achieved 99.03% accuracy for fabric pilling segmentation.","PeriodicalId":22323,"journal":{"name":"Textile Research Journal","volume":"61 1","pages":""},"PeriodicalIF":1.6000,"publicationDate":"2024-05-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Textile Research Journal","FirstCategoryId":"88","ListUrlMain":"https://doi.org/10.1177/00405175241246735","RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATERIALS SCIENCE, TEXTILES","Score":null,"Total":0}
Citations: 0

Abstract

The initial step in fabric pilling rating is to segment and localize the pilling region, which is a crucial and challenging task. This paper presents a fabric pilling image segmentation method based on a U-Net network embedded with a dual-attention mechanism. We enhance the fully convolutional U-Net model by incorporating the dual-attention mechanism, which strengthens its feature extraction capability and enables objective, accurate segmentation of the fabric pilling region. We refer to this improved model as the dual-attention U-Net. First, the network adopts an improved VGG16 architecture as its encoder, and the model parameters are initialized with VGG16 pre-trained weights to accelerate convergence. Second, the fused features from the corresponding encoder and decoder layers are fed into the dual-attention module, which adaptively strengthens the weights of the fabric pilling region so that the model focuses on the pilling target region. Third, the dual-attention U-Net is trained with data augmentation and transfer learning strategies to prevent overfitting. Finally, the performance of the model is evaluated on the collected fabric pilling data set. The experimental results indicate that the proposed dual-attention U-Net outperforms the standard U-Net in both accuracy and precision, achieving 99.03% accuracy for fabric pilling segmentation.
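The abstract describes the architecture only at a high level, so the following PyTorch sketch is an illustration of the kind of design it outlines, not the authors' implementation. It assumes a CBAM-style dual attention (channel attention followed by spatial attention) applied to the concatenated encoder/decoder features at each skip connection, a VGG16 encoder initialized from ImageNet pre-trained weights, and a single-channel pilling mask output; the class names `DualAttention` and `DualAttentionUNet` are hypothetical.

```python
# Minimal sketch only: the paper does not specify the attention internals,
# so channel + spatial (CBAM-style) attention on the fused skip features is assumed.
import torch
import torch.nn as nn
from torchvision.models import vgg16, VGG16_Weights


class DualAttention(nn.Module):
    """Channel attention followed by spatial attention (assumed design)."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        # Channel attention: shared MLP over average- and max-pooled descriptors.
        self.channel_mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        # Spatial attention: 7x7 conv over channel-pooled maps.
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):
        b, c, _, _ = x.shape
        # Channel attention: reweight each channel of the fused skip features.
        ca = torch.sigmoid(self.channel_mlp(x.mean(dim=(2, 3))) +
                           self.channel_mlp(x.amax(dim=(2, 3))))
        x = x * ca.view(b, c, 1, 1)
        # Spatial attention: emphasize pilling regions spatially.
        sa_in = torch.cat([x.mean(dim=1, keepdim=True),
                           x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial_conv(sa_in))


class DualAttentionUNet(nn.Module):
    """U-Net with a VGG16 encoder and dual attention on each skip fusion."""

    def __init__(self, num_classes=1):
        super().__init__()
        feats = vgg16(weights=VGG16_Weights.IMAGENET1K_V1).features
        # VGG16 stages (each later stage starts with its max-pool): 64, 128, 256, 512, 512 channels.
        self.enc1 = feats[:4]
        self.enc2 = feats[4:9]
        self.enc3 = feats[9:16]
        self.enc4 = feats[16:23]
        self.enc5 = feats[23:30]

        def up(cin, cout):
            return nn.ConvTranspose2d(cin, cout, kernel_size=2, stride=2)

        def block(cin, cout):
            return nn.Sequential(
                nn.Conv2d(cin, cout, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(cout, cout, 3, padding=1), nn.ReLU(inplace=True),
            )

        self.up4, self.att4, self.dec4 = up(512, 512), DualAttention(1024), block(1024, 512)
        self.up3, self.att3, self.dec3 = up(512, 256), DualAttention(512), block(512, 256)
        self.up2, self.att2, self.dec2 = up(256, 128), DualAttention(256), block(256, 128)
        self.up1, self.att1, self.dec1 = up(128, 64), DualAttention(128), block(128, 64)
        self.head = nn.Conv2d(64, num_classes, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(e1)
        e3 = self.enc3(e2)
        e4 = self.enc4(e3)
        e5 = self.enc5(e4)
        d4 = self.dec4(self.att4(torch.cat([self.up4(e5), e4], dim=1)))
        d3 = self.dec3(self.att3(torch.cat([self.up3(d4), e3], dim=1)))
        d2 = self.dec2(self.att2(torch.cat([self.up2(d3), e2], dim=1)))
        d1 = self.dec1(self.att1(torch.cat([self.up1(d2), e1], dim=1)))
        return torch.sigmoid(self.head(d1))  # per-pixel pilling probability
```

As a usage check, `DualAttentionUNet()(torch.randn(1, 3, 256, 256))` returns a 1×1×256×256 probability map; input sides should be multiples of 16 so the four VGG16 poolings and the four transposed convolutions align.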
Source journal: Textile Research Journal (Engineering & Technology – Materials Science: Textiles)
CiteScore: 4.00
Self-citation rate: 21.70%
Annual articles: 309
Review time: 1.5 months
Journal description: The Textile Research Journal is the leading peer-reviewed journal for textile research. It is devoted to the dissemination of fundamental, theoretical and applied scientific knowledge in materials, chemistry, manufacture and system sciences related to fibers, fibrous assemblies and textiles. The Journal serves authors and subscribers worldwide, and it is selective in accepting contributions on the basis of merit, novelty and originality.
Latest articles in this journal:
- A review of deep learning and artificial intelligence in dyeing, printing and finishing
- A review of deep learning within the framework of artificial intelligence for enhanced fiber and yarn quality
- Reconstructing hyperspectral images of textiles from a single RGB image utilizing the multihead self-attention mechanism
- Study on the thermo-physiological comfort properties of cotton/polyester combination yarn-based double-layer knitted fabrics
- Study on the relationship between blending uniformity and yarn performance of blended yarn