Attend, Correct And Focus: A Bidirectional Correct Attention Network For Image-Text Matching

Yang Liu, Huaqiu Wang, Fanyang Meng, Mengyuan Liu, Hong Liu
{"title":"Attend, Correct And Focus: A Bidirectional Correct Attention Network For Image-Text Matching","authors":"Yang Liu, Huaqiu Wang, Fanyang Meng, Mengyuan Liu, Hong Liu","doi":"10.1109/ICIP42928.2021.9506438","DOIUrl":null,"url":null,"abstract":"Image-text matching task aims to learn the fine-grained correspondences between images and sentences. Existing methods use attention mechanism to learn the correspondences by attending to all fragments without considering the relationship between fragments and global semantics, which inevitably lead to semantic misalignment among irrelevant fragments. To this end, we propose a Bidirectional Correct Attention Network (BCAN), which leverages global similarities and local similarities to reassign the attention weight, to avoid such semantic misalignment. Specifically, we introduce a global correct unit to correct the attention focused on relevant fragments in irrelevant semantics. A local correct unit is used to correct the attention focused on irrelevant fragments in relevant semantics. Experiments on Flickr30K and MSCOCO datasets verify the effectiveness of our proposed BCAN by outperforming both previous attention-based methods and state-of-the-art methods. Code can be found at: https://github.com/liuyyy111/BCAN.","PeriodicalId":314429,"journal":{"name":"2021 IEEE International Conference on Image Processing (ICIP)","volume":"65 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE International Conference on Image Processing (ICIP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICIP42928.2021.9506438","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

The image-text matching task aims to learn fine-grained correspondences between images and sentences. Existing methods use an attention mechanism to learn these correspondences by attending to all fragments, without considering the relationship between fragments and global semantics, which inevitably leads to semantic misalignment among irrelevant fragments. To this end, we propose a Bidirectional Correct Attention Network (BCAN), which leverages global similarities and local similarities to reassign attention weights and thereby avoid such semantic misalignment. Specifically, we introduce a global correct unit to correct attention focused on relevant fragments under irrelevant semantics, and a local correct unit to correct attention focused on irrelevant fragments under relevant semantics. Experiments on the Flickr30K and MSCOCO datasets verify the effectiveness of the proposed BCAN, which outperforms both previous attention-based methods and state-of-the-art methods. Code can be found at: https://github.com/liuyyy111/BCAN.
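The attend-then-correct idea described in the abstract can be sketched roughly as follows. This is a minimal NumPy illustration under our own assumptions: the function names, the mean-pooled global similarity, and the reweighting formula are ours for illustration, not the authors' exact BCAN formulation (see their repository for the real implementation).

```python
import numpy as np

def l2norm(x, axis=-1, eps=1e-8):
    """Normalize feature vectors to unit length along `axis`."""
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def corrected_match_score(regions, words, smooth=9.0):
    """Text-to-image attention with a similarity-based correction step.

    regions: (n_regions, d) image fragment features (hypothetical inputs)
    words:   (n_words, d)   word fragment features
    Returns a scalar image-sentence similarity in [-1, 1].
    """
    regions, words = l2norm(regions), l2norm(words)
    # Attend: each word attends over all image region fragments.
    sim = words @ regions.T                           # (n_words, n_regions)
    attn = softmax(smooth * sim, axis=1)
    attended = attn @ regions                         # (n_words, d)
    # Local similarity: each word vs. its attended region context.
    local = np.sum(l2norm(attended) * words, axis=1)  # (n_words,)
    # Global similarity: mean-pooled image vs. mean-pooled sentence.
    g = float(l2norm(regions.mean(axis=0)) @ l2norm(words.mean(axis=0)))
    # Correct and focus: reassign word weights so that words whose local
    # evidence agrees with the global semantics dominate the final score.
    weights = softmax(smooth * (local - np.abs(local - g)))
    return float(np.sum(weights * local))
```

Because the fragment features are unit-normalized and the corrected weights sum to one, the returned score stays in [-1, 1]; matching an image against its own fragments yields a score close to 1.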
Latest articles in this journal:
Deep Color Mismatch Correction In Stereoscopic 3d Images
Weakly-Supervised Multiple Object Tracking Via A Masked Center Point Warping Loss
A Parameter Efficient Multi-Scale Capsule Network
Few Shot Learning For Infra-Red Object Recognition Using Analytically Designed Low Level Filters For Data Representation
An Enhanced Reference Structure For Reference Picture Resampling (RPR) In VVC