Global and local semantic enhancement of samples for cross-modal hashing

Neurocomputing, Volume 614, Article 128678. Published: 2024-10-21. DOI: 10.1016/j.neucom.2024.128678. Impact Factor 5.5; CAS Tier 2 (Computer Science); JCR Q1 (Computer Science, Artificial Intelligence).
Shaohua Teng, Yongqi Chen, Zefeng Zheng, Wei Zhang, Peipei Kang, Naiqi Wu
{"title":"Global and local semantic enhancement of samples for cross-modal hashing","authors":"Shaohua Teng ,&nbsp;Yongqi Chen ,&nbsp;Zefeng Zheng ,&nbsp;Wei Zhang ,&nbsp;Peipei Kang ,&nbsp;Naiqi Wu","doi":"10.1016/j.neucom.2024.128678","DOIUrl":null,"url":null,"abstract":"<div><div>Hashing becomes popular in cross-modal retrieval due to its exceptional performance in both search and storage. However, existing cross-modal hashing (CMH) methods may (a) neglect to learn sufficient modal-specific information, and (b) fail to fully exploit sample semantics. To address these issues, we propose a method called Semantic Enhancement of Sample Hashing (SESH). First, SESH employs a global modal-specific learning strategy to draw overall shared information and global modal-specific information by factoring the mapping matrix. Second, SESH introduces manifold learning and a local modal-specific learning strategy to extract additional local modal-specific and modal-shared data under label guidance. Meanwhile, local modal-specific information is integrated with global modal-specific details to add rich modal-specific information. Third, SESH uses discrete maximum similarity and orthogonal constraint transformation to enhance both global and local semantic information, embedding more discriminative information into the Hamming space. Finally, an efficient discrete optimization algorithm is proposed to generate the hash codes directly. Experiments on three datasets demonstrate the superior performance of SESH. The source code will be available at <span><span>https://github.com/kokorording/SESH</span><svg><path></path></svg></span>.</div></div>","PeriodicalId":19268,"journal":{"name":"Neurocomputing","volume":"614 ","pages":"Article 128678"},"PeriodicalIF":5.5000,"publicationDate":"2024-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neurocomputing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0925231224014498","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Hashing has become popular in cross-modal retrieval due to its efficiency in both search and storage. However, existing cross-modal hashing (CMH) methods may (a) neglect to learn sufficient modal-specific information, and (b) fail to fully exploit sample semantics. To address these issues, we propose a method called Semantic Enhancement of Sample Hashing (SESH). First, SESH employs a global modal-specific learning strategy that extracts overall shared information and global modal-specific information by factorizing the mapping matrix. Second, SESH introduces manifold learning and a local modal-specific learning strategy to extract additional local modal-specific and modal-shared information under label guidance; the local modal-specific information is then integrated with its global counterpart to enrich the modal-specific representation. Third, SESH uses discrete maximum similarity and an orthogonal constraint transformation to enhance both global and local semantic information, embedding more discriminative information into the Hamming space. Finally, an efficient discrete optimization algorithm is proposed to generate the hash codes directly. Experiments on three datasets demonstrate the superior performance of SESH. The source code will be available at https://github.com/kokorording/SESH.
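To make the pipeline behind these terms concrete, the following is a minimal NumPy sketch of a linear cross-modal hashing setup: each modality is projected by a mapping matrix that is conceptually split into a shared part and a modality-specific part, the projections are binarized into Hamming-space codes, and retrieval ranks items of the other modality by Hamming distance. This is an illustrative sketch only, not the SESH algorithm; the random projection matrices, dimensions, variable names, and sign-based binarization are assumptions standing in for the quantities that the paper's label-guided discrete optimization would actually learn.

```python
import numpy as np

# Illustrative sketch of a linear cross-modal hashing pipeline.
# NOT the SESH optimization from the paper: the projections below are random
# placeholders; in SESH they would be learned under label guidance.

rng = np.random.default_rng(0)
n, d_img, d_txt, k = 1000, 512, 300, 32   # samples, feature dims, code length (assumed)

X_img = rng.standard_normal((n, d_img))   # image features (one modality)
X_txt = rng.standard_normal((n, d_txt))   # text features (the other modality)

# Mapping matrices factorized into a shared semantic part plus a
# modality-specific part, mimicking the decomposition idea in the abstract.
P_img_shared   = rng.standard_normal((d_img, k)) * 0.01
P_txt_shared   = rng.standard_normal((d_txt, k)) * 0.01
P_img_specific = rng.standard_normal((d_img, k)) * 0.01
P_txt_specific = rng.standard_normal((d_txt, k)) * 0.01

# Project each modality with its shared + modality-specific mapping,
# then binarize with sign() to obtain +/-1 hash codes in the Hamming space.
B_img = np.sign(X_img @ (P_img_shared + P_img_specific))
B_txt = np.sign(X_txt @ (P_txt_shared + P_txt_specific))
B_img[B_img == 0] = 1
B_txt[B_txt == 0] = 1

def hamming_distance(query_code, database_codes):
    """Hamming distance between one +/-1 code and a matrix of +/-1 codes."""
    return (k - query_code @ database_codes.T) / 2

# Cross-modal retrieval: rank text items by Hamming distance to an image query.
ranking = np.argsort(hamming_distance(B_img[0], B_txt))
print(ranking[:10])  # indices of the 10 nearest text items
```

Only the code-generation and Hamming-ranking steps carry over directly to the method described above; everything upstream of the sign operation is where SESH's global/local semantic enhancement and discrete optimization would replace the random placeholders.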
Source journal
Neurocomputing (Engineering & Technology - Computer Science: Artificial Intelligence)
CiteScore: 13.10
Self-citation rate: 10.00%
Annual publications: 1382
Review time: 70 days
Journal description: Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Neurocomputing theory, practice and applications are the essential topics being covered.