Aihua Zheng, Zhiqi Ma, Yongqi Sun, Zi Wang, Chenglong Li, Jin Tang
{"title":"Flare-aware cross-modal enhancement network for multi-spectral vehicle Re-identification","authors":"Aihua Zheng , Zhiqi Ma , Yongqi Sun , Zi Wang , Chenglong Li , Jin Tang","doi":"10.1016/j.inffus.2024.102800","DOIUrl":null,"url":null,"abstract":"<div><div>Multi-spectral vehicle Re-identification (Re-ID) aims to incorporate complementary visible and infrared information to tackle the challenge of re-identifying vehicles in complex lighting conditions. However, in harsh environments, the discriminative cues in RGB (visible) and NI (near infrared) modalities are significantly lost by the strong flare from vehicle lamps or the sunlight. To handle this problem, we propose a Flare-Aware Cross-modal Enhancement Network (FACENet) to adaptively restore the flare-corrupted RGB and NI features with the guidance from the flare-immunized TI (thermal infrared) spectra. First, to reduce the influence of locally degraded appearance by the intense flare, we propose a Mutual Flare Mask Prediction (MFMP) module to jointly obtain the flare-corrupted masks in RGB and NI modalities in a self-supervised manner. Second, to utilize the flare-immunized TI information to enhance the masked RGB and NI, we propose a Flare-aware Cross-modal Enhancement module (FCE) to adaptively guide feature extraction of masked RGB and NI spectra with the prior flare-immunized knowledge from the TI spectra. Third, to mine the common semantic information of RGB and NI, and alleviate the severe semantic loss in the NI spectra using TI, we propose a Multi-modality Consistency (MC) loss to enhance the semantic consistency among the three modalities. Finally, to evaluate the proposed FACENet while handling the intense flare problem, we contribute a new multi-spectral vehicle Re-ID dataset, named WMVEID863 with additional challenges, such as motion blur, huge background changes, and especially intense flare degradation. Comprehensive experiments on both the newly collected dataset and public benchmark multi-spectral vehicle Re-ID datasets verify the superior performance of the proposed FACENet compared to the state-of-the-art methods, especially in handling the strong flares. The codes and dataset will be released at <span><span>this link.</span><svg><path></path></svg></span></div></div>","PeriodicalId":50367,"journal":{"name":"Information Fusion","volume":"116 ","pages":"Article 102800"},"PeriodicalIF":14.7000,"publicationDate":"2024-11-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Information Fusion","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1566253524005785","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Multi-spectral vehicle Re-identification (Re-ID) aims to incorporate complementary visible and infrared information to tackle the challenge of re-identifying vehicles under complex lighting conditions. However, in harsh environments, the discriminative cues in the RGB (visible) and NI (near infrared) modalities are largely lost due to strong flare from vehicle lamps or sunlight. To handle this problem, we propose a Flare-Aware Cross-modal Enhancement Network (FACENet) to adaptively restore the flare-corrupted RGB and NI features under the guidance of the flare-immunized TI (thermal infrared) spectra. First, to reduce the influence of local appearance degradation caused by intense flare, we propose a Mutual Flare Mask Prediction (MFMP) module to jointly obtain the flare-corrupted masks in the RGB and NI modalities in a self-supervised manner. Second, to exploit the flare-immunized TI information for enhancing the masked RGB and NI features, we propose a Flare-aware Cross-modal Enhancement (FCE) module that adaptively guides feature extraction of the masked RGB and NI spectra with prior flare-immunized knowledge from the TI spectra. Third, to mine the common semantic information of RGB and NI, and to alleviate the severe semantic loss in the NI spectra using TI, we propose a Multi-modality Consistency (MC) loss that enhances the semantic consistency among the three modalities. Finally, to evaluate the proposed FACENet on the intense flare problem, we contribute a new multi-spectral vehicle Re-ID dataset, named WMVEID863, with additional challenges such as motion blur, large background changes, and especially intense flare degradation. Comprehensive experiments on both the newly collected dataset and public benchmark multi-spectral vehicle Re-ID datasets verify the superior performance of the proposed FACENet compared to state-of-the-art methods, especially in handling strong flares. The code and dataset will be released at this link.
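The abstract does not give the exact formulation of the Multi-modality Consistency (MC) loss, so the sketch below is only one plausible PyTorch-style realization of the stated idea: pulling the RGB, NI, and flare-immunized TI embeddings of the same vehicle toward a shared semantics. The function name, tensor shapes, and the choice of MSE with a detached TI anchor are assumptions for illustration, not the authors' implementation.

```python
# Minimal, hypothetical sketch of a multi-modality consistency (MC) style loss.
# Assumption: each modality branch yields a (batch, dim) feature vector per image.
import torch
import torch.nn.functional as F


def multi_modality_consistency_loss(f_rgb: torch.Tensor,
                                    f_ni: torch.Tensor,
                                    f_ti: torch.Tensor) -> torch.Tensor:
    """Encourage semantic consistency among RGB, NI, and TI features."""
    # L2-normalize so the loss compares feature directions, not magnitudes.
    f_rgb, f_ni, f_ti = (F.normalize(f, dim=1) for f in (f_rgb, f_ni, f_ti))
    # Treat the flare-immunized TI feature as a (detached) anchor, and also
    # align RGB with NI to mine their shared semantics.
    loss = (F.mse_loss(f_rgb, f_ti.detach())
            + F.mse_loss(f_ni, f_ti.detach())
            + F.mse_loss(f_rgb, f_ni))
    return loss / 3.0


if __name__ == "__main__":
    # Example usage with random features for a batch of 8 vehicles.
    feats = [torch.randn(8, 256) for _ in range(3)]
    print(multi_modality_consistency_loss(*feats).item())
```

In practice such a consistency term would be added to the usual Re-ID objectives (identity classification and triplet losses) with a weighting coefficient; the weighting and anchor choice here are illustrative only.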
Journal Introduction:
Information Fusion serves as a central platform for showcasing advancements in multi-sensor, multi-source, multi-process information fusion, fostering collaboration among the diverse disciplines driving its progress. It is the leading outlet for sharing research and development in this field, focusing on architectures, algorithms, and applications. Papers presenting fundamental theoretical analyses, as well as those demonstrating their application to real-world problems, are welcome.