{"title":"Multi-band image fusion via perceptual framework and multiscale texture saliency","authors":"Zhihao Liu, Weiqi Jin, Dian Sheng, Li Li","doi":"10.1016/j.infrared.2025.105728","DOIUrl":null,"url":null,"abstract":"<div><div>Multi-band images provide valuable complementary information, play an important role in target detection and recognition in complex environments, and have become a key research direction. However, existing fusion methods rarely consider or adapt to more than two bands of images and are easily affected by external physical conditions (e.g., variations in sensor characteristics and environmental illumination). This paper proposes a multi-band image fusion method based on a perception framework and multiscale texture saliency. By introducing the perception framework and the human visual system (HVS) space, the source images are decomposed into detail, feature, and base layers according to the perception characteristics of the human eye for texture granularity. Gabor filters were used to obtain the saliency of the fine-grained textures in the detail layer, thereby selectively extracting detailed texture information. The saliency of the feature layer texture was calculated using a Hessian matrix. Fusion weights were then obtained based on the texture complexity at the current scale, allowing for the effective extraction of structural information from the source images. Finally, fusion was performed in the unified framework of the HVS space, and the fusion image was obtained through an inverse transformation. The experimental results indicate that the multiscale texture perceptual framework fusion (MSTPFF) method can effectively transfer textures of different scales in the source images to the fused image, thus preserving the unique details and structural texture information of the multiband images. 
This transfer aligns with the visual characteristics of the human eye and significantly enhances fusion quality.</div></div>","PeriodicalId":13549,"journal":{"name":"Infrared Physics & Technology","volume":"145 ","pages":"Article 105728"},"PeriodicalIF":3.1000,"publicationDate":"2025-01-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Infrared Physics & Technology","FirstCategoryId":"101","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1350449525000210","RegionNum":3,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"INSTRUMENTS & INSTRUMENTATION","Score":null,"Total":0}
Citations: 0
Abstract
Multi-band images provide valuable complementary information, play an important role in target detection and recognition in complex environments, and have become a key research direction. However, existing fusion methods rarely consider or adapt to more than two bands of images and are easily affected by external physical conditions (e.g., variations in sensor characteristics and environmental illumination). This paper proposes a multi-band image fusion method based on a perceptual framework and multiscale texture saliency. Within the perceptual framework and the human visual system (HVS) space, the source images are decomposed into detail, feature, and base layers according to the human eye's perception of texture granularity. Gabor filters are used to measure the saliency of fine-grained textures in the detail layer, thereby selectively extracting detailed texture information. The saliency of feature-layer textures is computed from the Hessian matrix. Fusion weights are then derived from the texture complexity at the current scale, allowing structural information to be extracted effectively from the source images. Finally, fusion is performed in the unified HVS space, and the fused image is obtained through an inverse transformation. The experimental results indicate that the multiscale texture perceptual framework fusion (MSTPFF) method can effectively transfer textures of different scales from the source images to the fused image, thus preserving the unique details and structural texture information of the multi-band images. This transfer aligns with the visual characteristics of the human eye and significantly enhances fusion quality.
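The saliency-weighted fusion idea described above can be sketched in a few dozen lines. This is a minimal illustration, not the authors' implementation: the kernel sizes, the finite-difference Hessian, the Frobenius-norm saliency measure, and all function names (`gabor_saliency`, `hessian_saliency`, `fuse`) are assumptions chosen for clarity; the paper's actual multiscale decomposition, HVS-space transform, and weight rule are not reproduced here.

```python
import numpy as np

def gabor_kernel(size=9, sigma=2.0, theta=0.0, lam=4.0):
    """Real part of a Gabor kernel at orientation theta, wavelength lam."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

def filter2_same(img, k):
    """2-D 'same'-size correlation with edge padding (no SciPy needed)."""
    kh, kw = k.shape
    pad = np.pad(img, ((kh // 2,), (kw // 2,)), mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += k[i, j] * pad[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def gabor_saliency(img, n_orient=4):
    """Fine-texture saliency: max |Gabor response| over orientations."""
    thetas = np.linspace(0, np.pi, n_orient, endpoint=False)
    resp = [np.abs(filter2_same(img, gabor_kernel(theta=t))) for t in thetas]
    return np.max(resp, axis=0)

def hessian_saliency(img):
    """Structure saliency: Frobenius norm of a finite-difference Hessian."""
    gy, gx = np.gradient(img.astype(float))
    gxy, gxx = np.gradient(gx)   # second derivatives of gx (d/dy, d/dx)
    gyy, _ = np.gradient(gy)
    return np.sqrt(gxx**2 + gyy**2 + 2 * gxy**2)

def fuse(imgs, saliency_fn, eps=1e-8):
    """Pixel-wise weighted average, weights proportional to saliency."""
    s = np.stack([saliency_fn(im) for im in imgs])
    w = s / (s.sum(axis=0) + eps)          # per-pixel weights, sum <= 1
    return (w * np.stack(imgs)).sum(axis=0)
```

A full pipeline would apply `gabor_saliency` to the detail layers and `hessian_saliency` to the feature layers of each band, fuse each layer separately, and invert the decomposition; here the two saliency measures simply drive a per-pixel weighted average.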
About the journal
The Journal covers the entire field of infrared physics and technology: theory, experiment, application, devices and instrumentation. "Infrared" is defined as covering the near, mid and far infrared (terahertz) regions from 0.75 µm (750 nm) to 1 mm (300 GHz). Submissions in the 300 GHz to 100 GHz region may be accepted at the editors' discretion if their content is relevant to shorter wavelengths. Submissions must be primarily concerned with and directly relevant to this spectral region.
Its core topics can be summarized as the generation, propagation, and detection of infrared radiation; the associated optics, materials and devices; and its use in all fields of science, industry, engineering and medicine.
Infrared techniques occur in many different fields, notably spectroscopy and interferometry; material characterization and processing; and atmospheric physics, astronomy and space research. Scientific aspects include lasers, quantum optics, quantum electronics, image processing and semiconductor physics. Some important applications are medical diagnostics and treatment, industrial inspection and environmental monitoring.