{"title":"Enhanced Concealed Object Detection Method for MMW Security Images Based on YOLOv8 Framework With ESFF and HSAFF","authors":"Shuliang Gui;Haitao Tian;Yizhe Wang;Sihang Dang;Ze Li;Kaikai Liu;Zengshan Tian","doi":"10.1109/JSEN.2024.3524441","DOIUrl":null,"url":null,"abstract":"Millimeter-wave (MMW) imaging technology has been widely used in crowded areas such as airports and railway stations for concealed object detection (COD), owing to its characteristics of privacy and safety. However, the low signal-to-noise ratio (SNR) and low resolution in MMW security images lead to indistinct edges of concealed objects and a significant similarity to the human background. These limitations constrain the accurate and rapid detection of concealed objects on individuals. This article proposes a concealed objects detector that uses an enhanced You Only Look Once (YOLO) network, incorporating spatial, edge, and multiscale information to address the issues above. First, an efficient adaptive denoising method is designed to enhance image clarity. Second, considering the lack of prominent edge features in MMW images, an edge-spatial feature fusion (ESFF) module is introduced. This module enhances the network’s ability to learn edge features by combining them with spatial detail information. In addition, this article proposes a hierarchical scale-aware feature fusion (HSAFF) module to address the issue of high similarity between targets and background textures that impairs traditional detection networks, which can effectively reduce classification errors and false detections. Finally, the ESFF and HSAFF modules are integrated into the detection network based on the YOLOv8 framework. The experimental results on the MMW image dataset demonstrate that the proposed model effectively reduces classification and false detection losses, achieving mean average precision (mAP) @0.5 and mAP@ [0.5:0.95] of <inline-formula> <tex-math>${98}.{3}\\%$ </tex-math></inline-formula> and <inline-formula> <tex-math>${81}.{5}\\%$ </tex-math></inline-formula>, respectively, while the mAP of the proposed method is <inline-formula> <tex-math>${5}.{1}\\%$ </tex-math></inline-formula> and <inline-formula> <tex-math>${4}\\%$ </tex-math></inline-formula> higher than the baseline model, and surpassing other detection models.","PeriodicalId":447,"journal":{"name":"IEEE Sensors Journal","volume":"25 4","pages":"7630-7641"},"PeriodicalIF":4.3000,"publicationDate":"2025-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Sensors Journal","FirstCategoryId":"103","ListUrlMain":"https://ieeexplore.ieee.org/document/10832507/","RegionNum":2,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0
Abstract
Millimeter-wave (MMW) imaging technology has been widely used in crowded areas such as airports and railway stations for concealed object detection (COD), owing to its characteristics of privacy and safety. However, the low signal-to-noise ratio (SNR) and low resolution of MMW security images lead to indistinct edges of concealed objects and a significant similarity to the human background. These limitations constrain the accurate and rapid detection of concealed objects on individuals. This article proposes a concealed object detector that uses an enhanced You Only Look Once (YOLO) network, incorporating spatial, edge, and multiscale information to address the issues above. First, an efficient adaptive denoising method is designed to enhance image clarity. Second, considering the lack of prominent edge features in MMW images, an edge-spatial feature fusion (ESFF) module is introduced. This module enhances the network's ability to learn edge features by combining them with spatial detail information. In addition, this article proposes a hierarchical scale-aware feature fusion (HSAFF) module to address the issue of high similarity between targets and background textures that impairs traditional detection networks, which can effectively reduce classification errors and false detections. Finally, the ESFF and HSAFF modules are integrated into the detection network based on the YOLOv8 framework. The experimental results on the MMW image dataset demonstrate that the proposed model effectively reduces classification and false-detection losses, achieving mAP@0.5 and mAP@[0.5:0.95] of 98.3% and 81.5%, respectively; these mAP values are 5.1% and 4% higher than those of the baseline model and surpass other detection models.
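The abstract does not describe the internal structure of the ESFF module, so the sketch below is only an illustration of the general idea it names: combining an edge-feature branch with spatial detail features before detection. The class name EdgeSpatialFusion, the use of fixed Sobel kernels as the edge extractor, and the concat-plus-1x1-convolution fusion are all assumptions for illustration, not the paper's actual design.

```python
# Minimal, hypothetical sketch of an edge-spatial fusion block (PyTorch).
# Not the paper's ESFF implementation; layer choices are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F


class EdgeSpatialFusion(nn.Module):
    """Fuse an edge-response branch with the incoming spatial feature map."""

    def __init__(self, channels: int):
        super().__init__()
        self.channels = channels
        # Fixed Sobel kernels, applied depthwise, stand in for an edge extractor.
        sobel_x = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]])
        sobel_y = sobel_x.t()
        kernel = torch.stack([sobel_x, sobel_y]).unsqueeze(1)          # (2, 1, 3, 3)
        self.register_buffer("sobel", kernel.repeat(channels, 1, 1, 1))  # (2C, 1, 3, 3)
        # Simple fusion: concatenate spatial + edge features, mix with a 1x1 conv.
        self.fuse = nn.Sequential(
            nn.Conv2d(channels * 3, channels, kernel_size=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.SiLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Depthwise Sobel filtering: x/y gradient responses per input channel.
        edges = F.conv2d(x, self.sobel, padding=1, groups=self.channels)
        return self.fuse(torch.cat([x, edges], dim=1))


if __name__ == "__main__":
    block = EdgeSpatialFusion(channels=256)
    out = block(torch.randn(1, 256, 40, 40))
    print(out.shape)  # torch.Size([1, 256, 40, 40]) -- same shape as the input
```

Because the block preserves the input's channel count and spatial size, a module of this general shape could be dropped into a YOLOv8-style neck without altering the surrounding layers; how the paper actually wires ESFF into the network is not specified in the abstract.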
Journal Description:
The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing, and applications of devices for sensing and transducing physical, chemical, and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensor-actuators. IEEE Sensors Journal deals with the following:
-Sensor Phenomenology, Modelling, and Evaluation
-Sensor Materials, Processing, and Fabrication
-Chemical and Gas Sensors
-Microfluidics and Biosensors
-Optical Sensors
-Physical Sensors: Temperature, Mechanical, Magnetic, and others
-Acoustic and Ultrasonic Sensors
-Sensor Packaging
-Sensor Networks
-Sensor Applications
-Sensor Systems: Signals, Processing, and Interfaces
-Actuators and Sensor Power Systems
-Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
-Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion; processing of wave (e.g., electromagnetic and acoustic) and non-wave (e.g., chemical, gravity, particle, thermal, radiative and non-radiative) sensor data; detection, estimation, and classification based on sensor data)
-Sensors in Industrial Practice