Drones and other flying objects can be regarded as small targets when viewed from a long distance. Considering the occlusion and interference caused by the external environment, infrared detection methods are adopted to help identify and manage small aerial targets. However, long-range infrared imaging often loses small-target feature detail, and general methods have low detection efficiency and struggle to deeply extract target features. To address these problems, we propose an attentional dual-stream interactive perception network (ADIPNet) in this paper. Built on a dual-stream U-Net, ADIPNet mainly combines a multi-patch series-parallel attention module (MSPA), an edge anchoring module with regret (EAR), a context scene perception module (CSP), and a dual-stream interaction fusion module (DSIF). MSPA constructs the weights of patch regions at multiple scales and then performs nested self-attention so as to fully mine global target information. EAR unites two types of global features through local mapping and matrix products, which helps accurately capture small-target edges. CSP exchanges context information multiple times and mutually complements semantic scenarios to enhance the perception of small-target features. Finally, DSIF applies cross attention to the high-level encoded features of the two U-Nets, further improving the network's understanding of complex scene information. The proposed ADIPNet alleviates the insufficient feature extraction of infrared small targets. Compared with other state-of-the-art methods, it reaches an mIoU of 80.52% and 72.54%, respectively, on two large infrared datasets. It achieves more accurate detection of small aerial targets at low operating cost, showing potential for application in various infrared surveillance systems.
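The cross-attention fusion that DSIF performs between the two encoder streams can be sketched as follows. This is a minimal single-head illustration in NumPy, not the paper's implementation: the feature shapes, projection weights, and bidirectional query/key-value pairing are assumptions for demonstration only.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(stream_q, stream_kv, w_q, w_k, w_v):
    """One cross-attention pass: queries come from one stream,
    keys and values come from the other stream."""
    q = stream_q @ w_q
    k = stream_kv @ w_k
    v = stream_kv @ w_v
    scores = (q @ k.T) / np.sqrt(q.shape[-1])   # scaled dot-product
    return softmax(scores, axis=-1) @ v

rng = np.random.default_rng(0)
d = 8                                    # toy channel dimension
a = rng.standard_normal((16, d))         # flattened high-level features, stream A
b = rng.standard_normal((16, d))         # flattened high-level features, stream B
w_q, w_k, w_v = (rng.standard_normal((d, d)) for _ in range(3))

fused_a = cross_attention(a, b, w_q, w_k, w_v)  # stream A attends to stream B
fused_b = cross_attention(b, a, w_q, w_k, w_v)  # stream B attends to stream A
print(fused_a.shape, fused_b.shape)
```

Each stream queries the other's encoded features, so information from both U-Net branches is mixed before decoding; a real implementation would use learned multi-head projections and a residual connection.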
