As a common and significant physical phenomenon in the global ocean system, ocean fronts have profound impacts on marine environments, ecosystems, and even global climate. Traditional methods for detecting ocean fronts typically apply gradient thresholds to classify image pixels; when background noise is high, spurious noise gradients can produce ambiguous detection results. Widely used deep learning methods, in turn, suffer from a lack of interpretability and insufficient multi-scale feature fusion during ocean front detection. To address these problems, this paper proposes a Dynamic Gradient Orientation and Multi-scale Fusion Network, which integrates physical priors with deep learning techniques to achieve higher precision in ocean front detection. Using 30 years (1993–2022) of high-resolution sea surface temperature data for the Kuroshio region of the Northwest Pacific, we constructed a dynamic gradient orientation angle constraint mechanism (DACM) and a multi-scale gradient fusion mechanism (MSGF). To further enhance the model's interpretability, we improved the detection framework based on you only look once version 11 (YOLOv11), introducing a cross-scale Transformer, dynamic snake convolution, and scale-aware feature fusion modules, making it suitable for ocean front detection. The experimental results show that our method achieved an accuracy and precision of 84.1% and 79%, respectively, on the testing set. The ablation experiment verified that the multi-scale fusion mechanism increased the recall rate for weak fronts by 20%. These results provide a feasible scheme for the deep integration of physics-based and data-driven ocean front detection, with application value for the analysis of dynamic ocean processes and climate change research.
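To make the baseline concrete, the traditional gradient-threshold approach mentioned above can be sketched as follows. This is a generic illustration, not the paper's proposed network: the function name, the toy sea surface temperature (SST) field, and the threshold value of 1.0 °C per grid cell are all hypothetical choices for demonstration.

```python
import numpy as np

def detect_fronts_by_gradient(sst, threshold):
    """Flag front-candidate pixels where the SST gradient magnitude
    exceeds a fixed threshold (the classical baseline approach)."""
    # Central-difference gradients along the two grid axes.
    gy, gx = np.gradient(sst)
    magnitude = np.hypot(gx, gy)
    # Pixels with strong temperature gradients are labelled as fronts;
    # with a noisy field, noise gradients can also exceed the threshold,
    # which is the ambiguity the abstract points out.
    return magnitude > threshold

# Toy SST field: a sharp north-south temperature step (an idealized front).
sst = np.concatenate([np.full((4, 8), 18.0), np.full((4, 8), 24.0)], axis=0)
mask = detect_fronts_by_gradient(sst, threshold=1.0)
```

Here only the two rows straddling the temperature step are flagged; in real imagery the threshold must be tuned against the noise level, which motivates the learned, multi-scale approach proposed in the paper.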
