Underwater images play a crucial role in underwater exploration and resource development, but their quality often degrades in complex underwater environments. Existing quality-assessment methods are mainly designed for specific scenarios and generalize poorly to diverse underwater conditions, so improving their applicability is essential for accurate quality assessment across scenarios. This paper proposes an Underwater Image Quality Assessment (UIQA) method that combines the strengths of a staircase network and a Transformer, focusing on efficiently capturing and integrating image features at different scales. First, multi-scale feature extraction obtains information from images at multiple levels. A Staircase Feature (SF) module then progressively integrates features from shallow to deep layers, fusing cross-scale information, while a Cross-Scale Transformer (CST) module merges information from multiple scales using self-attention. Concatenating the outputs of the two modules gives the model an understanding of image content at both global and local ranges, and a regression module maps the combined features to quality scores. Finally, meta-learning optimizes the model's learning process, enabling adaptation to new data for accurate quality prediction across diverse scenarios. Experiments on underwater datasets show superior accuracy and stability, additional tests on natural-scene images demonstrate broader applicability, and cross-dataset experiments validate the generalization capability of the proposed method. The source code will be made available at https://github.com/dart-into/UIQA-MSST.
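The pipeline described above (multi-scale features → staircase fusion → cross-scale self-attention → concatenation → regression) can be sketched minimally as follows. This is an illustrative NumPy sketch, not the authors' implementation: the feature dimensions, the averaging used as the staircase fusion operator, the single-head attention, and the linear regression head are all placeholder assumptions, since the abstract does not specify these details.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical multi-scale features: three scale levels, each reduced to a
# 64-d global descriptor (a real backbone would yield spatial feature maps).
feats = [rng.standard_normal(64) for _ in range(3)]  # shallow -> deep

def staircase_fuse(feats):
    """Progressively merge features shallow-to-deep (SF-module-style sketch).

    Averaging is a placeholder for the paper's unspecified fusion operator.
    """
    fused = feats[0]
    for f in feats[1:]:
        fused = 0.5 * (fused + f)
    return fused

def cross_scale_attention(tokens):
    """Single-head self-attention over scale tokens (CST-module-style sketch)."""
    X = np.stack(tokens)                              # (S, C): one token per scale
    scores = X @ X.T / np.sqrt(X.shape[1])            # scaled dot-product scores
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)                 # row-wise softmax
    return (w @ X).mean(axis=0)                       # attended tokens, pooled

sf_out = staircase_fuse(feats)                        # (64,)
cst_out = cross_scale_attention(feats)                # (64,)
combined = np.concatenate([sf_out, cst_out])          # (128,) global + local view

# A single linear layer stands in for the regression module producing a score.
w_reg = rng.standard_normal(combined.shape[0]) / combined.shape[0]
score = float(combined @ w_reg)
print(combined.shape, score)
```

In the actual method the two branches are trained jointly and wrapped in a meta-learning loop for cross-scenario adaptation; the sketch only shows how the two feature streams are combined before regression.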