High-quality one-shot interactive segmentation for remote sensing images via hybrid adapter-enhanced foundation models

Zhili Zhang, Xiangyun Hu, Yue Yang, Bingnan Yang, Kai Deng, Hengming Dai, Mi Zhang
DOI: 10.1016/j.jag.2025.104466
Journal: International journal of applied earth observation and geoinformation : ITC journal, Volume 139, Article 104466
Impact Factor: 8.6 (JCR Q1, Remote Sensing)
Published: 2025-03-22 (Journal Article)
Citations: 0
Full text: https://www.sciencedirect.com/science/article/pii/S156984322500113X
Code: https://github.com/zhilyzhang/OSISeg

Abstract

Interactive segmentation of remote sensing images enables the rapid generation of annotated samples, providing training data for deep learning algorithms and facilitating high-quality extraction and classification of remote sensing objects. However, existing interactive segmentation methods, such as SAM, are primarily designed for natural images and are inefficient when applied to remote sensing images. These methods often require multiple interactions to achieve satisfactory labeling results and frequently struggle to obtain precise target boundaries. To address these limitations, we propose a high-quality one-shot interactive segmentation method (OSISeg) based on the fine-tuning of foundation models, tailored for the efficient annotation of typical objects in remote sensing imagery. OSISeg utilizes robust visual priors from foundation models and implements a hybrid adapter-based strategy for fine-tuning them. Specifically, it employs a parallel structure with hybrid adapter designs to adjust the multi-head self-attention and feed-forward networks within foundation models, effectively aligning remote sensing image features for interactive segmentation tasks. Furthermore, OSISeg integrates point, box, and scribble prompts, enabling high-quality segmentation from a single prompt through a lightweight decoder. Experimental results on multiple datasets covering buildings, water bodies, and woodlands demonstrate that our method outperforms existing fine-tuning methods and significantly enhances the quality of one-shot interactive segmentation for typical remote sensing objects. This study highlights the potential of the proposed OSISeg to significantly accelerate sample annotation in remote sensing image labeling tasks, establishing it as a valuable tool for sample labeling in the field of remote sensing. Code is available at https://github.com/zhilyzhang/OSISeg.
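The abstract does not detail the adapter architecture, but the "parallel structure" it describes (adapters adjusting the frozen multi-head self-attention and feed-forward sub-layers) can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the authors' implementation: the `block` function uses simple linear stand-ins for MHSA and the FFN, omits layer norms, and all weight names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, r = 8, 16, 4  # tokens, feature dim, adapter bottleneck dim

def adapter(x, w_down, w_up):
    """Bottleneck adapter: down-project, nonlinearity, up-project."""
    return np.maximum(x @ w_down, 0.0) @ w_up

def block(x, w_attn, w_ffn, ad_attn, ad_ffn):
    """One encoder block with adapters placed in PARALLEL to the frozen
    sub-layers: the adapter output is added into the same residual sum as
    the attention / FFN output (linear stand-ins for MHSA and FFN,
    layer norms omitted for brevity)."""
    x = x + x @ w_attn + adapter(x, *ad_attn)                  # attention + adapter
    x = x + np.maximum(x @ w_ffn, 0.0) + adapter(x, *ad_ffn)   # FFN + adapter
    return x

x = rng.standard_normal((n, d))
w_attn = 0.1 * rng.standard_normal((d, d))   # stand-in for frozen MHSA weights
w_ffn = 0.1 * rng.standard_normal((d, d))    # stand-in for frozen FFN weights
w_down = 0.1 * rng.standard_normal((d, r))
w_up = np.zeros((r, d))  # zero-init up-projection: adapter starts as a no-op

y = block(x, w_attn, w_ffn, (w_down, w_up), (w_down, w_up))
```

With the up-projection initialized to zero, the adapter branch contributes nothing at the start of fine-tuning, so the block initially reproduces the frozen model's behavior exactly; only the small `w_down`/`w_up` matrices need gradients, which is the efficiency argument for adapter-based fine-tuning.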
Journal
International journal of applied earth observation and geoinformation : ITC journal
Subject areas: Global and Planetary Change; Management, Monitoring, Policy and Law; Earth-Surface Processes; Computers in Earth Sciences
CiteScore: 12.00
Self-citation rate: 0.00%
Review time: 77 days
Aims and scope: The International Journal of Applied Earth Observation and Geoinformation publishes original papers that utilize earth observation data for natural resource and environmental inventory and management. These data primarily originate from remote sensing platforms, including satellites and aircraft, supplemented by surface and subsurface measurements. Addressing natural resources such as forests, agricultural land, soils, and water, as well as environmental concerns like biodiversity, land degradation, and hazards, the journal explores conceptual and data-driven approaches. It covers geoinformation themes like capturing, databasing, visualization, interpretation, data quality, and spatial uncertainty.
Latest articles in this journal:
- A new sparse reconstruction framework integrating parallel photography with laser-coplanar directional control for constrained underground spaces
- Disturbance history and canopy structure drive growing season albedo dynamics in boreal forests
- Seeing through the noise: A cross-modal guided framework for hyperspectral image classification under multi-type degradations
- EL-Mamba: An edge-aware and locally-aggregated Mamba network for building height estimation using Sentinel-1 and Sentinel-2 imagery
- Machine learning-based high resolution spatial economic modeling of biomass energy potential in Southeast Asia