Adaptive Discrepancy Masked Distillation for remote sensing object detection

IF 10.6 · Q1 (Geography, Physical) · CAS Tier 1 (Earth Science) · ISPRS Journal of Photogrammetry and Remote Sensing · Pub Date: 2025-02-26 · DOI: 10.1016/j.isprsjprs.2025.02.006
Cong Li, Gong Cheng, Junwei Han
{"title":"Adaptive Discrepancy Masked Distillation for remote sensing object detection","authors":"Cong Li,&nbsp;Gong Cheng,&nbsp;Junwei Han","doi":"10.1016/j.isprsjprs.2025.02.006","DOIUrl":null,"url":null,"abstract":"<div><div>Knowledge distillation (KD) has become a promising technique for obtaining a performant student detector in remote sensing images by inheriting the knowledge from a heavy teacher detector. Unfortunately, not every pixel contributes (even detrimental) equally to the final KD performance. To dispel this problem, the existing methods usually derived a distillation mask to stress the valuable regions during KD. In this paper, we put forth Adaptive Discrepancy Masked Distillation (ADMD), a novel KD framework to explicitly localize the beneficial pixels. Our approach stems from the observation that the feature discrepancy between the teacher and student is the essential reason for their performance gap. With this regard, we make use of the feature discrepancy to determine which location causes the student to lag behind the teacher and then regulate the student to assign higher learning priority to them. Furthermore, we empirically observe that the discrepancy masked distillation leads to loss vanishing in later KD stages. To combat this issue, we introduce a simple yet practical weight-increasing module, in which the magnitude of KD loss is adaptively adjusted to ensure KD steadily contributes to student optimization. Comprehensive experiments on DIOR and DOTA across various dense detectors show that our ADMD consistently harvests remarkable performance gains, particularly under a prolonged distillation schedule, and exhibits superiority over state-of-the-art counterparts. Code and trained checkpoints will be made available at <span><span>https://github.com/swift1988</span><svg><path></path></svg></span>.</div></div>","PeriodicalId":50269,"journal":{"name":"ISPRS Journal of Photogrammetry and Remote Sensing","volume":"222 ","pages":"Pages 54-63"},"PeriodicalIF":10.6000,"publicationDate":"2025-02-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ISPRS Journal of Photogrammetry and Remote Sensing","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0924271625000565","RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"GEOGRAPHY, PHYSICAL","Score":null,"Total":0}
Citations: 0

Abstract

Knowledge distillation (KD) has become a promising technique for obtaining a performant student detector for remote sensing images by inheriting knowledge from a heavy teacher detector. Unfortunately, not every pixel contributes equally to the final KD performance, and some are even detrimental. To address this problem, existing methods usually derive a distillation mask to emphasize valuable regions during KD. In this paper, we put forth Adaptive Discrepancy Masked Distillation (ADMD), a novel KD framework that explicitly localizes the beneficial pixels. Our approach stems from the observation that the feature discrepancy between the teacher and student is the essential reason for their performance gap. In this regard, we use the feature discrepancy to determine which locations cause the student to lag behind the teacher, and then regulate the student to assign higher learning priority to those locations. Furthermore, we empirically observe that discrepancy-masked distillation causes the KD loss to vanish in later stages. To combat this issue, we introduce a simple yet practical weight-increasing module, in which the magnitude of the KD loss is adaptively adjusted to ensure that KD steadily contributes to student optimization. Comprehensive experiments on DIOR and DOTA across various dense detectors show that ADMD consistently harvests remarkable performance gains, particularly under a prolonged distillation schedule, and exhibits superiority over state-of-the-art counterparts. Code and trained checkpoints will be made available at https://github.com/swift1988.
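The abstract describes two mechanisms: a mask built from the teacher-student feature discrepancy that concentrates distillation on the pixels where the student lags most, and a weight-increasing module that counteracts the shrinking loss as the discrepancy closes. The PyTorch sketch below is a minimal illustration under stated assumptions, not the authors' implementation: the top-k mask construction, the `keep_ratio` hyper-parameter, and the linear weight schedule are hypothetical fillers, since the abstract does not specify the actual formulas.

```python
# Minimal sketch of discrepancy-masked distillation with an adaptive
# loss-weight schedule. The mask rule, keep_ratio, and the schedule
# are illustrative assumptions; only the overall recipe follows the paper.
import torch


def discrepancy_mask(f_t: torch.Tensor, f_s: torch.Tensor,
                     keep_ratio: float = 0.5) -> torch.Tensor:
    """Mark the spatial locations where the student lags the teacher most.

    f_t, f_s: teacher/student feature maps of shape (B, C, H, W).
    keep_ratio: fraction of pixels kept (hypothetical hyper-parameter).
    """
    # Per-pixel squared feature discrepancy, averaged over channels.
    diff = (f_t - f_s).pow(2).mean(dim=1)                    # (B, H, W)
    # Keep the top-k most discrepant pixels per image.
    k = max(1, int(keep_ratio * diff[0].numel()))
    thresh = diff.flatten(1).topk(k, dim=1).values[:, -1]    # (B,)
    return (diff >= thresh.view(-1, 1, 1)).float()           # binary mask


def admd_loss(f_t: torch.Tensor, f_s: torch.Tensor,
              step: int, total_steps: int,
              base_weight: float = 1.0) -> torch.Tensor:
    """Masked distillation loss whose weight grows over training to
    compensate for the vanishing discrepancy in later KD stages."""
    mask = discrepancy_mask(f_t.detach(), f_s.detach())
    per_pixel = (f_t.detach() - f_s).pow(2).mean(dim=1)      # (B, H, W)
    masked = (per_pixel * mask).sum() / mask.sum().clamp(min=1.0)
    # Hypothetical linear weight-increasing schedule.
    weight = base_weight * (1.0 + step / total_steps)
    return weight * masked
```

In a training loop, `admd_loss` would simply be added to the detector's task losses, with `f_t` taken from the frozen teacher's feature pyramid and `f_s` from the student's; only the student receives gradients.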
Source journal

ISPRS Journal of Photogrammetry and Remote Sensing (Engineering & Technology; Imaging Science & Photographic Technology)
CiteScore: 21.00
Self-citation rate: 6.30%
Articles per year: 273
Review time: 40 days
Journal description: The ISPRS Journal of Photogrammetry and Remote Sensing (P&RS) serves as the official journal of the International Society for Photogrammetry and Remote Sensing (ISPRS). It acts as a platform for scientists and professionals worldwide who work in the various disciplines that use photogrammetry, remote sensing, spatial information systems, computer vision, and related fields. The journal aims to facilitate communication and dissemination of advancements in these disciplines, while also acting as a comprehensive source of reference and archive. P&RS publishes high-quality, peer-reviewed research papers that are preferably original and have not been published before. These papers can cover scientific/research, technological development, or application/practical aspects. Additionally, the journal welcomes papers based on presentations from ISPRS meetings, as long as they are considered significant contributions to the aforementioned fields. In particular, P&RS encourages the submission of papers that are of broad scientific interest, showcase innovative applications (especially in emerging fields), have an interdisciplinary focus, discuss topics that have received limited attention in P&RS or related journals, or explore new directions in scientific or professional realms. It is preferred that theoretical papers include practical applications, while papers focusing on systems and applications should include a theoretical background.
Latest articles in this journal

Efficient metric-resolution land cover mapping using open-access low resolution annotations with prototype learning and modified Segment Anything model
DiffSARShipInst: Diffusion model for ship instance segmentation from synthetic aperture radar imagery
An nD-histogram technique for querying non-uniformly distributed point cloud data
Global patterns and determinants of year-to-year variations in surface urban heat islands
ICESat-2 bathymetry algorithms: A review of the current state-of-the-art and future outlook