Balanced Joint Maximum Mean Discrepancy for Deep Transfer Learning

IF 2.0 · CAS Zone 2 (Mathematics) · JCR Q1 (Mathematics) · Analysis and Applications · Pub Date: 2020-05-27 · DOI: 10.1142/s0219530520400035
Chuangji Meng, Cunlu Xu, Qin Lei, W. Su, Jinzhao Wu
{"title":"深度迁移学习的平衡联合最大平均差异","authors":"Chuangji Meng, Cunlu Xu, Qin Lei, W. Su, Jinzhao Wu","doi":"10.1142/s0219530520400035","DOIUrl":null,"url":null,"abstract":"Recent studies have revealed that deep networks can learn transferable features that generalize well to novel tasks with little or unavailable labeled data for domain adaptation. However, justifying which components of the feature representations can reason about original joint distributions using JMMD within the regime of deep architecture remains unclear. We present a new backpropagation algorithm for JMMD called the Balanced Joint Maximum Mean Discrepancy (B-JMMD) to further reduce the domain discrepancy. B-JMMD achieves the effect of balanced distribution adaptation for deep network architecture, and can be treated as an improved version of JMMD’s backpropagation algorithm. The proposed method leverages the importance of marginal and conditional distributions behind multiple domain-specific layers across domains adaptively to get a good match for the joint distributions in a second-order reproducing kernel Hilbert space. The learning of the proposed method can be performed technically by a special form of stochastic gradient descent, in which the gradient is computed by backpropagation with a strategy of balanced distribution adaptation. Theoretical analysis shows that the proposed B-JMMD is superior to JMMD method. Experiments confirm that our method yields state-of-the-art results with standard datasets.","PeriodicalId":55519,"journal":{"name":"Analysis and Applications","volume":" ","pages":""},"PeriodicalIF":2.0000,"publicationDate":"2020-05-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1142/s0219530520400035","citationCount":"3","resultStr":"{\"title\":\"Balanced joint maximum mean discrepancy for deep transfer learning\",\"authors\":\"Chuangji Meng, Cunlu Xu, Qin Lei, W. Su, Jinzhao Wu\",\"doi\":\"10.1142/s0219530520400035\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Recent studies have revealed that deep networks can learn transferable features that generalize well to novel tasks with little or unavailable labeled data for domain adaptation. However, justifying which components of the feature representations can reason about original joint distributions using JMMD within the regime of deep architecture remains unclear. We present a new backpropagation algorithm for JMMD called the Balanced Joint Maximum Mean Discrepancy (B-JMMD) to further reduce the domain discrepancy. B-JMMD achieves the effect of balanced distribution adaptation for deep network architecture, and can be treated as an improved version of JMMD’s backpropagation algorithm. The proposed method leverages the importance of marginal and conditional distributions behind multiple domain-specific layers across domains adaptively to get a good match for the joint distributions in a second-order reproducing kernel Hilbert space. The learning of the proposed method can be performed technically by a special form of stochastic gradient descent, in which the gradient is computed by backpropagation with a strategy of balanced distribution adaptation. Theoretical analysis shows that the proposed B-JMMD is superior to JMMD method. 
Experiments confirm that our method yields state-of-the-art results with standard datasets.\",\"PeriodicalId\":55519,\"journal\":{\"name\":\"Analysis and Applications\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":2.0000,\"publicationDate\":\"2020-05-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.1142/s0219530520400035\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Analysis and Applications\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1142/s0219530520400035\",\"RegionNum\":2,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Analysis and Applications","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1142/s0219530520400035","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS","Score":null,"Total":0}
Citations: 3

Abstract

Recent studies have revealed that deep networks can learn transferable features that generalize well to novel tasks even when labeled data for domain adaptation are scarce or unavailable. However, it remains unclear which components of the feature representations can reason about the original joint distributions when the joint maximum mean discrepancy (JMMD) is used within a deep architecture. We present a new backpropagation algorithm for JMMD, called the Balanced Joint Maximum Mean Discrepancy (B-JMMD), to further reduce the domain discrepancy. B-JMMD achieves balanced distribution adaptation for deep network architectures and can be treated as an improved version of JMMD's backpropagation algorithm. The proposed method adaptively weights the importance of the marginal and conditional distributions behind multiple domain-specific layers across domains to obtain a good match for the joint distributions in a second-order reproducing kernel Hilbert space. Learning is performed by a special form of stochastic gradient descent, in which the gradient is computed by backpropagation with a balanced distribution adaptation strategy. Theoretical analysis shows that the proposed B-JMMD is superior to the JMMD method, and experiments confirm that it yields state-of-the-art results on standard datasets.
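
The abstract does not include code, but the balanced-adaptation idea can be illustrated with a minimal sketch: a balance factor weights a marginal MMD term (computed on whole-domain features) against a class-conditional MMD term (computed per class, using pseudo-labels on the unlabeled target), roughly mu * MMD_marginal + (1 - mu) * MMD_conditional. The NumPy snippet below is an illustrative reading only; the names `gaussian_mmd`, `balanced_mmd`, and the parameter `mu` are assumptions and not the authors' B-JMMD implementation, which operates jointly on the activations of multiple domain-specific layers inside a deep network.

```python
# Minimal sketch (not the authors' code) of a "balanced" MMD-style objective:
# a balance factor mu mixes a marginal MMD term with a class-conditional one.
import numpy as np


def gaussian_mmd(x, y, sigma=1.0):
    """Biased estimate of squared MMD between samples x (n, d) and y (m, d)
    under a Gaussian (RBF) kernel."""
    def kernel(a, b):
        # Pairwise squared Euclidean distances -> RBF kernel matrix.
        d2 = (np.sum(a**2, axis=1)[:, None]
              + np.sum(b**2, axis=1)[None, :]
              - 2.0 * a @ b.T)
        return np.exp(-d2 / (2.0 * sigma**2))
    return kernel(x, x).mean() + kernel(y, y).mean() - 2.0 * kernel(x, y).mean()


def balanced_mmd(src_feat, src_labels, tgt_feat, tgt_pseudo, mu=0.5, sigma=1.0):
    """mu in [0, 1] weights the marginal (whole-domain) term; (1 - mu) weights
    the conditional (per-class) term, using pseudo-labels for the target."""
    marginal = gaussian_mmd(src_feat, tgt_feat, sigma)
    shared = np.intersect1d(src_labels, tgt_pseudo)  # classes present in both domains
    conditional = (
        np.mean([gaussian_mmd(src_feat[src_labels == c],
                              tgt_feat[tgt_pseudo == c], sigma) for c in shared])
        if shared.size else 0.0
    )
    return mu * marginal + (1.0 - mu) * conditional


# Toy usage: random features standing in for deep-network activations.
rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(64, 16))   # source-domain features
tgt = rng.normal(0.5, 1.0, size=(64, 16))   # shifted target-domain features
ys = rng.integers(0, 3, size=64)            # source labels
yt = rng.integers(0, 3, size=64)            # target pseudo-labels
print(balanced_mmd(src, ys, tgt, yt, mu=0.5))
```

In the actual B-JMMD the discrepancy is evaluated jointly over several domain-specific layers and backpropagated through the network, with the balance adapted during training; the sketch only shows a single-layer, fixed-mu case.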
Source journal: Analysis and Applications
CiteScore: 3.90
Self-citation rate: 4.50%
Articles published: 29
Review time: >12 weeks
Journal description: Analysis and Applications publishes high-quality mathematical papers that treat those parts of analysis which have direct or potential applications to the physical and biological sciences and engineering. Topics from analysis include approximation theory, asymptotic analysis, calculus of variations, integral equations, integral transforms, ordinary and partial differential equations, delay differential equations, and perturbation methods. The primary aim of the journal is to encourage the development of new techniques and results in applied analysis.
Latest articles in this journal:
On the strong solution for a diffuse interface model of non-Newtonian two-phase flows
Distributed SGD in Overparameterized Linear Regression
Interpolatory Taylor and Lidstone series
Author index Volume 21 (2023)
Convergence Analysis of Deep Residual Networks