{"title":"针对领域适应的联合边际和中心样本学习","authors":"Shaohua Teng, Wenjie Liu, Luyao Teng, Zefeng Zheng, Wei Zhang","doi":"10.1007/s11280-024-01290-3","DOIUrl":null,"url":null,"abstract":"<p>Domain adaptation aims to alleviate the impact of distribution differences when migrating knowledge from the source domain to the target domain. However, two issues remain to be addressed. One is the difficulty of learning both marginal and specific knowledge at the same time. The other is the low quality of pseudo labels in target domain can constrain the performance improvement during model iteration. To solve the above problems, we propose a domain adaptation method called Joint Marginal and Central Sample Learning (JMCSL). This method consists of three parts which are marginal sample learning (MSL), central sample learning (CSL) and unified strategy for multi-classifier (USMC). MSL and CSL aim to better learning of common and specific knowledge. USMC improves the accuracy and stability of pseudo labels in the target domain. Specifically, MSL learns specific knowledge from a novel triple distance, which is defined by sample pair and their class center. CSL uses the closest class center and the second closest class center of samples to retain the common knowledge. USMC selects label consistent samples by applying K-Nearest Neighbors (KNN) and Structural Risk Minimization (SRM), while it utilizes the class centers of both two domains for classification. 
Finally, extensive experiments on four visual datasets demonstrate that JMCSL is superior to other competing methods.</p>","PeriodicalId":501180,"journal":{"name":"World Wide Web","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-08-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Joint marginal and central sample learning for domain adaptation\",\"authors\":\"Shaohua Teng, Wenjie Liu, Luyao Teng, Zefeng Zheng, Wei Zhang\",\"doi\":\"10.1007/s11280-024-01290-3\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Domain adaptation aims to alleviate the impact of distribution differences when migrating knowledge from the source domain to the target domain. However, two issues remain to be addressed. One is the difficulty of learning both marginal and specific knowledge at the same time. The other is the low quality of pseudo labels in target domain can constrain the performance improvement during model iteration. To solve the above problems, we propose a domain adaptation method called Joint Marginal and Central Sample Learning (JMCSL). This method consists of three parts which are marginal sample learning (MSL), central sample learning (CSL) and unified strategy for multi-classifier (USMC). MSL and CSL aim to better learning of common and specific knowledge. USMC improves the accuracy and stability of pseudo labels in the target domain. Specifically, MSL learns specific knowledge from a novel triple distance, which is defined by sample pair and their class center. CSL uses the closest class center and the second closest class center of samples to retain the common knowledge. USMC selects label consistent samples by applying K-Nearest Neighbors (KNN) and Structural Risk Minimization (SRM), while it utilizes the class centers of both two domains for classification. 
Finally, extensive experiments on four visual datasets demonstrate that JMCSL is superior to other competing methods.</p>\",\"PeriodicalId\":501180,\"journal\":{\"name\":\"World Wide Web\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-08-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"World Wide Web\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1007/s11280-024-01290-3\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"World Wide Web","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1007/s11280-024-01290-3","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
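The label-consistency idea behind USMC can be illustrated with a minimal sketch: pseudo-label target samples with a KNN classifier and with a second, SRM-style classifier, and keep only the samples on which the two agree. This is an assumption-laden illustration, not the paper's actual formulation; the function names and the stand-in second classifier are hypothetical.

```python
# Hypothetical sketch of USMC-style pseudo-label selection: keep target samples
# whose labels agree between a KNN vote over source samples and a second
# classifier (in the paper, an SRM-based one). Names are illustrative only.
import numpy as np

def knn_predict(Xs, ys, Xt, k=3):
    """Label each target sample by majority vote of its k nearest source samples."""
    preds = []
    for x in Xt:
        d = np.linalg.norm(Xs - x, axis=1)        # Euclidean distances to sources
        nn_labels = ys[np.argsort(d)[:k]]          # labels of the k nearest sources
        vals, counts = np.unique(nn_labels, return_counts=True)
        preds.append(vals[np.argmax(counts)])      # majority vote
    return np.array(preds)

def select_consistent(Xs, ys, Xt, clf_predict, k=3):
    """Return indices and labels of target samples where both classifiers agree."""
    p_knn = knn_predict(Xs, ys, Xt, k)
    p_clf = clf_predict(Xt)                        # second classifier's predictions
    mask = p_knn == p_clf
    return np.where(mask)[0], p_knn[mask]

# Tiny example: two source classes, two target points; the stand-in second
# classifier disagrees on the second point, so only the first is selected.
Xs = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.0, 6.0]])
ys = np.array([0, 0, 1, 1])
Xt = np.array([[0.0, 0.5], [5.0, 5.5]])
idx, labels = select_consistent(Xs, ys, Xt, lambda X: np.array([0, 0]), k=3)
```

Only the agreed-upon samples would then feed the next training iteration, which is how such a scheme limits the damage from noisy pseudo labels.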
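The class-center machinery the abstract describes (CSL's closest and second-closest centers, and center-based classification) can be sketched as follows. This is a generic illustration under assumed definitions, not the paper's exact objective; the helper names are hypothetical.

```python
# Illustrative sketch of class-center computation and of finding each sample's
# closest and second-closest class center, as CSL is described in the abstract.
# Not the paper's exact formulation.
import numpy as np

def class_centers(X, y):
    """Mean feature vector per class; returns (sorted class ids, centers)."""
    classes = np.unique(y)
    centers = np.stack([X[y == c].mean(axis=0) for c in classes])
    return classes, centers

def two_nearest_centers(X, centers):
    """For each row of X, indices of the closest and second-closest center."""
    # Pairwise distances: shape (n_samples, n_centers).
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    order = np.argsort(d, axis=1)
    return order[:, 0], order[:, 1]

# Tiny example: two well-separated classes.
X = np.array([[0.0, 0.0], [0.0, 2.0], [10.0, 0.0], [10.0, 2.0]])
y = np.array([0, 0, 1, 1])
classes, centers = class_centers(X, y)             # centers: [0,1] and [10,1]
queries = np.array([[1.0, 1.0], [9.0, 1.0]])
nearest, second = two_nearest_centers(queries, centers)
```

A nearest-center classifier simply assigns each sample the class of `nearest`; the gap between the closest and second-closest center is one natural way to gauge how confidently a sample sits inside its class.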