DiDA: Iterative Boosting of Disentangled Synthesis and Domain Adaptation

Jinming Cao, Oren Katzir, Peng Jiang, D. Lischinski, D. Cohen-Or, Changhe Tu, Yangyan Li

2021 11th International Conference on Information Technology in Medicine and Education (ITME), pp. 201-208, November 2021. DOI: 10.1109/ITME53901.2021.00049
Citations: 1
Abstract
Unsupervised domain adaptation aims to learn a shared model for two related domains by transferring supervision from a labeled source domain to an unlabeled target domain. A number of effective domain adaptation approaches rely on the ability to extract domain-invariant latent factors that are common to both domains. Extracting latent commonality is also useful for disentanglement analysis: it enables separating the common features from the domain-specific features of both domains, which can then be recombined for synthesis. In this paper, we propose a strategy that iteratively boosts the performance of domain adaptation and disentangled synthesis. The key idea is that by learning to separately extract both the common and the domain-specific features, one can synthesize additional labeled target-domain data, thereby boosting domain adaptation performance. Better common-feature extraction, in turn, further improves the feature disentanglement and the subsequent disentangled synthesis. We show that iterating between domain adaptation and disentangled synthesis consistently improves both on several unsupervised domain adaptation benchmark datasets and tasks, under various domain adaptation backbone models.
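The synthesis step described above can be sketched in a few lines. This is a minimal, hypothetical illustration (not the paper's actual architecture): learned encoders and a decoder are stood in for by fixed random linear maps, and a source sample's common (content) features are recombined with a target sample's domain-specific (style) features, so the synthetic sample inherits the source label.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear maps standing in for learned networks.
W_common = rng.normal(size=(8, 4))    # extracts domain-invariant factors
W_specific = rng.normal(size=(8, 4))  # extracts domain-specific factors
W_decode = rng.normal(size=(8, 8))    # synthesizes a sample from [common, specific]

def encode(x):
    """Split a sample into common and domain-specific features."""
    return x @ W_common, x @ W_specific

def decode(common, specific):
    """Recombine disentangled features into a synthetic sample."""
    return np.concatenate([common, specific], axis=-1) @ W_decode

# One labeled source sample and one unlabeled target sample.
x_src, y_src = rng.normal(size=(1, 8)), 3
x_tgt = rng.normal(size=(1, 8))

c_src, _ = encode(x_src)   # keep the source's common (label-carrying) factors
_, s_tgt = encode(x_tgt)   # keep the target's domain-specific factors

# Synthetic target-domain sample that inherits the source label; such
# samples enlarge the supervised training set for the target domain.
x_synth = decode(c_src, s_tgt)
y_synth = y_src
```

In the iterative scheme, the adaptation model is then retrained on the enlarged labeled set, which in turn sharpens the common-feature extractor used for the next round of disentanglement and synthesis.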