Diffusion model conditioning on Gaussian mixture model and negative Gaussian mixture gradient

Weiguo Lu, Xuan Wu, Deng Ding, Jinqiao Duan, Jirong Zhuang, Gangnan Yuan

Neurocomputing (published 2024-10-28). DOI: 10.1016/j.neucom.2024.128764
https://www.sciencedirect.com/science/article/pii/S0925231224015352
Abstract
Diffusion models (DMs) are a class of generative model that has had a significant impact on image synthesis and beyond. They can incorporate a wide variety of conditioning inputs, such as text or bounding boxes, to guide generation. In this work, we introduce a novel conditioning mechanism that applies Gaussian mixture models (GMMs) for feature conditioning, which helps steer the denoising process in DMs. Drawing on set theory, our theoretical analysis reveals that the conditional latent distribution based on features differs markedly from that based on classes; as a consequence, feature-based conditioning tends to produce fewer generation defects than class-based conditioning. We design and carry out experiments whose results support our theoretical findings and confirm the effectiveness of the proposed feature conditioning mechanism. Additionally, we propose a new gradient function, the Negative Gaussian Mixture Gradient (NGMG), and incorporate it into diffusion-model training alongside an auxiliary classifier. We show theoretically that NGMG offers advantages comparable to the Wasserstein distance, serving as a more effective cost function when learning distributions supported on low-dimensional manifolds, in contrast to many likelihood-based cost functions such as the KL divergence.
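The abstract does not spell out how GMM feature conditioning enters the sampler. As a rough, minimal sketch of what the idea might look like in code, the snippet below samples a feature vector from a Gaussian mixture and feeds it as the conditioning input to a standard DDPM-style reverse step. All names here (FeatureGMM, denoise_step, eps_model) are hypothetical illustrations, not the authors' implementation.

```python
# Minimal sketch (assumed, not from the paper): GMM-sampled feature vectors
# as the conditioning signal for one DDPM-style ancestral sampling step.
import torch
from torch.distributions import Categorical, MultivariateNormal, MixtureSameFamily

class FeatureGMM:
    """Gaussian mixture over feature vectors used as the conditioning signal."""
    def __init__(self, weights, means, covs):
        mix = Categorical(probs=weights)                          # mixture weights pi_k
        comp = MultivariateNormal(means, covariance_matrix=covs)  # components N(mu_k, Sigma_k)
        self.gmm = MixtureSameFamily(mix, comp)

    def sample(self, n):
        return self.gmm.sample((n,))                              # feature vectors c ~ GMM

@torch.no_grad()
def denoise_step(eps_model, x_t, t, cond, alpha_t, alpha_bar_t, sigma_t):
    """One reverse-diffusion step; `cond` carries the GMM feature vector."""
    eps = eps_model(x_t, t, cond)                                 # noise prediction given the condition
    mean = (x_t - (1 - alpha_t) / (1 - alpha_bar_t) ** 0.5 * eps) / alpha_t ** 0.5
    return mean + sigma_t * torch.randn_like(x_t)
```

In this reading, the feature vector plays the role that a class label plays in conventional conditional diffusion pipelines, but it lives in a continuous feature space described by the mixture rather than a discrete label set.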
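The claimed advantage over likelihood-based costs echoes a standard observation about distributions supported on low-dimensional manifolds: when two distributions have disjoint supports, the KL divergence is infinite and therefore gives no usable training signal, while the Wasserstein distance still varies smoothly with the separation. A small numeric check of this textbook example (not taken from the paper) is below.

```python
# Standard illustration (not from the paper): two point masses at 0 and theta
# have disjoint supports, so KL(P || Q) is infinite for every theta != 0,
# whereas the 1-Wasserstein distance equals |theta| and shrinks smoothly.
import numpy as np
from scipy.stats import wasserstein_distance

for theta in [2.0, 1.0, 0.5, 0.1]:
    p = np.zeros(1_000)               # samples from P = delta_0
    q = np.full(1_000, theta)         # samples from Q = delta_theta
    w1 = wasserstein_distance(p, q)   # equals |theta| exactly here
    print(f"theta={theta:4.1f}  W1={w1:.3f}  KL=inf (disjoint supports)")
```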
Journal overview
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. The journal covers neurocomputing theory, practice, and applications.