Incorporate Maximum Mean Discrepancy in Recurrent Latent Space for Sequential Generative Model
Yuchi Zhang, Yongliang Wang, Yang Dong
ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), June 6, 2021
DOI: 10.1109/ICASSP39728.2021.9414580
Cited by: 0
Abstract
Stochastic recurrent neural networks have shown promising performance for modeling complex sequences. Nonetheless, existing methods adopt the KL divergence as the distribution regularizer in their latent spaces, which limits the choice of models for constructing the latent distribution. In this paper, we incorporate maximum mean discrepancy into the recurrent structure for distribution regularization. Maximum mean discrepancy measures the difference between two distributions using only samples drawn from them, which enables us to construct more complicated latent distributions with neural networks. Our proposed algorithm can therefore model more complex sequences. Experiments conducted on two different sequential modeling tasks show that our method outperforms state-of-the-art sequential modeling algorithms.