{"title":"tBART:基于主题建模和BART结合的抽象摘要","authors":"Binh Dang, Dinh-Truong Do, Le-Minh Nguyen","doi":"10.1109/KSE56063.2022.9953613","DOIUrl":null,"url":null,"abstract":"Topic information has been helpful to direct semantics in text summarization. In this paper, we present a study on a novel and efficient method to incorporate the topic information with the BART model for abstractive summarization, called the tBART. The proposed model inherits the advantages of the BART, learns latent topics, and transfers the topic vector of tokens to context space by an align function. The experimental results illustrate the effectiveness of our proposed method, which significantly outperforms previous methods on two benchmark datasets: XSUM and CNN/DAILY MAIL.","PeriodicalId":330865,"journal":{"name":"2022 14th International Conference on Knowledge and Systems Engineering (KSE)","volume":"78 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"tBART: Abstractive summarization based on the joining of Topic modeling and BART\",\"authors\":\"Binh Dang, Dinh-Truong Do, Le-Minh Nguyen\",\"doi\":\"10.1109/KSE56063.2022.9953613\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Topic information has been helpful to direct semantics in text summarization. In this paper, we present a study on a novel and efficient method to incorporate the topic information with the BART model for abstractive summarization, called the tBART. The proposed model inherits the advantages of the BART, learns latent topics, and transfers the topic vector of tokens to context space by an align function. 
The experimental results illustrate the effectiveness of our proposed method, which significantly outperforms previous methods on two benchmark datasets: XSUM and CNN/DAILY MAIL.\",\"PeriodicalId\":330865,\"journal\":{\"name\":\"2022 14th International Conference on Knowledge and Systems Engineering (KSE)\",\"volume\":\"78 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-10-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 14th International Conference on Knowledge and Systems Engineering (KSE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/KSE56063.2022.9953613\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 14th International Conference on Knowledge and Systems Engineering (KSE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/KSE56063.2022.9953613","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
tBART: Abstractive summarization based on the joining of Topic modeling and BART
Topic information has proven helpful for guiding semantics in text summarization. In this paper, we present a novel and efficient method, called tBART, that incorporates topic information into the BART model for abstractive summarization. The proposed model inherits the advantages of BART, learns latent topics, and maps the topic vectors of tokens into the context space via an alignment function. The experimental results illustrate the effectiveness of our proposed method, which significantly outperforms previous methods on two benchmark datasets: XSUM and CNN/DAILY MAIL.
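The abstract does not specify how the alignment function is implemented; as a rough illustrative sketch only (the matrix name, dimensions, and additive fusion are assumptions, not the authors' design), projecting a token's latent topic vector into the encoder's context space and fusing it with the contextual embedding could look like:

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_TOPICS = 50    # size of the latent topic vector (assumed)
HIDDEN_DIM = 768   # BART-base hidden size

# Hypothetical learned alignment matrix: topic space -> context space.
W_align = rng.normal(scale=0.02, size=(NUM_TOPICS, HIDDEN_DIM))

def align(topic_vec: np.ndarray) -> np.ndarray:
    """Map a token's topic distribution into the model's hidden space."""
    return topic_vec @ W_align

def fuse(context_emb: np.ndarray, topic_vec: np.ndarray) -> np.ndarray:
    """Combine the aligned topic signal with the contextual embedding
    (simple additive fusion, assumed here for illustration)."""
    return context_emb + align(topic_vec)

# One token: a normalized topic distribution and a context embedding.
topic_vec = rng.random(NUM_TOPICS)
topic_vec /= topic_vec.sum()
context_emb = rng.normal(size=HIDDEN_DIM)

fused = fuse(context_emb, topic_vec)
print(fused.shape)  # (768,)
```

The fused representation has the same dimensionality as the original contextual embedding, so it can be consumed by the downstream transformer layers without architectural changes.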