Pointer over Attention: An Improved Bangla Text Summarization Approach Using Hybrid Pointer Generator Network

Nobel Dhar, Gaurob Saha, Prithwiraj Bhattacharjee, Avi Mallick, Md. Saiful Islam
{"title":"Pointer over Attention: An Improved Bangla Text Summarization Approach Using Hybrid Pointer Generator Network","authors":"Nobel Dhar, Gaurob Saha, Prithwiraj Bhattacharjee, Avi Mallick, Md. Saiful Islam","doi":"10.1109/ICCIT54785.2021.9689852","DOIUrl":null,"url":null,"abstract":"Despite the success of the neural sequence-to-sequence model for abstractive text summarization, it has a few shortcomings, such as repeating inaccurate factual details and tending to repeat themselves. We propose a hybrid pointer generator network to solve the shortcomings of reproducing factual details inadequately and phrase repetition. We augment the attention-based sequence-to-sequence using a hybrid pointer generator network that can generate Out-of-Vocabulary words and enhance accuracy in reproducing authentic details and a coverage mechanism that discourages repetition. It produces a reasonable-sized output text that preserves the conceptual integrity and factual information of the input article. For evaluation, we primarily employed “BANSData”1 - a highly adopted publicly available Bengali dataset. Additionally, we prepared a large-scale dataset called “BANS-133” which consists of 133k Bangla news articles associated with human-generated summaries. Experimenting with the proposed model, we achieved ROUGE-1 and ROUGE-2 scores of 0.66, 0.41 for the BANSData” dataset and 0.67, 0.42 for the BANS-133k” dataset, respectively. We demonstrated that the proposed system surpasses previous state-of-the-art Bengali abstractive summarization techniques and its stability on a larger dataset. “BANS-133” datasets and code-base will be publicly available for research.","PeriodicalId":166450,"journal":{"name":"2021 24th International Conference on Computer and Information Technology (ICCIT)","volume":"282 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-11-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 24th International Conference on Computer and Information Technology (ICCIT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCIT54785.2021.9689852","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4

Abstract

Despite the success of neural sequence-to-sequence models for abstractive text summarization, they have a few shortcomings, such as reproducing factual details inaccurately and tending to repeat themselves. We propose a hybrid pointer generator network to address these shortcomings of inadequate factual reproduction and phrase repetition. We augment the attention-based sequence-to-sequence model with a hybrid pointer generator network that can generate out-of-vocabulary words and improves accuracy in reproducing authentic details, and with a coverage mechanism that discourages repetition. The model produces a reasonably sized output text that preserves the conceptual integrity and factual information of the input article. For evaluation, we primarily employed "BANSData", a widely adopted, publicly available Bengali dataset. Additionally, we prepared a large-scale dataset called "BANS-133", which consists of 133k Bangla news articles paired with human-generated summaries. With the proposed model, we achieved ROUGE-1 and ROUGE-2 scores of 0.66 and 0.41 on the "BANSData" dataset and 0.67 and 0.42 on the "BANS-133" dataset, respectively. We demonstrated that the proposed system surpasses previous state-of-the-art Bengali abstractive summarization techniques and that it remains stable on a larger dataset. The "BANS-133" dataset and code base will be made publicly available for research.
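For readers unfamiliar with the underlying architecture, the equations below sketch the standard pointer-generator formulation with coverage (See et al., 2017), on which a hybrid pointer generator network of this kind typically builds; the abstract does not specify the exact hybrid parameterization, so this should be read as background rather than the authors' precise model. Here $h_t^*$ is the attention context vector, $s_t$ the decoder state, $x_t$ the decoder input, $a^t$ the attention distribution over source positions, and $w_{h^*}$, $w_s$, $w_x$, $b_{\text{ptr}}$ are learned parameters.

\[ p_{\text{gen}} = \sigma\!\left( w_{h^*}^{\top} h_t^* + w_s^{\top} s_t + w_x^{\top} x_t + b_{\text{ptr}} \right) \]
\[ P(w) = p_{\text{gen}}\, P_{\text{vocab}}(w) + (1 - p_{\text{gen}}) \sum_{i:\, w_i = w} a_i^t \]
\[ c^t = \sum_{t'=0}^{t-1} a^{t'}, \qquad \text{covloss}_t = \sum_i \min\!\left( a_i^t, c_i^t \right) \]

Because the copy term redistributes attention mass over source tokens, the decoder can emit out-of-vocabulary words directly from the input article, which is how factual details such as names and numbers are reproduced; the coverage vector $c^t$ and its associated loss penalize attending repeatedly to the same source positions, which discourages phrase repetition.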