{"title":"基于变压器的新闻摘要比较研究","authors":"Ambrish Choudhary, Mamatha Alugubelly, Rupal Bhargava","doi":"10.1109/DeSE58274.2023.10099798","DOIUrl":null,"url":null,"abstract":"News articles play a crucial role in helping humans know about many important events, developments, and inventions worldwide. The busy chores of our day-to-day life have made it quite challenging to consume important information from lengthy news articles. Therefore, short summaries of news articles are not only crucial but essential as well. Deep learning has revolutionized the field of natural language processing research. A lot of research has been done using pre-trained transformer-based models, and it has significantly improved the text sum-marization performance. In this paper, efforts were made to analyze transformer-based models such as BERT, GPT-2, XL Net, BART, and T5 for extractive and abstractive summarizations. This research investigates various methods through observation and experimentation. It also proposes methods that produce better summaries than comparable methods.","PeriodicalId":346847,"journal":{"name":"2023 15th International Conference on Developments in eSystems Engineering (DeSE)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-01-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Comparative Study on Transformer-based News Summarization\",\"authors\":\"Ambrish Choudhary, Mamatha Alugubelly, Rupal Bhargava\",\"doi\":\"10.1109/DeSE58274.2023.10099798\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"News articles play a crucial role in helping humans know about many important events, developments, and inventions worldwide. The busy chores of our day-to-day life have made it quite challenging to consume important information from lengthy news articles. Therefore, short summaries of news articles are not only crucial but essential as well. 
Deep learning has revolutionized the field of natural language processing research. A lot of research has been done using pre-trained transformer-based models, and it has significantly improved the text sum-marization performance. In this paper, efforts were made to analyze transformer-based models such as BERT, GPT-2, XL Net, BART, and T5 for extractive and abstractive summarizations. This research investigates various methods through observation and experimentation. It also proposes methods that produce better summaries than comparable methods.\",\"PeriodicalId\":346847,\"journal\":{\"name\":\"2023 15th International Conference on Developments in eSystems Engineering (DeSE)\",\"volume\":\"12 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-01-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2023 15th International Conference on Developments in eSystems Engineering (DeSE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/DeSE58274.2023.10099798\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 15th International Conference on Developments in eSystems Engineering (DeSE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/DeSE58274.2023.10099798","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A Comparative Study on Transformer-based News Summarization
News articles play a crucial role in keeping people informed about important events, developments, and inventions worldwide. The pace of day-to-day life makes it challenging to extract key information from lengthy articles, so short summaries of news articles are essential. Deep learning has revolutionized natural language processing research, and extensive work with pre-trained transformer-based models has significantly improved text summarization performance. This paper analyzes transformer-based models such as BERT, GPT-2, XLNet, BART, and T5 for extractive and abstractive summarization. It investigates various methods through observation and experimentation, and proposes methods that produce better summaries than comparable approaches.
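The abstract distinguishes extractive summarization (selecting sentences verbatim from the source) from abstractive summarization (generating new sentences). As a minimal illustration of the extractive idea only, the toy sketch below scores sentences by average word frequency and keeps the top-ranked ones in document order; it is a hypothetical frequency-based baseline, not one of the transformer models (BERT, GPT-2, XLNet, BART, T5) the paper actually evaluates.

```python
import re
from collections import Counter

def extractive_summary(text, n_sentences=2):
    """Toy extractive summarizer: score each sentence by the average
    document-wide frequency of its words, then return the top-scoring
    sentences in their original order."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    freq = Counter(re.findall(r'[a-z]+', text.lower()))

    def score(sentence):
        tokens = re.findall(r'[a-z]+', sentence.lower())
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    ranked = sorted(range(len(sentences)),
                    key=lambda i: score(sentences[i]), reverse=True)
    keep = sorted(ranked[:n_sentences])  # restore document order
    return ' '.join(sentences[i] for i in keep)

doc = ("News summarization condenses an article. "
       "Extractive methods copy sentences from the article. "
       "Abstractive methods generate new sentences.")
print(extractive_summary(doc, n_sentences=2))
```

Transformer-based extractive models such as BERT replace the frequency score with learned sentence representations, while abstractive models such as BART and T5 generate the summary token by token instead of selecting sentences.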