{"title":"探索 LSTM 参数对自动文本摘要网络性能的影响","authors":"R. Naaz, K. R, Surendra Yadav","doi":"10.1109/ICOCWC60930.2024.10470615","DOIUrl":null,"url":null,"abstract":"This technical abstract explores the impact of extended quick-term reminiscence (LSTM) parameters on net overall performance for automated textual content summarization within the Korean language. It observes and considers the parameters of phrase embedding size, sentence length, and encoding intensity. Embedding length substantially affects the community's overall performance, and a pair of dimensional representations of word embedding can improve summarization accuracy. Increasing sentence duration additionally showed enhancements, with the very best accuracy executed at triple sentence embedding lengths. Eventually, encoding intensity had a low effect on network performance, with only barely better results visible with double and triple encodings. Overall, this observation concluded that gold standard network performance for textual content summarization in the Korean language is pleasant and finished via a mixture of two-dimensional embedding with an elevated sentence length and unmarried encoding intensity.","PeriodicalId":518901,"journal":{"name":"2024 International Conference on Optimization Computing and Wireless Communication (ICOCWC)","volume":"113 ","pages":"1-5"},"PeriodicalIF":0.0000,"publicationDate":"2024-01-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Exploring the Impact of LSTM Parameters on Network Performance for Automatic Text Summarization\",\"authors\":\"R. Naaz, K. R, Surendra Yadav\",\"doi\":\"10.1109/ICOCWC60930.2024.10470615\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This technical abstract explores the impact of extended quick-term reminiscence (LSTM) parameters on net overall performance for automated textual content summarization within the Korean language. It observes and considers the parameters of phrase embedding size, sentence length, and encoding intensity. Embedding length substantially affects the community's overall performance, and a pair of dimensional representations of word embedding can improve summarization accuracy. Increasing sentence duration additionally showed enhancements, with the very best accuracy executed at triple sentence embedding lengths. Eventually, encoding intensity had a low effect on network performance, with only barely better results visible with double and triple encodings. 
Overall, this observation concluded that gold standard network performance for textual content summarization in the Korean language is pleasant and finished via a mixture of two-dimensional embedding with an elevated sentence length and unmarried encoding intensity.\",\"PeriodicalId\":518901,\"journal\":{\"name\":\"2024 International Conference on Optimization Computing and Wireless Communication (ICOCWC)\",\"volume\":\"113 \",\"pages\":\"1-5\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-01-29\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2024 International Conference on Optimization Computing and Wireless Communication (ICOCWC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICOCWC60930.2024.10470615\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2024 International Conference on Optimization Computing and Wireless Communication (ICOCWC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICOCWC60930.2024.10470615","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
This paper explores the impact of long short-term memory (LSTM) parameters on network performance for automatic text summarization in the Korean language. Three parameters are examined: word embedding size, sentence length, and encoding depth. Embedding size substantially affects network performance, and two-dimensional word embedding representations can improve summarization accuracy. Increasing the sentence length also yielded improvements, with the highest accuracy achieved at triple the sentence embedding length. Finally, encoding depth had little effect on network performance, with only marginally better results observed for double and triple encoding depths. Overall, the study concludes that optimal network performance for Korean text summarization is achieved by combining two-dimensional embeddings with an increased sentence length and a single encoding depth.
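
To make the three hyperparameters concrete, the sketch below shows one way they might appear in an LSTM encoder-decoder summarizer, written in PyTorch. This is not the authors' implementation: the class name, the layer sizes, and the choice to model "encoding depth" as the number of stacked encoder LSTM layers are all illustrative assumptions.

```python
import torch
import torch.nn as nn


class Seq2SeqSummarizer(nn.Module):
    """Minimal LSTM encoder-decoder summarizer exposing the three
    hyperparameters discussed in the abstract: word embedding size,
    maximum sentence length, and encoder (encoding) depth."""

    def __init__(self, vocab_size, embedding_dim, hidden_dim,
                 num_encoder_layers, max_sentence_length):
        super().__init__()
        self.max_sentence_length = max_sentence_length
        self.embedding = nn.Embedding(vocab_size, embedding_dim)
        # "Encoding depth" is modelled here as stacked LSTM layers.
        self.encoder = nn.LSTM(embedding_dim, hidden_dim,
                               num_layers=num_encoder_layers,
                               batch_first=True)
        self.decoder = nn.LSTM(embedding_dim, hidden_dim,
                               num_layers=1, batch_first=True)
        self.output_proj = nn.Linear(hidden_dim, vocab_size)

    def forward(self, src_ids, tgt_ids):
        # Truncate the source to the configured sentence length.
        src_ids = src_ids[:, :self.max_sentence_length]
        _, (h, c) = self.encoder(self.embedding(src_ids))
        # Initialise the single-layer decoder from the encoder's top layer.
        dec_out, _ = self.decoder(self.embedding(tgt_ids), (h[-1:], c[-1:]))
        return self.output_proj(dec_out)  # (batch, tgt_len, vocab_size)


# Illustrative instantiation; these values are assumptions, not the paper's.
model = Seq2SeqSummarizer(vocab_size=32000, embedding_dim=128,
                          hidden_dim=256, num_encoder_layers=2,
                          max_sentence_length=100)
```

In a study like the one described, this model would be retrained once per combination of embedding_dim, max_sentence_length, and num_encoder_layers, and summarization accuracy (e.g., ROUGE scores) compared across the runs.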