J. Niu, Huanpei Chen, Qingjuan Zhao, Limin Su, Mohammed Atiquzzaman
{"title":"基于块图和递归神经网络的多文档抽象摘要","authors":"J. Niu, Huanpei Chen, Qingjuan Zhao, Limin Su, Mohammed Atiquzzaman","doi":"10.1109/ICC.2017.7996331","DOIUrl":null,"url":null,"abstract":"Automatic multi-document abstractive summarization system is used to summarize several documents into a short one with generated new sentences. Many of them are based on word-graph and ILP method, and lots of sentences are ignored because of the heavy computation load. To reduce computation and generate readable and informative summaries, we propose a novel abstractive multi-document summarization system based on chunk-graph (CG) and recurrent neural network language model (RNNLM). In our approach, A CG which is based on word-graph is constructed to organize all information in a sentence cluster, CG can reduce the size of graph and keep more semantic information than word-graph. We use beam search and character-level RNNLM to generate readable and informative summaries from the CG for each sentence cluster, RNNLM is a better model to evaluate sentence linguistic quality than n-gram language model. Experimental results show that our proposed system outperforms all baseline systems and reach the state-of-art systems, and the system with CG can generate better summaries than that with ordinary word-graph.","PeriodicalId":6517,"journal":{"name":"2017 IEEE International Conference on Communications (ICC)","volume":"36 1","pages":"1-6"},"PeriodicalIF":0.0000,"publicationDate":"2017-05-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"15","resultStr":"{\"title\":\"Multi-document abstractive summarization using chunk-graph and recurrent neural network\",\"authors\":\"J. Niu, Huanpei Chen, Qingjuan Zhao, Limin Su, Mohammed Atiquzzaman\",\"doi\":\"10.1109/ICC.2017.7996331\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Automatic multi-document abstractive summarization system is used to summarize several documents into a short one with generated new sentences. Many of them are based on word-graph and ILP method, and lots of sentences are ignored because of the heavy computation load. To reduce computation and generate readable and informative summaries, we propose a novel abstractive multi-document summarization system based on chunk-graph (CG) and recurrent neural network language model (RNNLM). In our approach, A CG which is based on word-graph is constructed to organize all information in a sentence cluster, CG can reduce the size of graph and keep more semantic information than word-graph. We use beam search and character-level RNNLM to generate readable and informative summaries from the CG for each sentence cluster, RNNLM is a better model to evaluate sentence linguistic quality than n-gram language model. 
Experimental results show that our proposed system outperforms all baseline systems and reach the state-of-art systems, and the system with CG can generate better summaries than that with ordinary word-graph.\",\"PeriodicalId\":6517,\"journal\":{\"name\":\"2017 IEEE International Conference on Communications (ICC)\",\"volume\":\"36 1\",\"pages\":\"1-6\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-05-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"15\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 IEEE International Conference on Communications (ICC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICC.2017.7996331\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 IEEE International Conference on Communications (ICC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICC.2017.7996331","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Multi-document abstractive summarization using chunk-graph and recurrent neural network
Automatic multi-document abstractive summarization systems condense several documents into a short summary composed of newly generated sentences. Many existing systems are based on word graphs and integer linear programming (ILP), and they discard many sentences to keep the computational load manageable. To reduce computation and generate readable, informative summaries, we propose a novel abstractive multi-document summarization system based on a chunk-graph (CG) and a recurrent neural network language model (RNNLM). In our approach, a CG, which builds on the word graph, is constructed to organize all the information in a sentence cluster; the CG reduces the size of the graph while preserving more semantic information than a word graph. We use beam search and a character-level RNNLM to generate readable and informative summaries from the CG for each sentence cluster; the RNNLM evaluates sentence linguistic quality better than an n-gram language model. Experimental results show that our proposed system outperforms all baseline systems and matches state-of-the-art systems, and that the system using the CG generates better summaries than the same system using an ordinary word graph.
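The abstract only outlines the pipeline, so the following Python sketch is an illustration of the general idea rather than the authors' implementation: it merges chunks from a sentence cluster into a small graph and runs beam search over it, scoring candidate paths with a stand-in fluency function. The chunker (simple_chunks), the scoring function (lm_score, which crudely prefers shorter output in place of the paper's character-level RNNLM), and all other names are assumptions introduced for this example.

    # Toy chunk-graph construction plus beam search with a placeholder scorer.
    # The paper's chunking, graph merging, and character-level RNNLM are all
    # replaced here by simple stand-ins (see lead-in note above).
    from collections import defaultdict

    START, END = "<s>", "</s>"

    def simple_chunks(sentence):
        # Placeholder chunker: group every two consecutive words into a chunk.
        words = sentence.split()
        return [" ".join(words[i:i + 2]) for i in range(0, len(words), 2)]

    def build_chunk_graph(sentences):
        # Identical chunks across sentences share a node; edges record
        # which chunk may follow which in some input sentence.
        edges = defaultdict(set)
        for sent in sentences:
            path = [START] + simple_chunks(sent) + [END]
            for a, b in zip(path, path[1:]):
                edges[a].add(b)
        return edges

    def lm_score(text):
        # Stand-in for the character-level RNNLM (higher = more fluent).
        # Crude proxy: shorter candidates score higher.
        return -len(text)

    def beam_search(edges, beam_width=4, max_len=12):
        # Expand partial paths through the chunk-graph, keeping only the
        # beam_width best-scoring partial candidates at each step.
        beam = [([START], 0.0)]
        finished = []
        for _ in range(max_len):
            candidates = []
            for path, _ in beam:
                for nxt in edges.get(path[-1], ()):
                    new_path = path + [nxt]
                    chunks = new_path[1:-1] if nxt == END else new_path[1:]
                    score = lm_score(" ".join(chunks))
                    if nxt == END:
                        finished.append((new_path, score))
                    else:
                        candidates.append((new_path, score))
            beam = sorted(candidates, key=lambda x: x[1], reverse=True)[:beam_width]
            if not beam:
                break
        if not finished:
            return ""
        best_path, _ = max(finished, key=lambda x: x[1])
        return " ".join(best_path[1:-1])

    if __name__ == "__main__":
        cluster = [
            "the system generates short summaries from several documents",
            "the proposed system generates readable summaries",
        ]
        graph = build_chunk_graph(cluster)
        print(beam_search(graph))

In the paper's setting, each candidate path would instead be scored by the character-level RNNLM, which is what lets the system prefer linguistically well-formed sentences over merely short ones.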