An Adaptive Sentence Representation Learning Model Based on Multi-gram CNN

Chunyun Zhang, Sheng Gao, Baolin Zhao, Lu Yang, Xiaoming Xi, C. Cui, Yilong Yin

2017 International Conference on Intelligent Environments (IE), August 2017. DOI: 10.1109/IE.2017.18
Natural language processing has attracted increasing attention recently. Traditional approaches to language modeling rely primarily on elaborately designed features and complicated natural language processing tools, which require substantial human effort and are prone to error propagation and data sparsity. Deep neural network methods have been shown to learn the implicit semantics of text without extra knowledge. To better capture the deep underlying semantics of sentences, most deep neural network language models adopt a multi-gram strategy. However, current multi-gram strategies in CNN frameworks are mostly realized by concatenating the trained multi-gram vectors to form the sentence vector, which increases the number of parameters to be learned and is prone to overfitting. To alleviate this problem, we propose a novel adaptive sentence representation learning model based on a multi-gram CNN framework. It learns adaptive importance weights for the different n-gram features and forms the sentence representation as a weighted sum of the extracted n-gram features, which greatly reduces the number of parameters to be learned and mitigates the risk of overfitting. Experimental results show that the proposed method improves performance on sentiment and relation classification tasks.
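The core idea in the abstract — combining per-n-gram feature vectors with learned importance weights instead of concatenating them — can be sketched in pure Python. The softmax parametrization of the weights, the branch count, and the feature dimensions below are illustrative assumptions, not the paper's exact formulation:

```python
import math

def softmax(xs):
    """Normalize raw scores into positive weights that sum to 1."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def weighted_sentence_vector(ngram_features, raw_weights):
    """Combine one pooled feature vector per n-gram branch into a single
    sentence vector via a softmax-weighted sum (hypothetical sketch).

    ngram_features: list of equal-length vectors, one per branch
                    (e.g. unigram, bigram, trigram CNN outputs)
    raw_weights:    one learnable scalar per branch
    """
    w = softmax(raw_weights)
    dim = len(ngram_features[0])
    return [sum(w[k] * ngram_features[k][i] for k in range(len(w)))
            for i in range(dim)]

# Toy example: 3 branches (uni/bi/tri-gram pooled features), dimension 4.
feats = [[1.0, 0.0, 0.0, 0.0],
         [0.0, 1.0, 0.0, 0.0],
         [0.0, 0.0, 1.0, 0.0]]
sent = weighted_sentence_vector(feats, [0.0, 0.0, 0.0])
# Equal raw weights -> each branch contributes 1/3, so
# sent == [1/3, 1/3, 1/3, 0.0].
```

Note the parameter saving the abstract describes: the weighted sum keeps the sentence vector at the branch dimension (4 here, plus one scalar weight per branch), whereas concatenating the three branches would triple the dimension seen by the downstream classifier.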