{"title":"A novel method of text representation on hybrid neural networks","authors":"Yanbu Guo, Chen Jin, Weihua Li, Chen Ji, Yuanye Fang, Yunhao Duan","doi":"10.1109/CISP-BMEI.2017.8302099","DOIUrl":null,"url":null,"abstract":"Text representation is one of the fundamental problems in text analysis tasks. The key of text representation is to extract and express the semantic and syntax feature of texts. The order-sensitive sequence models based on neural networks have achieved great progress in text representation. Bidirectional Long Short-Term Memory (BiLSTM) Neural Networks, as an extension of Recurrent Neural Networks (RNN), not only can deal with variable-length texts, capture the long-term dependencies in texts, but also model the forward and backward sequence contexts. Moreover, typical neural networks, Convolutional Neural Networks (CNN), can extract more semantic and structural information from texts, because of their convolution and pooling operations. The paper proposes a hybrid model, which combines the BiLSTM with 2-dimensial convolution and 1-dimensial pooling operations. In other words, the model firstly captures the abstract representation vector of texts by the BiLSTM, and then extracts text semantic features by 2-dimensial convolutional and 1-dimensial pooling operations. Experiments on text classification tasks show that our method obtains preferable performances compared with the state-of-the-art models when applied on the MR1 sentence polarity dataset.","PeriodicalId":6474,"journal":{"name":"2017 10th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI)","volume":"43 1","pages":"1-6"},"PeriodicalIF":0.0000,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 10th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CISP-BMEI.2017.8302099","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Text representation is one of the fundamental problems in text analysis tasks. The key to text representation is to extract and express the semantic and syntactic features of texts. Order-sensitive sequence models based on neural networks have made great progress in text representation. Bidirectional Long Short-Term Memory (BiLSTM) networks, an extension of Recurrent Neural Networks (RNN), can not only handle variable-length texts and capture their long-term dependencies, but also model both forward and backward sequence contexts. Moreover, Convolutional Neural Networks (CNN) can extract additional semantic and structural information from texts through their convolution and pooling operations. This paper proposes a hybrid model that combines a BiLSTM with 2-dimensional convolution and 1-dimensional pooling operations. In other words, the model first captures an abstract representation vector of the text with the BiLSTM, and then extracts semantic features with 2-dimensional convolution and 1-dimensional pooling operations. Experiments on text classification tasks show that our method achieves favorable performance compared with state-of-the-art models on the MR sentence polarity dataset.
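The pipeline described in the abstract can be summarized as BiLSTM encoder → 2-dimensional convolution → 1-dimensional pooling → classifier. Below is a minimal PyTorch sketch of that pipeline. The class name BiLSTM2DConvNet and all layer sizes (vocabulary size, embedding width, hidden units, channels, kernel size) are illustrative assumptions rather than the authors' configuration, and the 1-dimensional pooling is interpreted here as a max over the time axis of the convolution output.

```python
# Hypothetical sketch of the hybrid model: BiLSTM -> 2-D convolution ->
# 1-D (time-axis) max pooling -> linear classifier. Hyperparameters are
# illustrative assumptions, not the paper's settings.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiLSTM2DConvNet(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=128, hidden_dim=100,
                 channels=16, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # BiLSTM yields a 2*hidden_dim feature vector per token.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # A 3x3 kernel slides over both the time axis and the feature axis,
        # treating the BiLSTM output matrix as a single-channel "image".
        self.conv = nn.Conv2d(1, channels, kernel_size=(3, 3))
        feat_width = 2 * hidden_dim - 3 + 1  # conv output width (no padding)
        self.fc = nn.Linear(channels * feat_width, num_classes)

    def forward(self, token_ids):            # (B, T) token indices
        x = self.embedding(token_ids)        # (B, T, E)
        x, _ = self.bilstm(x)                # (B, T, 2H)
        x = x.unsqueeze(1)                   # (B, 1, T, 2H)
        x = F.relu(self.conv(x))             # (B, C, T-2, 2H-2)
        # 1-D pooling: max over the time axis only, keeping feature columns.
        x = x.max(dim=2).values              # (B, C, 2H-2)
        return self.fc(x.flatten(1))         # (B, num_classes)

# Usage on a toy batch of 4 sentences, each 20 tokens long:
model = BiLSTM2DConvNet()
logits = model(torch.randint(0, 10000, (4, 20)))
print(logits.shape)  # torch.Size([4, 2])
```

Pooling only over the time axis keeps the output size independent of sentence length while preserving the feature columns produced by the 2-dimensional convolution, which is one plausible reading of combining 2-dimensional convolution with 1-dimensional pooling.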