Text Classification Based on Neural Network Fusion

TEHNICKI GLASNIK-TECHNICAL JOURNAL (IF 0.7, Q3, ENGINEERING, MULTIDISCIPLINARY) · Pub Date: 2023-07-19 · DOI: 10.31803/tg-20221228154330
Dea-Won Kim
Abstract

The goal of text classification is to identify the category to which a text belongs. Text categorization is widely used in email detection, sentiment analysis, topic labeling, and other fields. A good text representation is key to improving performance on NLP tasks. Traditional representations adopt the bag-of-words or vector space model, which discard the context of the text and suffer from high dimensionality and high sparsity. In recent years, with the growth of data and improvements in computing performance, the use of deep learning to represent and classify text has attracted great attention. Convolutional neural networks (CNNs), recurrent neural networks (RNNs), and RNNs with attention mechanisms have been used to represent text for classification and other NLP tasks, all outperforming traditional methods.

In this paper, we design two sentence-level models based on deep networks:

(1) BRCNN, a text representation and classification model based on a bidirectional RNN and a CNN. BRCNN's input is the word vector corresponding to each word in the sentence. An RNN first extracts word-order information from the sentence; a CNN then extracts higher-level sentence features. After convolution, max pooling is used to obtain the sentence vector, and a softmax classifier performs the final classification. The RNN captures word-order information, while the CNN extracts useful features. Experiments on eight text classification tasks show that BRCNN obtains better text feature representations, with classification accuracy equal to or higher than the state of the art.

(2) ACNN, an attention-and-CNN model, uses an RNN with an attention mechanism to obtain context vectors; a CNN then extracts higher-level feature information. Max pooling is used to obtain the sentence vector, and a softmax classifier labels the text. Experiments on eight text classification benchmark datasets show that ACNN improves the stability of model convergence and converges to an optimal or locally optimal solution better than BRCNN.
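The BRCNN pipeline above (word vectors → bidirectional RNN → convolution → max pooling → softmax) can be sketched as a plain NumPy forward pass. This is only an illustrative sketch: the paper does not give hyperparameters or cell types, so the dimensions are hypothetical and a vanilla RNN stands in for whatever recurrent unit the authors used. The ACNN variant would differ only in weighting the RNN states with attention before the convolution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; the abstract does not specify them).
vocab, embed, hidden, filters, k, classes, seq = 50, 8, 6, 5, 3, 2, 10

# Randomly initialized parameters.
E  = rng.normal(size=(vocab, embed))                  # embedding table
Wx = rng.normal(size=(embed, hidden)) * 0.1           # RNN input weights
Wh = rng.normal(size=(hidden, hidden)) * 0.1          # RNN recurrent weights
Wc = rng.normal(size=(filters, 2 * hidden, k)) * 0.1  # 1-D conv filters
Wo = rng.normal(size=(filters, classes)) * 0.1        # softmax layer weights

def rnn(X, reverse=False):
    """Vanilla RNN over the sequence; returns one hidden state per position."""
    steps = reversed(range(len(X))) if reverse else range(len(X))
    h, out = np.zeros(hidden), np.zeros((len(X), hidden))
    for t in steps:
        h = np.tanh(X[t] @ Wx + h @ Wh)
        out[t] = h
    return out

def brcnn_forward(token_ids):
    X = E[token_ids]                                    # (seq, embed)
    # Bidirectional RNN: concatenate forward and backward states.
    H = np.concatenate([rnn(X), rnn(X, True)], axis=1)  # (seq, 2*hidden)
    # 1-D convolution over positions, window size k.
    conv = np.array([[np.sum(H[t:t + k].T * Wc[f]) for t in range(seq - k + 1)]
                     for f in range(filters)])          # (filters, seq-k+1)
    s = np.maximum(conv, 0).max(axis=1)                 # max pooling -> sentence vector
    z = s @ Wo
    return np.exp(z) / np.exp(z).sum()                  # softmax class probabilities

probs = brcnn_forward(rng.integers(0, vocab, size=seq))
print(probs)  # one probability per class; the entries sum to 1
```

In a real implementation the recurrent and convolutional layers would be trained jointly by backpropagation on a cross-entropy loss; the sketch only shows how the shapes flow through the pipeline the abstract describes.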
Citations: 0

Source journal

TEHNICKI GLASNIK-TECHNICAL JOURNAL (ENGINEERING, MULTIDISCIPLINARY)
CiteScore: 1.50
Self-citation rate: 8.30%
Articles per year: 85
Review time: 15 weeks