{"title":"基于依赖解析和注意机制的短文本问题分类","authors":"An Fang","doi":"10.1109/ICMLC48188.2019.8949314","DOIUrl":null,"url":null,"abstract":"Question texts analysis is a challenging task of the fine-grained classification due to the few annotation data and unbalanced categories. The existing approaches normally assume that each word contributes the same semantic to the question text, but ignore the different meanings of the words and the dependency relations within the text. In this paper, we propose a deep neural network with multi-layer attention mechanism to capture the extended semantic features by using a dependency parsing tree, which has the capacity to spot the central components of the question. The experimental results demonstrate that our proposed model obtains substantially improvement, comparing with several competitive baselines.","PeriodicalId":221349,"journal":{"name":"2019 International Conference on Machine Learning and Cybernetics (ICMLC)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2019-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"Short-Text Question Classification Based on Dependency Parsing and Attention Mechanism\",\"authors\":\"An Fang\",\"doi\":\"10.1109/ICMLC48188.2019.8949314\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Question texts analysis is a challenging task of the fine-grained classification due to the few annotation data and unbalanced categories. The existing approaches normally assume that each word contributes the same semantic to the question text, but ignore the different meanings of the words and the dependency relations within the text. In this paper, we propose a deep neural network with multi-layer attention mechanism to capture the extended semantic features by using a dependency parsing tree, which has the capacity to spot the central components of the question. The experimental results demonstrate that our proposed model obtains substantially improvement, comparing with several competitive baselines.\",\"PeriodicalId\":221349,\"journal\":{\"name\":\"2019 International Conference on Machine Learning and Cybernetics (ICMLC)\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-07-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 International Conference on Machine Learning and Cybernetics (ICMLC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICMLC48188.2019.8949314\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 International Conference on Machine Learning and Cybernetics (ICMLC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICMLC48188.2019.8949314","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Short-Text Question Classification Based on Dependency Parsing and Attention Mechanism
Question text analysis is a challenging fine-grained classification task because annotated data are scarce and the categories are imbalanced. Existing approaches normally assume that every word contributes equally to the semantics of the question text, ignoring both the differing meanings of individual words and the dependency relations within the text. In this paper, we propose a deep neural network with a multi-layer attention mechanism that captures extended semantic features from a dependency parsing tree, enabling the model to identify the central components of a question. Experimental results demonstrate that the proposed model achieves substantial improvements over several competitive baselines.
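The abstract does not specify the network architecture in detail, so the following is only a minimal sketch of the general idea: combine word embeddings with dependency-parse information and let an attention layer weight the tokens before classification. The class name, dimensions, and the choice to encode dependency information as relation-label embeddings (rather than the paper's multi-layer attention over the parse tree itself) are assumptions made for illustration, not the authors' implementation.

```python
# Illustrative sketch only: a question classifier that concatenates word and
# dependency-relation embeddings and applies a single attention layer over the
# tokens. All names and hyperparameters here are assumptions, not the paper's.
import torch
import torch.nn as nn


class DependencyAttentionClassifier(nn.Module):
    def __init__(self, vocab_size, num_dep_relations, num_classes,
                 emb_dim=100, dep_dim=25, hidden_dim=128):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # One embedding per dependency relation label (e.g. nsubj, dobj, root).
        self.dep_emb = nn.Embedding(num_dep_relations, dep_dim, padding_idx=0)
        self.encoder = nn.LSTM(emb_dim + dep_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        # Additive attention that scores each token; tokens carrying the
        # central components of the question can receive higher weights.
        self.attn_score = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, word_ids, dep_ids, mask):
        # word_ids, dep_ids, mask: (batch, seq_len)
        x = torch.cat([self.word_emb(word_ids), self.dep_emb(dep_ids)], dim=-1)
        h, _ = self.encoder(x)                      # (batch, seq_len, 2*hidden)
        scores = self.attn_score(h).squeeze(-1)     # (batch, seq_len)
        scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1)     # attention over tokens
        pooled = torch.bmm(weights.unsqueeze(1), h).squeeze(1)
        return self.classifier(pooled)              # (batch, num_classes)


if __name__ == "__main__":
    model = DependencyAttentionClassifier(vocab_size=5000,
                                          num_dep_relations=40,
                                          num_classes=6)
    words = torch.randint(1, 5000, (2, 12))
    deps = torch.randint(1, 40, (2, 12))
    mask = torch.ones(2, 12, dtype=torch.long)
    logits = model(words, deps, mask)
    print(logits.shape)  # torch.Size([2, 6])
```

In this sketch the dependency parse only contributes relation-label features; the paper's multi-layer attention over the parse structure would stack further attention layers and propagate information along head-dependent edges, which is omitted here for brevity.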