{"title":"基于双层注意机制的语义匹配模型","authors":"Zhenzhen Hou, Xiaodong Cai, Si Chen, Bo Li","doi":"10.1109/ICIASE45644.2019.9074041","DOIUrl":null,"url":null,"abstract":"In community question and answering (Q&A) systems, due to the diversity of words and syntactic structure, matching question pairs representing similar meaning is a challenging task. A novel model based on dual-layer attention mechanism for semantic matching is proposed in this work. Firstly, an attention-based preprocessing method is used on the word representation layer to reduce redundant information. Secondly, a bilateral multiple perspectives attention mechanism is utilized on the context representation layer to obtain more interactive information. Finally, the obtained information is passed through a Bi-directional Long Short Term Memory Network (BiLSTM). Then the obtained final time steps of the two sequences are combined for prediction. The experimental results show that the accuracy of the proposed model in our self-defined Chinese dataset is up to 95.54% and also 88.91% with Quora dataset. It outperforms the existing advanced benchmark models. 
The model also provides stability and scalability to natural language inference tasks with the accuracy of 87.4% in the Stanford Natural Language Inference (SNLI) dataset.","PeriodicalId":206741,"journal":{"name":"2019 IEEE International Conference of Intelligent Applied Systems on Engineering (ICIASE)","volume":"45 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"A model based on dual-layer attention mechanism for semantic matching\",\"authors\":\"Zhenzhen Hou, Xiaodong Cai, Si Chen, Bo Li\",\"doi\":\"10.1109/ICIASE45644.2019.9074041\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In community question and answering (Q&A) systems, due to the diversity of words and syntactic structure, matching question pairs representing similar meaning is a challenging task. A novel model based on dual-layer attention mechanism for semantic matching is proposed in this work. Firstly, an attention-based preprocessing method is used on the word representation layer to reduce redundant information. Secondly, a bilateral multiple perspectives attention mechanism is utilized on the context representation layer to obtain more interactive information. Finally, the obtained information is passed through a Bi-directional Long Short Term Memory Network (BiLSTM). Then the obtained final time steps of the two sequences are combined for prediction. The experimental results show that the accuracy of the proposed model in our self-defined Chinese dataset is up to 95.54% and also 88.91% with Quora dataset. It outperforms the existing advanced benchmark models. 
The model also provides stability and scalability to natural language inference tasks with the accuracy of 87.4% in the Stanford Natural Language Inference (SNLI) dataset.\",\"PeriodicalId\":206741,\"journal\":{\"name\":\"2019 IEEE International Conference of Intelligent Applied Systems on Engineering (ICIASE)\",\"volume\":\"45 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-04-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 IEEE International Conference of Intelligent Applied Systems on Engineering (ICIASE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICIASE45644.2019.9074041\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE International Conference of Intelligent Applied Systems on Engineering (ICIASE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICIASE45644.2019.9074041","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A model based on dual-layer attention mechanism for semantic matching
In community question answering (Q&A) systems, the diversity of words and syntactic structures makes matching question pairs with similar meanings a challenging task. This work proposes a novel semantic matching model based on a dual-layer attention mechanism. First, an attention-based preprocessing method is applied at the word representation layer to reduce redundant information. Second, a bilateral multi-perspective attention mechanism is used at the context representation layer to capture more interactive information. Finally, the resulting representations are passed through a Bi-directional Long Short-Term Memory (BiLSTM) network, and the final time steps of the two sequences are combined for prediction. Experimental results show that the proposed model achieves an accuracy of 95.54% on our self-built Chinese dataset and 88.91% on the Quora dataset, outperforming existing advanced benchmark models. The model also transfers stably to natural language inference, reaching 87.4% accuracy on the Stanford Natural Language Inference (SNLI) dataset.
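The abstract says only that the final BiLSTM time steps of the two sequences are "combined for prediction" without specifying how. A minimal sketch of one common combination scheme (concatenating the two states with their element-wise difference and product, then a linear classifier) is shown below; the feature layout, hidden size, and classifier weights here are illustrative assumptions, not the authors' exact method.

```python
import numpy as np

def match_head(h1, h2, W, b):
    """Predict match probabilities from two final BiLSTM states.

    Assumed combination (not specified in the paper):
    [h1; h2; |h1 - h2|; h1 * h2] -> linear layer -> softmax.
    """
    feats = np.concatenate([h1, h2, np.abs(h1 - h2), h1 * h2])
    logits = W @ feats + b
    exp = np.exp(logits - logits.max())        # numerically stable softmax
    return exp / exp.sum()                      # probs over {mismatch, match}

rng = np.random.default_rng(0)
d = 8                                           # hypothetical hidden size
h1, h2 = rng.standard_normal(d), rng.standard_normal(d)
W, b = rng.standard_normal((2, 4 * d)), np.zeros(2)
probs = match_head(h1, h2, W, b)
```

In practice `W` and `b` would be learned jointly with the attention layers and the BiLSTM; the random values here only demonstrate the shapes involved.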