{"title":"Intra-and-inter Sentence Attention Model for Enhanced Question Answering System","authors":"Yunyi Liu, Beixi Hao","doi":"10.1109/ICAA53760.2021.00029","DOIUrl":null,"url":null,"abstract":"Task-oriented question answering dialogue systems have been an important branch of conversational systems for oral language, where they first understand the query requested by users, and the models are demanded to seek for answers within the context considering the query information. Previous work models the semantic and syntactic information without taking the interaction into consideration. In this paper, we propose an intra-and-inter sentence attention framework, equipped with self-attention mechanism, which enables the model to focus more on a small part of the context and enhances the model capability of extracting the interactive information. Our experiments are based on the Stanford Question Answering Dataset (SQuAD) and the experimental result shows that our model achieves 68.5 EM and 77.7 F1 score, which verifies how the proposed model improves on the dataset.","PeriodicalId":121879,"journal":{"name":"2021 International Conference on Intelligent Computing, Automation and Applications (ICAA)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 International Conference on Intelligent Computing, Automation and Applications (ICAA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICAA53760.2021.00029","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Task-oriented question answering dialogue systems are an important branch of spoken-language conversational systems: they must first understand the user's query, and the model is then required to find the answer within the given context, taking the query information into account. Previous work models semantic and syntactic information without considering the interaction between the query and the context. In this paper, we propose an intra-and-inter sentence attention framework equipped with a self-attention mechanism, which enables the model to focus on a small, relevant part of the context and strengthens its ability to extract interactive information. Our experiments are conducted on the Stanford Question Answering Dataset (SQuAD), where our model achieves a 68.5 exact match (EM) score and a 77.7 F1 score, demonstrating the improvement the proposed model brings on this dataset.
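Since the abstract does not detail the architecture, the following is a minimal PyTorch sketch of how intra-sentence self-attention over the context might be composed with inter-sentence (query-to-context) attention. All names, dimensions, and the choice of nn.MultiheadAttention are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class IntraInterSentenceAttention(nn.Module):
    """Hypothetical sketch: intra-sentence self-attention over the context,
    followed by inter-sentence attention from the context to the query."""

    def __init__(self, hidden_dim: int, num_heads: int = 4):
        super().__init__()
        # Intra-sentence: the context sequence attends to itself.
        self.intra = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        # Inter-sentence: context tokens attend to query tokens.
        self.inter = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)

    def forward(self, context: torch.Tensor, query: torch.Tensor) -> torch.Tensor:
        # context: (batch, ctx_len, hidden_dim); query: (batch, q_len, hidden_dim)
        ctx_self, _ = self.intra(context, context, context)  # intra-sentence attention
        fused, _ = self.inter(ctx_self, query, query)        # inter-sentence attention
        return fused  # query-aware context representation for answer span extraction


# Toy usage with random embeddings standing in for encoded context and query.
if __name__ == "__main__":
    model = IntraInterSentenceAttention(hidden_dim=128)
    ctx = torch.randn(2, 50, 128)  # batch of 2 contexts, 50 tokens each
    qry = torch.randn(2, 10, 128)  # batch of 2 queries, 10 tokens each
    out = model(ctx, qry)
    print(out.shape)  # torch.Size([2, 50, 128])
```

In a SQuAD-style extractive setup, the fused representation would typically feed two linear heads that predict the answer span's start and end positions; that detail is likewise an assumption based on standard practice rather than on this paper.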