{"title":"Distant-Supervised Relation Extraction with Hierarchical Attention Based on Knowledge Graph","authors":"Hong Yao, Lijun Dong, Shiqi Zhen, Xiaojun Kang, Xinchuan Li, Qingzhong Liang","doi":"10.1109/ICTAI.2019.00040","DOIUrl":null,"url":null,"abstract":"Relation Extraction is concentrated on finding the unknown relational facts automatically from the unstructured texts. Most current methods, especially the distant supervision relation extraction (DSRE), have been successfully applied to achieve this goal. DSRE combines knowledge graph and text corpus to corporately generate plenty of labeled data without human efforts. However, the existing methods of DSRE ignore the noisy words within sentences and suffer from the noisy labelling problem; the additional knowledge is represented in a common semantic space and ignores the semantic-space difference between relations and entities. To address these problems, this study proposes a novel hierarchical attention model, named the Bi-GRU-based Knowledge Graph Attention Model (BG2KGA) for DSRE using the Bidirectional Gated Recurrent Unit (Bi-GRU) network. BG2KGA contains the word-level and sentence-level attentions with the guidance of additional knowledge graph, to highlight the key words and sentences respectively which can contribute more to the final relation representations. Further-more, the additional knowledge graph are embedded in the multi-semantic vector space to capture the relations in 1-N, N-1 and N-N entity pairs. Experiments are conducted on a widely used dataset for distant supervision. 
The experimental results have shown that the proposed model outperforms the current methods and can improve the Precision/Recall (PR) curve area by 8% to 16% compared to the state-of-the-art models; the AUC of BG2KGA can reach 0.468 in the best case.","PeriodicalId":346657,"journal":{"name":"2019 IEEE 31st International Conference on Tools with Artificial Intelligence (ICTAI)","volume":"11 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE 31st International Conference on Tools with Artificial Intelligence (ICTAI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICTAI.2019.00040","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Relation extraction aims to automatically discover unknown relational facts in unstructured text. Among current approaches, distant supervision relation extraction (DSRE) has been applied to this task with particular success: it aligns a knowledge graph with a text corpus to generate large amounts of labeled data without human annotation. However, existing DSRE methods ignore noisy words within sentences and suffer from the noisy-labeling problem; moreover, the additional knowledge is represented in a single common semantic space, which ignores the semantic-space difference between relations and entities. To address these problems, this study proposes a novel hierarchical attention model for DSRE, the Bi-GRU-based Knowledge Graph Attention Model (BG2KGA), built on the Bidirectional Gated Recurrent Unit (Bi-GRU) network. BG2KGA applies word-level and sentence-level attention under the guidance of an external knowledge graph, highlighting the key words and sentences, respectively, that contribute most to the final relation representation. Furthermore, the knowledge graph is embedded in a multi-semantic vector space to capture 1-N, N-1, and N-N relations between entity pairs. Experiments are conducted on a widely used distant-supervision dataset. The results show that the proposed model outperforms current methods, improving the area under the Precision/Recall (PR) curve by 8% to 16% over state-of-the-art models; the AUC of BG2KGA reaches 0.468 in the best case.
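The hierarchical attention idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes the word hidden states come from some encoder (e.g. a Bi-GRU) and that a relation query vector `q` is available from the knowledge-graph embedding; all names and dimensions here are hypothetical. Word-level attention compresses each sentence into a vector weighted toward key words, and sentence-level attention then weights the sentences in a bag to form the final relation representation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def word_attention(H, q):
    """Word-level attention.
    H: (T, d) word hidden states (assumed to come from a Bi-GRU encoder).
    q: (d,) relation query vector (assumed to come from the KG embedding).
    Returns a (d,) sentence vector weighted toward the key words."""
    alpha = softmax(H @ q)        # (T,) attention weights over words
    return alpha @ H              # (d,) weighted sentence vector

def sentence_attention(S, q):
    """Sentence-level attention over a bag of sentence vectors S: (n, d)."""
    beta = softmax(S @ q)         # (n,) attention weights over sentences
    return beta @ S               # (d,) final bag/relation representation

rng = np.random.default_rng(0)
d = 8
# A bag of two sentences (5 and 7 words) with random hidden states,
# standing in for real Bi-GRU outputs.
bag = [rng.normal(size=(5, d)), rng.normal(size=(7, d))]
q = rng.normal(size=d)            # hypothetical relation embedding
S = np.stack([word_attention(H, q) for H in bag])
r = sentence_attention(S, q)      # relation representation for the bag
print(r.shape)  # (8,)
```

In the paper's model the two attention levels are trained jointly with the encoder; this sketch only shows the dot-product scoring and weighted pooling that make noisy words and noisy sentences contribute less to the final representation.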