{"title":"Semantic Relational Extraction via Learning Syntactic Structural Representation","authors":"Nguyen Éric, Tsuruoka Ari","doi":"10.21203/rs.3.rs-3593929/v1","DOIUrl":null,"url":null,"abstract":"Abstract Leveraging distant supervision for relation extraction has emerged as a robust method to harness large text corpora, widely adopted to unearth new relational facts from unstructured text. Prevailing neural approaches have made significant strides in relation extraction by representing sentences in compact, low-dimensional vectors. However, the incorporation of syntactic nuances when modeling entities remains underexplored. Our study introduces a novel method for crafting syntax-aware entity embeddings to boost neural relation extraction. We start by encoding entity contexts within dependency trees through tree-GRU to generate sentence-level entity embeddings. We then apply both intra-sentence and inter-sentence attention mechanisms to distill entity embeddings at the sentence set level, considering every occurrence of the pertinent entity pair. The culmination of our methodology is the fusion of sentence and entity embeddings for relation classification. Our experiments on a benchmark dataset indicate that our approach harnesses the full potential of informative instances, thereby setting new benchmarks for relation extraction performance.","PeriodicalId":500086,"journal":{"name":"Research Square (Research Square)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2023-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Research Square (Research Square)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.21203/rs.3.rs-3593929/v1","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Leveraging distant supervision for relation extraction has emerged as a robust way to harness large text corpora and is widely adopted to unearth new relational facts from unstructured text. Prevailing neural approaches have made significant strides in relation extraction by representing sentences as compact, low-dimensional vectors. However, the incorporation of syntactic nuances when modeling entities remains underexplored. Our study introduces a novel method for crafting syntax-aware entity embeddings to boost neural relation extraction. We start by encoding entity contexts within dependency trees with a tree-GRU to generate sentence-level entity embeddings. We then apply both intra-sentence and inter-sentence attention to distill entity embeddings at the sentence-set level, considering every occurrence of the pertinent entity pair. Finally, we fuse the sentence and entity embeddings for relation classification. Our experiments on a benchmark dataset indicate that our approach harnesses the full potential of informative instances, thereby setting new benchmarks for relation extraction performance.
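The abstract gives no implementation details, so the following is only a minimal sketch of the pipeline it describes: a child-sum tree-GRU node update over a dependency tree to build sentence-level entity embeddings, and a selective-attention aggregator over all sentences mentioning an entity pair to build sentence-set (bag) level embeddings. The class names, gate equations, dimensions, and attention form below are assumptions for illustration, not the authors' specification.

```python
# Hypothetical sketch (PyTorch) of the two core steps described in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChildSumTreeGRUCell(nn.Module):
    """GRU-style node update whose 'previous hidden state' is the sum of the
    hidden states of a node's children in the dependency tree (assumed form)."""

    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.wz = nn.Linear(in_dim + hid_dim, hid_dim)  # update gate
        self.wr = nn.Linear(in_dim + hid_dim, hid_dim)  # reset gate
        self.wh = nn.Linear(in_dim + hid_dim, hid_dim)  # candidate state

    def forward(self, x, child_h):
        # x: (in_dim,) word embedding of the current node
        # child_h: (num_children, hid_dim); pass an empty tensor for leaves
        if child_h.numel():
            h_sum = child_h.sum(dim=0)
        else:
            h_sum = torch.zeros(self.wz.out_features)
        z = torch.sigmoid(self.wz(torch.cat([x, h_sum])))
        r = torch.sigmoid(self.wr(torch.cat([x, h_sum])))
        h_tilde = torch.tanh(self.wh(torch.cat([x, r * h_sum])))
        return (1 - z) * h_sum + z * h_tilde


def selective_attention(embeddings, relation_query):
    # embeddings: (num_sentences, dim) sentence-level embeddings for one bag,
    # i.e. all sentences containing the same entity pair.
    # relation_query: (dim,) learned query vector; attention weights favor
    # informative instances and down-weight noisy ones.
    scores = embeddings @ relation_query   # (num_sentences,)
    alpha = F.softmax(scores, dim=0)       # normalized attention weights
    return alpha @ embeddings              # (dim,) bag-level embedding
```

In such a setup, the bag-level entity embeddings produced by the attention step would be concatenated with a bag-level sentence embedding and passed through a linear layer plus softmax for relation classification; that fusion step is likewise an assumed design, consistent with but not stated by the abstract.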