{"title":"Global Span Semantic Dependency Awareness and Filtering Network for nested named entity recognition","authors":"Yunlei Sun, Xiaoyang Wang, Haosheng Wu, Miao Hu","doi":"10.1016/j.neucom.2024.129035","DOIUrl":null,"url":null,"abstract":"<div><div>Span-based methods for nested named entity recognition (NER) are effective in handling the complexities of nested entities with hierarchical structures. However, these methods often overlook valid semantic dependencies among global spans, resulting in a partial loss of semantic information. To address this issue, we propose the Global Span Semantic Dependency Awareness and Filtering Network (GSSDAF). Our model begins with BERT for initial sentence encoding. Following this, a span semantic representation matrix is generated using a multi-head biaffine attention mechanism. We introduce the Global Span Dependency Awareness (GSDA) module to capture valid semantic dependencies among all spans, and the Local Span Dependency Enhancement (LSDE) module to selectively enhance key local dependencies. The enhanced span semantic representation matrix is then decoded to classify the spans. We evaluated our model on seven public datasets. Experimental results demonstrate that our model effectively handles nested NER, achieving higher F1 scores compared to baselines. Ablation experiments confirm the effectiveness of each module. Further analysis indicates that our model can learn valid semantic dependencies between global spans, significantly improving the accuracy of nested entity recognition. Our code is available at <span><span>https://github.com/Shaun-Wong/GSSDAF</span><svg><path></path></svg></span>.</div></div>","PeriodicalId":19268,"journal":{"name":"Neurocomputing","volume":"617 ","pages":"Article 129035"},"PeriodicalIF":5.5000,"publicationDate":"2024-11-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neurocomputing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S092523122401806X","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
引用次数: 0
Abstract
Span-based methods for nested named entity recognition (NER) are effective in handling the complexities of nested entities with hierarchical structures. However, these methods often overlook valid semantic dependencies among global spans, resulting in a partial loss of semantic information. To address this issue, we propose the Global Span Semantic Dependency Awareness and Filtering Network (GSSDAF). Our model begins with BERT for initial sentence encoding. Following this, a span semantic representation matrix is generated using a multi-head biaffine attention mechanism. We introduce the Global Span Dependency Awareness (GSDA) module to capture valid semantic dependencies among all spans, and the Local Span Dependency Enhancement (LSDE) module to selectively enhance key local dependencies. The enhanced span semantic representation matrix is then decoded to classify the spans. We evaluated our model on seven public datasets. Experimental results demonstrate that our model handles nested NER effectively, achieving higher F1 scores than the baselines. Ablation experiments confirm the effectiveness of each module. Further analysis indicates that our model can learn valid semantic dependencies between global spans, significantly improving the accuracy of nested entity recognition. Our code is available at https://github.com/Shaun-Wong/GSSDAF.
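To make the encoding step concrete, the sketch below shows one way a multi-head biaffine attention could map BERT token encodings to a span representation matrix, as described in the abstract. This is a minimal illustrative sketch, not the authors' released implementation (see their GitHub repository for that); the class name `MultiHeadBiaffineSpanMatrix`, the head count, and all dimensions are assumptions made for the example.

```python
# Illustrative sketch only: a multi-head biaffine layer that turns token
# encodings (B, L, d) into a span representation matrix (B, L, L, out_dim).
# All names and dimensions are assumptions, not the paper's actual code.
import torch
import torch.nn as nn


class MultiHeadBiaffineSpanMatrix(nn.Module):
    def __init__(self, hidden_dim: int, out_dim: int, num_heads: int = 4):
        super().__init__()
        assert out_dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = out_dim // num_heads
        # Separate projections for the span-start and span-end roles of each token.
        self.start_proj = nn.Linear(hidden_dim, out_dim)
        self.end_proj = nn.Linear(hidden_dim, out_dim)
        # One biaffine tensor per head; the +1 slots add bias (linear) terms.
        self.biaffine = nn.Parameter(
            0.02 * torch.randn(num_heads, self.head_dim + 1, self.head_dim, self.head_dim + 1)
        )

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        b, l, _ = tokens.shape
        h, d = self.num_heads, self.head_dim
        # Role-specific views of every token: (B, L, H, Dh).
        s = self.start_proj(tokens).view(b, l, h, d)
        e = self.end_proj(tokens).view(b, l, h, d)
        # Append a constant 1 feature so the biaffine form includes linear terms.
        ones = tokens.new_ones(b, l, h, 1)
        s = torch.cat([s, ones], dim=-1)  # (B, L, H, Dh+1)
        e = torch.cat([e, ones], dim=-1)  # (B, L, H, Dh+1)
        # Biaffine product: entry (i, j) combines start token i with end token j.
        left = torch.einsum("bihx,hxdy->bhidy", s, self.biaffine)   # (B, H, L, Dh, Dh+1)
        span = torch.einsum("bhidy,bjhy->bhijd", left, e)           # (B, H, L, L, Dh)
        # Merge heads into the channel dimension: (B, L, L, H*Dh) = (B, L, L, out_dim).
        return span.permute(0, 2, 3, 1, 4).reshape(b, l, l, h * d)


if __name__ == "__main__":
    # Toy usage: 2 sentences of 10 tokens with 768-dim BERT-like encodings.
    encoder_out = torch.randn(2, 10, 768)
    span_matrix = MultiHeadBiaffineSpanMatrix(hidden_dim=768, out_dim=256)(encoder_out)
    print(span_matrix.shape)  # torch.Size([2, 10, 10, 256])
```

In this reading, entry (i, j) of the resulting matrix represents the span starting at token i and ending at token j; the paper's GSDA and LSDE modules would then operate on this matrix before span classification.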
Journal Introduction
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Its essential topics are neurocomputing theory, practice, and applications.