{"title":"使用语法感知注意机制推进命名实体识别","authors":"Tomasz Jason, Muly Neumann, Ammar Adam","doi":"10.21203/rs.3.rs-3593960/v1","DOIUrl":null,"url":null,"abstract":"Abstract Named entity recognition (NER) stands as a pivotal task in natural language processing (NLP). Recent advancements have considerably enhanced its effectiveness. However, a gap remains in these systems' ability to fully leverage the recursive dynamics of linguistic structures. This study introduces a novel approach, intertwining the recognition of entities with a deeper understanding of linguistic syntax and tree-like structures. Utilizing a Tree-LSTM that operates under the guidance of dependency trees, we capture the intricate syntactic relationships between words. This process is further refined through the dual application of relative and global attention mechanisms. The relative attention zone is on critical words in the context of each evaluated word, whereas global attention identifies keywords throughout the entire sentence. By projecting these attention-modulated features into a tagging space, our model employs a conditional random field classifier to determine entity labels. We discover that our model adeptly highlights verbs that reveal the types of entities, influenced by their syntactic roles within sentences. Our model sets a new benchmark for performance on two prominent datasets, substantiating our approach.","PeriodicalId":500086,"journal":{"name":"Research Square (Research Square)","volume":"7 7","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Advancing Named Entity Recognition with Syntax-Aware Attention Mechanisms\",\"authors\":\"Tomasz Jason, Muly Neumann, Ammar Adam\",\"doi\":\"10.21203/rs.3.rs-3593960/v1\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Abstract Named entity recognition (NER) stands as a pivotal task in natural language processing (NLP). Recent advancements have considerably enhanced its effectiveness. However, a gap remains in these systems' ability to fully leverage the recursive dynamics of linguistic structures. This study introduces a novel approach, intertwining the recognition of entities with a deeper understanding of linguistic syntax and tree-like structures. Utilizing a Tree-LSTM that operates under the guidance of dependency trees, we capture the intricate syntactic relationships between words. This process is further refined through the dual application of relative and global attention mechanisms. The relative attention zone is on critical words in the context of each evaluated word, whereas global attention identifies keywords throughout the entire sentence. By projecting these attention-modulated features into a tagging space, our model employs a conditional random field classifier to determine entity labels. We discover that our model adeptly highlights verbs that reveal the types of entities, influenced by their syntactic roles within sentences. 
Our model sets a new benchmark for performance on two prominent datasets, substantiating our approach.\",\"PeriodicalId\":500086,\"journal\":{\"name\":\"Research Square (Research Square)\",\"volume\":\"7 7\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-11-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Research Square (Research Square)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.21203/rs.3.rs-3593960/v1\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Research Square (Research Square)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.21203/rs.3.rs-3593960/v1","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Advancing Named Entity Recognition with Syntax-Aware Attention Mechanisms
Abstract: Named entity recognition (NER) is a pivotal task in natural language processing (NLP), and recent advances have considerably improved its effectiveness. However, existing systems still fall short of fully exploiting the recursive structure of language. This study introduces a novel approach that intertwines entity recognition with deeper modeling of linguistic syntax and tree-like structure. Using a Tree-LSTM guided by dependency trees, we capture the intricate syntactic relationships between words. These representations are further refined by two complementary attention mechanisms: relative attention zeroes in on critical words in the context of each evaluated word, while global attention identifies keywords across the entire sentence. The attention-modulated features are projected into a tagging space, where a conditional random field (CRF) classifier assigns entity labels. We find that the model adeptly highlights verbs whose syntactic roles within sentences reveal entity types. Our model sets a new performance benchmark on two prominent datasets, substantiating the approach.
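To make the described pipeline concrete, the following is a minimal sketch of its three stages: a child-sum Tree-LSTM composed bottom-up over a dependency tree, a simple relative (per-word) and global (sentence-level) attention pair, and a projection of the attention-modulated features into a tag space whose emission scores a CRF layer (omitted here) would consume. All module names, dimensions, and the exact attention formulations are assumptions for illustration; the abstract gives only the high-level design, and the paper's actual implementation may differ.

```python
# Hedged sketch of the syntax-aware NER pipeline; details are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChildSumTreeLSTMCell(nn.Module):
    """Child-sum Tree-LSTM cell in the style of Tai et al. (2015)."""

    def __init__(self, in_dim: int, hid_dim: int):
        super().__init__()
        self.W_iou = nn.Linear(in_dim, 3 * hid_dim)
        self.U_iou = nn.Linear(hid_dim, 3 * hid_dim, bias=False)
        self.W_f = nn.Linear(in_dim, hid_dim)
        self.U_f = nn.Linear(hid_dim, hid_dim, bias=False)

    def forward(self, x, child_h, child_c):
        # x: (in_dim,); child_h, child_c: (num_children, hid_dim)
        h_sum = child_h.sum(dim=0)  # sum over children's hidden states
        i, o, u = (self.W_iou(x) + self.U_iou(h_sum)).chunk(3, dim=-1)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        f = torch.sigmoid(self.W_f(x) + self.U_f(child_h))  # one gate per child
        c = i * u + (f * child_c).sum(dim=0)
        return o * torch.tanh(c), c


def run_tree_lstm(cell, emb, children, node, hid_dim, states):
    """Bottom-up recursion over a dependency tree given as child lists."""
    kids = children[node]
    if kids:
        hs, cs = zip(*(run_tree_lstm(cell, emb, children, k, hid_dim, states)
                       for k in kids))
        child_h, child_c = torch.stack(hs), torch.stack(cs)
    else:  # leaf: zero child states
        child_h = child_c = torch.zeros(1, hid_dim)
    h, c = cell(emb[node], child_h, child_c)
    states[node] = h
    return h, c


class DualAttentionTagger(nn.Module):
    """Relative + global attention over Tree-LSTM states, then tag emissions."""

    def __init__(self, hid_dim: int, num_tags: int):
        super().__init__()
        self.rel_score = nn.Bilinear(hid_dim, hid_dim, 1)      # relative attention
        self.glob_query = nn.Parameter(torch.randn(hid_dim))   # global attention
        self.proj = nn.Linear(3 * hid_dim, num_tags)

    def forward(self, H):
        # H: (seq_len, hid_dim) Tree-LSTM states in sentence order
        n = H.size(0)
        # Relative attention: each word scores every word w.r.t. itself.
        q = H.unsqueeze(1).expand(n, n, -1).reshape(n * n, -1)
        k = H.unsqueeze(0).expand(n, n, -1).reshape(n * n, -1)
        rel = F.softmax(self.rel_score(q, k).view(n, n), dim=-1) @ H
        # Global attention: one sentence-level distribution over keywords.
        glob_w = F.softmax(H @ self.glob_query, dim=0)          # (n,)
        glob = (glob_w.unsqueeze(-1) * H).sum(0).expand(n, -1)
        # Project attention-modulated features into the tagging space;
        # a CRF on top of these emissions would assign the entity labels.
        return self.proj(torch.cat([H, rel, glob], dim=-1))


# Toy usage: a 5-token sentence whose dependency tree is rooted at token 1.
emb_dim, hid_dim, num_tags = 50, 64, 9
emb = torch.randn(5, emb_dim)
children = [[], [0, 3], [], [2, 4], []]  # children[i] = dependents of token i
states = {}
run_tree_lstm(ChildSumTreeLSTMCell(emb_dim, hid_dim), emb, children,
              node=1, hid_dim=hid_dim, states=states)
H = torch.stack([states[i] for i in range(5)])
emissions = DualAttentionTagger(hid_dim, num_tags)(H)  # (5, num_tags)
```

A bilinear score is used here for relative attention and a learned query vector for global attention purely as plausible stand-ins; the paper does not specify its scoring functions, and any standard attention parameterization could be substituted.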