Title: On the Surprising Effectiveness of Name Matching Alone in Autoregressive Entity Linking
Authors: Elliot Schumacher, J. Mayfield, Mark Dredze
DOI: 10.18653/v1/2023.matching-1.6
Published in: Proceedings of the First Workshop on Matching From Unstructured and Structured Data (MATCHING 2023)
Citations: 0
Abstract
Fifteen years of work on entity linking has established the importance of different information sources in making linking decisions: mention and entity name similarity, contextual relevance, and features of the knowledge base. Modern state-of-the-art systems build on these features, including through neural representations (Wu et al., 2020). In contrast to this trend, the autoregressive language model GENRE (De Cao et al., 2021) generates normalized entity names for mentions and beats many other entity linking systems, despite making no use of knowledge base (KB) information. How is this possible? We analyze the behavior of GENRE on several entity linking datasets and demonstrate that its performance stems from memorization of name patterns. In contrast, it fails in cases that might benefit from using the KB. We experiment with a modification to the model to enable it to utilize KB information, highlighting challenges to incorporating traditional entity linking information sources into autoregressive models.
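As background for the abstract above: GENRE (De Cao et al., 2021) links a mention by autoregressively generating an entity's canonical name, constraining decoding so that every partial output remains a prefix of some name in the entity inventory. The sketch below is a minimal character-level illustration of that idea, not the paper's implementation; the toy name list and scoring function are invented for the example.

```python
# Illustrative sketch (assumed toy setup, not GENRE's actual code):
# constrained decoding over a prefix trie of entity names, so the model
# can only ever emit a valid KB entity name.

from typing import Callable, Dict, List

END = "$"  # end-of-name marker stored in the trie


def build_trie(names: List[str]) -> Dict:
    """Build a character-level prefix trie over entity names."""
    root: Dict = {}
    for name in names:
        node = root
        for ch in name:
            node = node.setdefault(ch, {})
        node[END] = {}
    return root


def allowed_next(trie: Dict, prefix: str) -> List[str]:
    """Characters the decoder may emit after `prefix` (END = stop)."""
    node = trie
    for ch in prefix:
        if ch not in node:
            return []
        node = node[ch]
    return sorted(node.keys())


def greedy_decode(trie: Dict, score: Callable[[str, str], float]) -> str:
    """Greedy constrained decoding: at each step, pick the highest-scoring
    character among the continuations the trie allows."""
    out = ""
    while True:
        options = allowed_next(trie, out)
        if not options:
            return out
        best = max(options, key=lambda ch: score(out, ch))
        if best == END:
            return out
        out += best


# Toy entity inventory and a stand-in "language model" score that
# happens to prefer continuations toward "Paris Hilton".
names = ["Paris", "Paris Hilton", "Parma"]
trie = build_trie(names)


def score(prefix: str, ch: str) -> float:
    target = "Paris Hilton" + END
    return 1.0 if len(prefix) < len(target) and target[len(prefix)] == ch else 0.0


print(greedy_decode(trie, score))  # Paris Hilton
```

In the real system the scorer is a pretrained sequence-to-sequence model and the trie is built over token sequences rather than characters, but the constraint mechanism is the same: the KB contributes only its name inventory, which is what makes the paper's name-memorization finding possible.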