An Evidential Classifier with Multiple Pre-trained Language Models for Nested Named Entity Recognition
Haitao Liu, Jihua Song, Weiming Peng
Proceedings of the 2023 3rd International Conference on Artificial Intelligence, Automation and Algorithms
Published: 2023-07-21. DOI: 10.1145/3611450.3611477
Citations: 0
Abstract
Nested named entity recognition (NER) is an important and challenging task in information extraction. One effective approach is to detect candidate regions in sentences, which are later classified by neural networks. Since pre-trained language models (PLMs) were introduced, nested NER models have benefited greatly from them. However, a given model typically employs only one PLM, and its performance varies depending on which PLM is used. We observe that different PLMs produce conflicting predictions, which account for this variation. Thus, it remains to be investigated whether a model could achieve even better performance by comprehensively analyzing the results from various PLMs. In this paper, we propose an evidential classifier with multiple PLMs for nested NER. First, the well-known deep exhaustive model is trained separately with different PLMs, whose predictions are then treated as pieces of evidence that can be represented in the framework of Dempster-Shafer theory. Finally, the pooled evidence is obtained using a combination rule, based on which the inference is performed. Experiments are conducted on the GENIA dataset, and detailed analysis demonstrates the merits of our model.
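The evidence pooling described above rests on Dempster's rule of combination, which merges mass functions from independent sources while redistributing conflicting mass. The sketch below is a minimal, generic implementation of that rule; the label names, mass values, and the recasting of classifier outputs as singleton-focused mass functions are illustrative assumptions, not details from the paper.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Each mass function maps frozensets (focal elements over the frame
    of discernment) to masses that sum to 1.
    """
    combined = {}
    conflict = 0.0  # K: total mass assigned to the empty intersection
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence; rule undefined")
    # Normalize by 1 - K to redistribute the conflicting mass
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Hypothetical example: two PLM-based classifiers' beliefs over entity
# labels, recast as mass functions with some mass left on the full
# frame (total ignorance). Values are made up for illustration.
PROTEIN, DNA = frozenset({"protein"}), frozenset({"dna"})
THETA = PROTEIN | DNA  # the full frame of discernment
m_plm_a = {PROTEIN: 0.7, DNA: 0.2, THETA: 0.1}
m_plm_b = {PROTEIN: 0.6, DNA: 0.3, THETA: 0.1}

pooled = combine(m_plm_a, m_plm_b)
decision = max(pooled, key=pooled.get)  # label with the most pooled mass
```

Because each source reserves some mass for the full frame, agreement between the two sources sharpens the pooled belief in the shared label, while the normalization by 1 - K discards the mass placed on mutually exclusive conclusions.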