DAE-NER: Dual-channel attention enhancement for Chinese named entity recognition

IF 3.1 | CAS Zone 3, Computer Science | Q2 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | Computer Speech and Language | Pub Date: 2023-10-26 | DOI: 10.1016/j.csl.2023.101581
Jingxin Liu, Mengzhe Sun, Wenhao Zhang, Gengquan Xie, Yongxia Jing, Xiulai Li, Zhaoxin Shi
{"title":"DAE-NER: Dual-channel attention enhancement for Chinese named entity recognition","authors":"Jingxin Liu ,&nbsp;Mengzhe Sun ,&nbsp;Wenhao Zhang ,&nbsp;Gengquan Xie ,&nbsp;Yongxia Jing ,&nbsp;Xiulai Li ,&nbsp;Zhaoxin Shi","doi":"10.1016/j.csl.2023.101581","DOIUrl":null,"url":null,"abstract":"<div><p>Named Entity Recognition (NER) is an important component of Natural Language Processing (NLP) and is a fundamental yet challenging task in text analysis. Recently, NER models for Chinese-language characters have received considerable attention. Owing to the complexity and ambiguity of the Chinese language, the same semantic features have different levels of importance in different contexts. However, existing literature on Chinese Named Entity recognition (CNER) does not capture this difference in importance. To tackle this problem, we propose a new method, referred to as Dual-channel Attention Enhancement for Chinese Named Entity Recognition (DAE-NER). Specifically, we design compression and decompression mechanisms to adapt Chinese language characters to different contexts. By adjusting the weight of the semantic feature vector, the semantic weight is reconstructed to alleviate the interference of contextual differences in semantics. Moreover, in order to enhance the semantic representation of the different granularities in Chinese text, we design attention enhancement modules at the character and sentence levels. These modules dynamically learn the differences in semantic features to enhance important semantic representations in different dimensions. 
Extensive experiments on four benchmark datasets, namely MSRA, People Daily, Resume, and Weibo, have demonstrated that the proposed DAE-NER can effectively improve the overall performance of CNER.</p></div>","PeriodicalId":50638,"journal":{"name":"Computer Speech and Language","volume":null,"pages":null},"PeriodicalIF":3.1000,"publicationDate":"2023-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computer Speech and Language","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0885230823001006","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
引用次数: 0

Abstract

Named Entity Recognition (NER) is an important component of Natural Language Processing (NLP) and a fundamental yet challenging task in text analysis. Recently, NER models for Chinese text have received considerable attention. Owing to the complexity and ambiguity of the Chinese language, the same semantic features carry different levels of importance in different contexts. However, the existing literature on Chinese Named Entity Recognition (CNER) does not capture this difference in importance. To tackle this problem, we propose a new method, referred to as Dual-channel Attention Enhancement for Chinese Named Entity Recognition (DAE-NER). Specifically, we design compression and decompression mechanisms to adapt Chinese characters to different contexts: by adjusting the weights of the semantic feature vector, the semantic weighting is reconstructed to alleviate the interference of contextual differences in semantics. Moreover, to enhance the semantic representation of different granularities in Chinese text, we design attention enhancement modules at the character and sentence levels, which dynamically learn differences in semantic features to strengthen important semantic representations in different dimensions. Extensive experiments on four benchmark datasets, namely MSRA, People Daily, Resume, and Weibo, demonstrate that the proposed DAE-NER effectively improves the overall performance of CNER.
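The abstract gives no implementation details, but the "compression and decompression" reweighting it describes reads like a squeeze-and-excitation-style gate over semantic feature channels. A minimal sketch of one plausible reading follows; all function names, shapes, and the reduction ratio `r` are assumptions, not the authors' code:

```python
import numpy as np

def se_reweight(H, W1, W2):
    """Squeeze-and-excitation-style reweighting of semantic channels.

    H:  (seq_len, d) character representations
    W1: (d, d // r)  compression ("squeeze") projection
    W2: (d // r, d)  decompression ("excitation") projection
    """
    # Compress: summarize each channel over the whole sequence
    z = H.mean(axis=0)                                           # (d,)
    # Decompress: bottleneck MLP with a sigmoid gate in (0, 1)
    s = 1.0 / (1.0 + np.exp(-(np.maximum(z @ W1, 0.0) @ W2)))    # (d,)
    # Rescale every character's features channel-wise
    return H * s                                                 # (seq_len, d)

# Toy usage with random weights (hypothetical sizes)
rng = np.random.default_rng(0)
seq_len, d, r = 5, 8, 2
H = rng.normal(size=(seq_len, d))
W1 = rng.normal(size=(d, d // r))
W2 = rng.normal(size=(d // r, d))
out = se_reweight(H, W1, W2)
assert out.shape == (seq_len, d)
```

Because the gate lies strictly in (0, 1), the module can only attenuate channels, letting contextually important features dominate relative to suppressed ones; the paper's character- and sentence-level modules would apply such reweighting at two granularities.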

Source journal: Computer Speech and Language (Engineering & Technology – Computer Science: Artificial Intelligence)
CiteScore: 11.30
Self-citation rate: 4.70%
Annual publications: 80
Review time: 22.9 weeks
Journal description: Computer Speech & Language publishes reports of original research related to the recognition, understanding, production, coding and mining of speech and language. The speech and language sciences have a long history, but it is only relatively recently that large-scale implementation of and experimentation with complex models of speech and language processing has become feasible. Such research is often carried out somewhat separately by practitioners of artificial intelligence, computer science, electronic engineering, information retrieval, linguistics, phonetics, or psychology.
Latest articles in this journal:
Editorial Board
Enhancing analysis of diadochokinetic speech using deep neural networks
Copiously Quote Classics: Improving Chinese Poetry Generation with historical allusion knowledge
Significance of chirp MFCC as a feature in speech and audio applications
Artificial disfluency detection, uh no, disfluency generation for the masses