KRongBERT: Enhanced factorization-based morphological approach for the Korean pretrained language model

IF 6.9 · CAS Region 1 (Management) · Q1 Computer Science, Information Systems · Information Processing & Management · Pub Date: 2025-05-01 · Epub Date: 2025-01-27 · DOI: 10.1016/j.ipm.2025.104072
Hyunwook Yu, Yejin Cho, Geunchul Park, Mucheol Kim
Information Processing & Management, 62(3), Article 104072. Citations: 0.

Abstract

The bidirectional encoder representations from transformers (BERT) model has achieved remarkable success in various natural language processing tasks for Latin-based languages. However, the Korean language presents unique challenges with limited data resources and complex linguistic structures. In this paper, we present KRongBERT, a language model specifically designed through a morphological approach to effectively address the unique linguistic complexities of Korean. KRongBERT mitigates the out-of-vocabulary issues that arise with byte-pair-encoding tokenizers in Korean and incorporates language-specific embedding layers to enhance understanding. Our model demonstrates up to a 1.56% improvement in performance on specific natural language understanding tasks compared to traditional BERT implementations. Notably, KRongBERT achieves superior performance compared to existing state-of-the-art Korean BERT models while utilizing only 11.42% of the data required by other models. Our experiments demonstrate that KRongBERT efficiently handles the complexities of the Korean language, outperforming current state-of-the-art approaches. The code is publicly available at https://github.com/Splo2t/KRongBERT.
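The factorization the abstract refers to builds on the fact that every Hangul syllable block is algebraically composed from smaller jamo units (initial consonant, vowel, optional final consonant), so a syllable unseen during training can still be decomposed into known sub-characters, reducing out-of-vocabulary failures. The sketch below is not the paper's implementation; it only illustrates the standard Unicode arithmetic for this jamo factorization (the function name `decompose` is ours):

```python
def decompose(ch: str) -> tuple:
    """Factor one Hangul syllable block into its jamo; pass other chars through."""
    code = ord(ch) - 0xAC00  # Hangul syllables occupy U+AC00..U+D7A3 (11,172 blocks)
    if not 0 <= code < 11172:
        return (ch,)  # not a precomposed syllable: leave as-is
    lead = chr(0x1100 + code // 588)          # 21 vowels * 28 tails = 588 per lead
    vowel = chr(0x1161 + (code % 588) // 28)  # 28 possible tails per vowel
    tail_index = code % 28                    # 0 means "no final consonant"
    if tail_index:
        return (lead, vowel, chr(0x11A7 + tail_index))
    return (lead, vowel)

# Example: '한' (U+D55C) factors into ㅎ + ㅏ + ㄴ jamo.
print(decompose('한'))
```

A tokenizer operating on these jamo sequences (or on morpheme boundaries produced by a morphological analyzer) works with a far smaller effective alphabet than one operating on whole syllable blocks, which is the general motivation behind morphology-aware Korean subword vocabularies.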
Source Journal

Information Processing & Management (Engineering & Technology: Computer Science, Information Systems)
CiteScore: 17.00
Self-citation rate: 11.60%
Articles per year: 276
Review time: 39 days
About the journal: Information Processing and Management is dedicated to publishing cutting-edge original research at the convergence of computing and information science. Its scope encompasses theory, methods, and applications across various domains, including advertising, business, health, information science, information technology marketing, and social computing. The journal caters to the interests of both primary researchers and practitioners by offering an effective platform for the timely dissemination of advanced and topical issues in this interdisciplinary field, with particular emphasis on original research articles, research survey articles, research method articles, and articles addressing critical applications of research.