Cross-linguistic Comparison of Linguistic Feature Encoding in BERT Models for Typologically Different Languages

Yulia Otmakhova, Karin M. Verspoor, Jey Han Lau
{"title":"Cross-linguistic Comparison of Linguistic Feature Encoding in BERT Models for Typologically Different Languages","authors":"Yulia Otmakhova, Karin M. Verspoor, Jey Han Lau","doi":"10.18653/v1/2022.sigtyp-1.4","DOIUrl":null,"url":null,"abstract":"Though recently there have been an increased interest in how pre-trained language models encode different linguistic features, there is still a lack of systematic comparison between languages with different morphology and syntax. In this paper, using BERT as an example of a pre-trained model, we compare how three typologically different languages (English, Korean, and Russian) encode morphology and syntax features across different layers. In particular, we contrast languages which differ in a particular aspect, such as flexibility of word order, head directionality, morphological type, presence of grammatical gender, and morphological richness, across four different tasks.","PeriodicalId":255232,"journal":{"name":"Proceedings of the 4th Workshop on Research in Computational Linguistic Typology and Multilingual NLP","volume":"106 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 4th Workshop on Research in Computational Linguistic Typology and Multilingual NLP","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.18653/v1/2022.sigtyp-1.4","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 3

Abstract

Though there has recently been increased interest in how pre-trained language models encode different linguistic features, systematic comparisons between languages with different morphology and syntax are still lacking. In this paper, using BERT as an example of a pre-trained model, we compare how three typologically different languages (English, Korean, and Russian) encode morphological and syntactic features across layers. In particular, we contrast languages which differ in a particular aspect, such as flexibility of word order, head directionality, morphological type, presence of grammatical gender, and morphological richness, across four different tasks.
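To make the layer-wise comparison concrete, below is a minimal sketch of the general technique the abstract describes: training a simple diagnostic probe on each BERT layer's representations and comparing per-layer accuracy. This is not the authors' exact setup; the model checkpoint, the toy grammatical-number task, and all sentences and labels are illustrative assumptions, using the HuggingFace transformers library and scikit-learn.

# A minimal, illustrative sketch of layer-wise probing -- NOT the paper's
# exact tasks or data. Assumes multilingual BERT via HuggingFace
# `transformers` and a toy task: grammatical number of the subject.
import torch
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-multilingual-cased"  # assumption: an mBERT checkpoint
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME, output_hidden_states=True)
model.eval()

def layer_reps(sentences):
    """[CLS] vectors for each sentence at every layer: list[layer] -> list of (dim,) arrays."""
    per_layer = None
    with torch.no_grad():
        for sent in sentences:
            enc = tokenizer(sent, return_tensors="pt", truncation=True)
            # hidden_states: embedding output plus one tensor per transformer layer
            hidden = model(**enc).hidden_states
            if per_layer is None:
                per_layer = [[] for _ in hidden]
            for i, h in enumerate(hidden):
                per_layer[i].append(h[0, 0].numpy())  # [CLS] token vector
    return per_layer

# Hypothetical toy data: 0 = singular subject, 1 = plural subject.
train_sents = ["The cat sleeps.", "The cats sleep.", "A dog barks.", "Dogs bark."]
train_labels = [0, 1, 0, 1]
test_sents = ["The bird sings.", "The birds sing."]
test_labels = [0, 1]

train_reps, test_reps = layer_reps(train_sents), layer_reps(test_sents)
for layer in range(len(train_reps)):
    # One linear probe per layer; its accuracy indicates how linearly
    # decodable the feature is at that depth.
    probe = LogisticRegression(max_iter=1000).fit(train_reps[layer], train_labels)
    acc = accuracy_score(test_labels, probe.predict(test_reps[layer]))
    print(f"layer {layer:2d}  probe accuracy = {acc:.2f}")

Repeating such a probe for each language and each contrasted property (word order flexibility, head directionality, grammatical gender, and so on) yields per-layer accuracy curves that can then be compared across typologically different languages, which is the style of comparison the abstract outlines.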