Multilingual Transformer Language Model for Speech Recognition in Low-resource Languages

Li Miao, Jian Wu, Piyush Behre, Shuangyu Chang, S. Parthasarathy
{"title":"面向低资源语言语音识别的多语言转换语言模型","authors":"Li Miao, Jian Wu, Piyush Behre, Shuangyu Chang, S. Parthasarathy","doi":"10.1109/SNAMS58071.2022.10062774","DOIUrl":null,"url":null,"abstract":"It is challenging to train and deploy Transformer Language Models (LMs) for hybrid speech recognition second pass re-ranking in low-resource languages due to (1) data scarcity in low-resource languages, (2) expensive computing costs for training and refreshing 100+ monolingual models, and (3) hosting inefficiency considering sparse traffic. In this study, we present a novel way to group multiple low-resource locales together and optimize the performance of Multilingual Transformer LMs in ASR. Our Locale-group Multilingual Transformer LMs outperform traditional multilingual LMs along with reducing maintenance costs and operating expenses. Further, for high-traffic locales where deploying monolingual models is feasible, we show that fine-tuning our locale-group multilingual LMs produces better monolingual LM candidates than baseline monolingual LMs.","PeriodicalId":371668,"journal":{"name":"2022 Ninth International Conference on Social Networks Analysis, Management and Security (SNAMS)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2022-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Multilingual Transformer Language Model for Speech Recognition in Low-resource Languages\",\"authors\":\"Li Miao, Jian Wu, Piyush Behre, Shuangyu Chang, S. Parthasarathy\",\"doi\":\"10.1109/SNAMS58071.2022.10062774\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"It is challenging to train and deploy Transformer Language Models (LMs) for hybrid speech recognition second pass re-ranking in low-resource languages due to (1) data scarcity in low-resource languages, (2) expensive computing costs for training and refreshing 100+ monolingual models, and (3) hosting inefficiency considering sparse traffic. In this study, we present a novel way to group multiple low-resource locales together and optimize the performance of Multilingual Transformer LMs in ASR. Our Locale-group Multilingual Transformer LMs outperform traditional multilingual LMs along with reducing maintenance costs and operating expenses. 
Further, for high-traffic locales where deploying monolingual models is feasible, we show that fine-tuning our locale-group multilingual LMs produces better monolingual LM candidates than baseline monolingual LMs.\",\"PeriodicalId\":371668,\"journal\":{\"name\":\"2022 Ninth International Conference on Social Networks Analysis, Management and Security (SNAMS)\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-09-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 Ninth International Conference on Social Networks Analysis, Management and Security (SNAMS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/SNAMS58071.2022.10062774\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 Ninth International Conference on Social Networks Analysis, Management and Security (SNAMS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SNAMS58071.2022.10062774","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

It is challenging to train and deploy Transformer Language Models (LMs) for hybrid speech recognition second-pass re-ranking in low-resource languages due to (1) data scarcity in low-resource languages, (2) the expensive computing cost of training and refreshing 100+ monolingual models, and (3) hosting inefficiency given sparse traffic. In this study, we present a novel way to group multiple low-resource locales together and optimize the performance of Multilingual Transformer LMs in ASR. Our locale-group Multilingual Transformer LMs outperform traditional multilingual LMs while reducing maintenance costs and operating expenses. Further, for high-traffic locales where deploying monolingual models is feasible, we show that fine-tuning our locale-group multilingual LMs produces better monolingual LM candidates than baseline monolingual LMs.
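The second-pass re-ranking the abstract refers to can be illustrated with a minimal sketch: score each N-best hypothesis from the hybrid first pass with a causal Transformer LM, then interpolate that score with the first-pass score and re-sort. This assumes a HuggingFace-style causal LM; the `gpt2` checkpoint and the `lm_weight` value are illustrative placeholders, not the locale-group models or tuning described in the paper.

```python
# Minimal sketch of second-pass N-best re-ranking with a Transformer LM.
# Assumptions: "gpt2" stands in for a (multilingual) LM; lm_weight is arbitrary.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").eval()

def lm_log_prob(text: str) -> float:
    """Approximate total log-probability of `text` under the LM."""
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        # With labels=ids the model returns mean cross-entropy over the
        # seq_len - 1 predicted tokens; scale back up to a total log-prob.
        loss = model(input_ids=ids, labels=ids).loss
    return -loss.item() * max(ids.size(1) - 1, 1)

def rerank(nbest: list[tuple[str, float]], lm_weight: float = 0.5) -> list[str]:
    """Re-rank (hypothesis, first_pass_score) pairs by an interpolated score."""
    scored = [(hyp, fp + lm_weight * lm_log_prob(hyp)) for hyp, fp in nbest]
    return [hyp for hyp, _ in sorted(scored, key=lambda s: s[1], reverse=True)]

# Example: two hypotheses from a hybrid first pass with their decoder scores.
print(rerank([("recognize speech", -3.2), ("wreck a nice beach", -3.0)]))
```

Fine-tuning such a multilingual LM on a single high-traffic locale's text, as the abstract proposes for producing monolingual candidates, would reuse the same model object with a standard causal-LM training loop on locale-specific data.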