Speech data collection system for Kazakh language

Darkhan Kuanyshbay, O. Baimuratov, Y. Amirgaliyev, Arailym Kuanyshbayeva
DOI: 10.1109/icecco53203.2021.9663771
Published in: 2021 16th International Conference on Electronics Computer and Computation (ICECCO), 2021-11-25
Citations: 2

Abstract

Speech data does not exist for most low-resource languages, so producing speech corpora is very challenging and requires a tremendous amount of time. Kazakh, owing to its limited popularity, is considered a low-resource language. This paper provides an overview of many data collection techniques, noting the issues associated with each method. Its main aim is to present a crowdsourcing web-based tool called "Kazakh recorder", which is accessible on the website and designed to make the collection of Kazakh speech data more convenient and fast. The paper also provides statistics on the contributors (age, gender, number of sentences) who helped collect this speech data. Using this tool, we have collected over 50 hours of speech data from 65 different native speakers, each pronouncing on average 500 sentences in Kazakh.
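The abstract reports aggregate contributor statistics (total hours, speaker count, average sentences per speaker, demographics). The paper does not show how these are computed, but the aggregation is straightforward; below is a minimal illustrative sketch, assuming a simple per-speaker record. All names (`Contribution`, `corpus_stats`, the demo data) are hypothetical, not taken from the paper.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Contribution:
    """One speaker's recorded contribution (illustrative schema)."""
    speaker_id: str
    age: int
    gender: str
    sentences: int
    duration_hours: float

def corpus_stats(contribs):
    """Aggregate corpus-level statistics of the kind reported in the abstract."""
    speakers = {c.speaker_id for c in contribs}
    total_hours = sum(c.duration_hours for c in contribs)
    avg_sentences = mean(c.sentences for c in contribs)
    by_gender = {}
    for c in contribs:
        by_gender[c.gender] = by_gender.get(c.gender, 0) + 1
    return {
        "speakers": len(speakers),
        "hours": round(total_hours, 1),
        "avg_sentences": round(avg_sentences),
        "by_gender": by_gender,
    }

# Two made-up contributors, for demonstration only
demo = [
    Contribution("spk01", 24, "F", 510, 0.8),
    Contribution("spk02", 31, "M", 490, 0.7),
]
stats = corpus_stats(demo)
```

For the demo data this yields 2 speakers, 1.5 hours, and an average of 500 sentences per speaker; at the corpus scale described in the paper the same aggregation would run over all 65 speakers' records.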