Evaluation Framework Design of Spoken Term Detection Study at the NTCIR-9 IR for Spoken Documents Task

Journal of Information Processing (Q4, Computer Science), Vol. 19, pp. 329-350 · Published 2012-12-14 · DOI: 10.5715/JNLP.19.329
H. Nishizaki, T. Akiba, K. Aikawa, Tatsuya Kawahara, T. Matsui
{"title":"Evaluation Framework Design of Spoken Term Detection Study at the NTCIR-9 IR for Spoken Documents Task","authors":"H. Nishizaki, T. Akiba, K. Aikawa, Tatsuya Kawahara, T. Matsui","doi":"10.5715/JNLP.19.329","DOIUrl":null,"url":null,"abstract":"This paper describes a design of spoken term detection (STD) studies and their evaluating framework at the STD sub-task of the NTCIR-9 IR for Spoken Documents (SpokenDoc) task. STD is the one of information access technologies for spoken documents. The goal of the STD sub-task is to rapidly detect presence of a given query term, consisting of word or a few word sequences spoken, from the spoken documents included in the Corpus of Spontaneous Japanese. To successfully complete the sub-task, we considered the design of the sub-task and the evaluation methods, and arranged the task schedule. Finally, seven teams participated in the STD subtask and submitted 18 STD results. This paper explains the STD sub-task details we conducted, the data used in the sub-task, how to make transcriptions by speech recognition for data distribution, the evaluation measurement, introduction of the participants’ techniques, and the evaluation results of the task participants.","PeriodicalId":16243,"journal":{"name":"Journal of Information Processing","volume":"19 1","pages":"329-350"},"PeriodicalIF":0.0000,"publicationDate":"2012-12-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Information Processing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5715/JNLP.19.329","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"Computer Science","Score":null,"Total":0}
Cited by: 4

Abstract

This paper describes the design of the spoken term detection (STD) studies and their evaluation framework in the STD sub-task of the NTCIR-9 IR for Spoken Documents (SpokenDoc) task. STD is one of the information access technologies for spoken documents. The goal of the STD sub-task is to rapidly detect the presence of a given query term, consisting of a single word or a short sequence of spoken words, in the spoken documents included in the Corpus of Spontaneous Japanese. To run the sub-task successfully, we considered its design and evaluation methods and arranged the task schedule. In the end, seven teams participated in the STD sub-task and submitted 18 STD results. This paper explains the details of the STD sub-task we conducted, the data used in the sub-task, how transcriptions were produced by speech recognition for distribution to participants, the evaluation measures, the participants' techniques, and the evaluation results of the task participants.
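The abstract does not spell out the evaluation measures used in the sub-task. As a rough illustration only, STD results are commonly scored by matching detected term occurrences against a manual reference and reporting recall, precision, and F-measure. The sketch below assumes hypothetical data structures (`detections`, `reference`) and a simple time-tolerance matching rule; it is not the official NTCIR-9 SpokenDoc scoring tool.

```python
# Minimal sketch of a typical STD scoring scheme (recall / precision / F-measure).
# NOTE: the data structures and the 0.5-second matching tolerance are assumptions
# made for illustration; they are not the official NTCIR-9 SpokenDoc scoring rules.

def score_std(detections, reference, tolerance=0.5):
    """detections / reference: dict mapping query term -> list of (doc_id, start_sec)."""
    hits = 0
    total_detected = sum(len(v) for v in detections.values())
    total_reference = sum(len(v) for v in reference.values())

    for term, ref_occurrences in reference.items():
        unmatched = list(ref_occurrences)
        for doc_id, start in detections.get(term, []):
            # A detection counts as correct if it falls near a reference occurrence
            # of the same term in the same document.
            for ref in unmatched:
                if ref[0] == doc_id and abs(ref[1] - start) <= tolerance:
                    hits += 1
                    unmatched.remove(ref)
                    break

    recall = hits / total_reference if total_reference else 0.0
    precision = hits / total_detected if total_detected else 0.0
    f_measure = (2 * precision * recall / (precision + recall)
                 if precision + recall else 0.0)
    return recall, precision, f_measure


if __name__ == "__main__":
    reference = {"speech recognition": [("doc01", 12.3), ("doc02", 45.0)]}
    detections = {"speech recognition": [("doc01", 12.1), ("doc03", 7.7)]}
    print(score_std(detections, reference))  # -> (0.5, 0.5, 0.5)
```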