Evaluating the relative effectiveness of online and face-to-face training for new writing raters

Studies in Language Assessment · IF 0.1 · Q4 LINGUISTICS · Pub Date: 2018-01-01 · DOI: 10.58379/zvmm4117
U. Knoch, J. Fairbairn, C. Myford, A. Huisman
Citations: 4

Abstract

Training writing raters for large-scale tests is commonly conducted face-to-face, but bringing raters together for training is difficult and expensive. For this reason, more and more testing agencies are exploring technological advances with the aim of providing training online. A number of studies have examined whether online rater training is a feasible alternative to face-to-face training. This mixed-methods study compared two groups of new raters, one trained online using an online training platform and the other trained using conventional face-to-face rater training procedures. Raters who passed accreditation were also compared on the reliability of their subsequent operational ratings. The findings show that no significant differences were identified between the rating behaviour of the two groups on the writing test. The qualitative data also showed that, in general, the raters enjoyed both modes of training and felt sufficiently trained, although some specific problems were encountered. Results on the operational ratings in the first five months after completing the training showed no significant differences between the two training groups. The paper concludes with some implications for training raters in online environments and sets out a possible programme for further research.