Comparing Chinese L2 writing performance in paper-based and computer-based modes: Perspectives from the writing product and process

Assessing Writing · IF 4.2 · CAS Tier 1 (Literature) · Q1 Education & Educational Research · Pub Date: 2024-05-31 · DOI: 10.1016/j.asw.2024.100849
Xiaozhu Wang, Jimin Wang
Assessing Writing, Volume 61, Article 100849 (Journal Article).
Citations: 0

Abstract

As writing is a complex language-production process dependent on the writing environment and medium, the comparability of computer-based (CB) and paper-based (PB) writing assessments has been studied extensively since computer-based language writing assessment emerged. This study investigated differences in the writing product and process between CB and PB modes of writing assessment in Chinese as a second language, whose character writing system is considered challenging for learners. The many-facet Rasch model (MFRM) was adopted to reveal text quality differences, and keystroke and handwriting trace data were used to shed light on the writing process. The results showed that Chinese L2 learners generated higher-quality texts with fewer character mistakes in the CB mode. They also revised much more, and paused for shorter durations and less frequently between lower-level linguistic units, in the CB mode. The quality of CB texts is associated with revision behavior, whereas pause duration is a stronger predictor of PB text quality. The findings suggest that the act of handwriting Chinese characters makes the construct of the PB writing assessment distinct from that of the CB assessment in L2 Chinese. Thus, the choice of assessment mode should consider the target language use and test takers' characteristics.
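The pause and revision measures central to the process analysis can be illustrated with a minimal sketch. Assuming a keystroke log of `(timestamp, key)` events, the hypothetical helper below derives inter-keystroke pause durations above a threshold and counts deletions as a crude revision proxy; the event format, the 2-second threshold, and the deletion-as-revision proxy are illustrative assumptions, not the authors' actual instrumentation or operationalization.

```python
# Hedged sketch: derive simple pause/revision metrics from a keystroke log.
# The event format (timestamp in seconds, key pressed) is a hypothetical
# stand-in for real keystroke-logging output.

def pause_revision_metrics(events, pause_threshold=2.0):
    """Return mean pause length, pause count, and deletion (revision) count."""
    pauses = []
    revisions = 0
    # Walk consecutive event pairs to measure inter-keystroke gaps.
    for (t_prev, _), (t_cur, key) in zip(events, events[1:]):
        gap = t_cur - t_prev
        if gap >= pause_threshold:   # a gap this long counts as a pause
            pauses.append(gap)
        if key == "Backspace":       # deletions as a crude revision proxy
            revisions += 1
    mean_pause = sum(pauses) / len(pauses) if pauses else 0.0
    return {"mean_pause": mean_pause,
            "n_pauses": len(pauses),
            "revisions": revisions}

log = [(0.0, "w"), (0.3, "o"), (3.1, "Backspace"), (3.4, "r"), (6.0, "d")]
print(pause_revision_metrics(log))
# → {'mean_pause': 2.7, 'n_pauses': 2, 'revisions': 1}
```

A real pipeline would additionally classify pauses by the linguistic unit boundary at which they occur (within character, between characters, between clauses), since the study's finding concerns pauses between lower-level linguistic units specifically.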

Source journal: Assessing Writing
CiteScore: 6.00 · Self-citation rate: 17.90% · Annual publications: 67
Journal description: Assessing Writing is a refereed international journal providing a forum for ideas, research and practice on the assessment of written language. Assessing Writing publishes articles, book reviews, conference reports, and academic exchanges concerning writing assessments of all kinds, including traditional (direct and standardised) testing of writing, alternative performance assessments (such as portfolios), workplace sampling and classroom assessment. The journal covers all stages of the writing assessment process, including needs evaluation; assessment creation, implementation, and validation; and test development.
Latest articles from this journal:
- A comparative study of voice in Chinese English-major undergraduates' timed and untimed argument writing
- The impact of task duration on the scoring of independent writing responses of adult L2-English writers
- A structural equation investigation of linguistic features as indices of writing quality in assessed secondary-level EMI learners' scientific reports
- Detecting and assessing AI-generated and human-produced texts: The case of second language writing teachers
- Validating an integrated reading-into-writing scale with trained university students