Exploring the use of model texts as a feedback instrument in expository writing: EFL learners' noticing, incorporations, and text quality
Long Quoc Nguyen, Bao Trang Thi Nguyen, Hoang Yen Phuong
Assessing Writing, Volume 62, Article 100890
DOI: 10.1016/j.asw.2024.100890
Published: 2024-09-20
https://www.sciencedirect.com/science/article/pii/S1075293524000837
Citations: 0
Abstract
Model texts as a feedback instrument (MTFI) have proven effective in enhancing L2 writing, yet research in this domain has mainly focused on narrative compositions over a three-stage task: i) composing, ii) comparing, and iii) rewriting. The impact of MTFI on learners’ noticing, incorporations, and text quality in expository writing, especially in the Vietnamese context, remains underexplored. To address these gaps, this study investigates the effect of MTFI on 68 Vietnamese EFL undergraduates’ expository writing, following a process-product approach. The participants were divided into a control group (CG, N = 33) and an experimental group (EG, N = 35). Both groups completed stages one and three, but only the EG compared their initial writing with a model text in stage two. The results, derived from learners’ note-taking sheets, written paragraphs, and semi-structured interviews, revealed that despite the two groups’ comparability in stage one, the EG demonstrated significantly better text quality than the CG in stage three, particularly in content, lexis, and organization. Furthermore, while the EG largely encountered lexical issues at the outset, they primarily concentrated on content-related and organizational features in the subsequent stages. Based on the findings, recommendations for future research and implications for pedagogy are discussed.
Journal Introduction:
Assessing Writing is a refereed international journal providing a forum for ideas, research and practice on the assessment of written language. Assessing Writing publishes articles, book reviews, conference reports, and academic exchanges concerning writing assessments of all kinds, including traditional (direct and standardised) testing of writing, alternative performance assessments (such as portfolios), workplace sampling and classroom assessment. The journal covers all stages of the writing assessment process, including needs evaluation, assessment creation, implementation, validation, and test development.