This paper examines secondary education students’ multilingual writing based on two input modalities to understand the influence of input on argumentation and sourcing strategies. Participants produced texts in Basque, Spanish and English based on a video or a text, and texts were analysed to explore their production of argumentation elements and sourcing strategies. Differences were found between input modalities in the use of data and rebuttals, and in copying. Across-language differences were also found in the use of data and counterarguments, and in the use of original ideas and paraphrasing. Additionally, complex argumentation elements elicited more original ideas than simple ones. Findings suggest that different writing sub-processes might be activated when composing from different sources and that argumentation and sourcing might be transferable across languages. These results may have important implications for educators in multilingual programs who aim to support their students in acquiring academic writing skills in multiple languages.
{"title":"A comparison between input modalities and languages in source-based multilingual argumentative writing","authors":"Roberto Arias-Hermoso, Ainara Imaz Agirre, Eneritz Garro Larrañaga","doi":"10.1016/j.asw.2024.100813","DOIUrl":"https://doi.org/10.1016/j.asw.2024.100813","url":null,"abstract":"<div><p><span>This paper examines secondary education<span> students’ multilingual writing based on two input modalities to understand the influence of input on argumentation and sourcing strategies. Participants produced texts in </span></span>Basque, Spanish and English based on a video or a text, and texts were analysed to explore their production of argumentation elements and sourcing strategies. Differences were found between input modalities in the use of data and rebuttals, and in copying. Across-language differences were also found in the use of data and counterarguments, and in the use of original ideas and paraphrasing. Additionally, complex argumentation elements elicited more original ideas than simple ones. Findings suggest that different writing sub-processes might be activated when composing from different sources and that argumentation and sourcing might be transferable across languages. These results may have important implications for educators in multilingual programs who aim to support their students in acquiring academic writing skills in multiple languages.</p></div>","PeriodicalId":46865,"journal":{"name":"Assessing Writing","volume":"60 ","pages":"Article 100813"},"PeriodicalIF":3.9,"publicationDate":"2024-01-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139652749","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-01-01. DOI: 10.1016/j.asw.2024.100810
Jifeng Wu, Xiaofei Lu
This study examined differences in overall noun phrase (NP) complexity and in the use of specific types of NP modifiers among Korean learners of Chinese as a second language (CSL) at different proficiency levels and first language (L1) Chinese speakers. Our data consisted of a corpus of 134 narratives produced by 103 Korean CSL learners (33 beginner, 37 intermediate, and 33 advanced) and 31 L1 Chinese speakers on the same topic. Each narrative was manually analyzed using seven measures of overall NP complexity and 26 measures based on specific types of NP modifiers. Two measures of overall NP complexity (i.e., complex NP ratio and total length of all complex premodifiers) significantly discriminated all learner proficiency levels. The frequency of several types of NP modifiers consistently decreased or increased from lower to higher proficiency levels, while that of several others showed cross-proficiency fluctuations. Advanced learners demonstrated levels of overall NP complexity and frequencies of use of specific types of NP modifiers comparable to those of L1 Chinese speakers, except that the latter used significantly more multiple modifiers. Our findings have useful implications for L2 Chinese writing syntactic complexity research and L2 Chinese writing pedagogy.
{"title":"Noun phrase complexity and second language Chinese proficiency: An analysis of narratives by Korean learners of Chinese","authors":"Jifeng Wu , Xiaofei Lu","doi":"10.1016/j.asw.2024.100810","DOIUrl":"https://doi.org/10.1016/j.asw.2024.100810","url":null,"abstract":"<div><p>This study examined differences in the overall noun phrase (NP) complexity and the use of specific types of NP modifiers among Korean learners of Chinese as a second language (CSL) at different proficiency levels and first language (L1) Chinese speakers. Our data consisted of a corpus of 134 narratives produced by 103 Korean CSL learners (33 beginner, 37 intermediate, and 33 advanced) and 31 L1 Chinese speakers on the same topic. Each narrative was manually analyzed using seven measures of overall NP complexity and 26 measures based on specific types of NP modifiers. Two measures of overall NP complexity (i.e., complex NP ratio and total length of all complex premodifiers) significantly discriminated all learner proficiency levels. The frequency of several types of NP modifiers consistently decreased or increased from lower to higher proficiency levels, while that of several others showed cross-proficiency fluctuations. Advanced learners demonstrated comparable levels of overall NP complexity as well as similar frequency of usage of specific types of NP modifiers as L1 Chinese speakers, with the exception that the latter used significantly more multiple modifiers. Our findings have useful implications for L2 Chinese writing syntactic complexity research and L2 Chinese writing pedagogy.</p></div>","PeriodicalId":46865,"journal":{"name":"Assessing Writing","volume":"59 ","pages":"Article 100810"},"PeriodicalIF":3.9,"publicationDate":"2024-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S1075293524000035/pdfft?md5=78fe5dfebdb0aa9f7137263e1abc7b2c&pid=1-s2.0-S1075293524000035-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139652720","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-01-01. DOI: 10.1016/j.asw.2024.100811
Mark Feng Teng, Maggie Ma
Student feedback literacy is essential to writing in a foreign language context. Feedback is a process through which learners become more familiar with the quality of their work and integrate constructive criticism to produce stronger written work. Implicit in this process is the assumption that learners should not passively receive information but actively seek, generate, process, and use feedback to apply new knowledge in current or subsequent writing tasks. Learners’ metacognitive awareness and skills may influence their participation in feedback-related activities that inform the monitoring and control of their writing process and determine their writing performance. However, little attention has been given to a systematic investigation of metacognition in feedback literacy. The present study, drawing upon a self-regulatory perspective, bridges this gap with two purposes: (a) to validate a scale measuring metacognition-based student feedback literacy; and (b) to delineate the predictive effects of different components of the scale on academic writing performance. The results provided evidence for metacognitive awareness and skills in student feedback literacy and showed the predictive effects of the scale’s components, namely feedback-related strategies in participation (FRSP), motivation (M), feedback-related monitoring strategies (FRMS), and strategy knowledge (SK), on EFL learners’ performance in academic writing. Relevant implications are also discussed.
{"title":"Assessing metacognition-based student feedback literacy for academic writing","authors":"Mark Feng Teng , Maggie Ma","doi":"10.1016/j.asw.2024.100811","DOIUrl":"https://doi.org/10.1016/j.asw.2024.100811","url":null,"abstract":"<div><p>Student feedback literacy is essential to writing in a foreign language context. Feedback is a process through which learners become more familiar with their work quality and integrate constructive criticism to produce stronger written work. Implicit in this process is the assumption that learners should not passively receive information but actively seek, generate, process, and use feedback to apply new knowledge in current or subsequent writing tasks. Learners’ metacognitive awareness and skills may influence their participation in feedback-related activities that inform the monitoring and control of their writing process and determine their writing performance. However, little attention has been given to a systematic investigation of metacognition in feedback literacy. The present study, drawing upon a self-regulatory perspective, bridges this gap by fulfilling two purposes: (a) to measure a scale on metacognition-based student feedback literacy; and (b) to delineate the predictive effects of different components of the scale on academic writing performance. The results provided evidence for metacognitive awareness and skills in student feedback literacy and showed the predictive effects of the scale, e.g., feedback-related strategies in participation (FRSP), motivation (M), feedback-related monitoring strategies (FRMS), and strategy knowledge (SK) on EFL learners’ performance in academic writing. Relevant implications were also discussed.</p></div>","PeriodicalId":46865,"journal":{"name":"Assessing Writing","volume":"59 ","pages":"Article 100811"},"PeriodicalIF":3.9,"publicationDate":"2024-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S1075293524000047/pdfft?md5=bab5f2c98aefcabe1eebf93cef0f6bd3&pid=1-s2.0-S1075293524000047-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139674633","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-01-01. DOI: 10.1016/j.asw.2024.100808
Solaire A. Finkenstaedt-Quinn, Field M. Watts, Ginger V. Shultz
Studies examining peer review demonstrate that students can learn from giving feedback to and receiving feedback from their peers, especially when they utilize information gained from the review process to revise. However, much of the research on peer review is situated within the literature regarding how students learn to write. With an increasing use of writing-to-learn in STEM classrooms, it is important to study how students engage in peer review for these types of writing assignments. This study sought to better understand how peer review and revision can support student learning for writing-to-learn specifically, using the lenses of cognitive perspectives of writing and engagement with written corrective feedback. Using a case study approach, we provide a detailed analysis of six students’ written artifacts in response to a writing-to-learn assignment that incorporated peer review and revision implemented in an organic chemistry course. Students demonstrated a range in the types of revisions they made and the extent to which the peer review process informed their revisions. Additionally, students exhibited surface, mid-level, and active engagement with the peer review and revision process. Considering the different engagement levels can inform how we frame peer review to students when using it as an instructional practice.
{"title":"Reading, receiving, revising: A case study on the relationship between peer review and revision in writing-to-learn","authors":"Solaire A. Finkenstaedt-Quinn , Field M. Watts , Ginger V. Shultz","doi":"10.1016/j.asw.2024.100808","DOIUrl":"https://doi.org/10.1016/j.asw.2024.100808","url":null,"abstract":"<div><p>Studies examining peer review demonstrate that students can learn from giving feedback to and receiving feedback from their peers, especially when they utilize information gained from the review process to revise. However, much of the research on peer review is situated within the literature regarding how students learn to write. With an increasing use of writing-to-learn in STEM classrooms, it is important to study how students engage in peer review for these types of writing assignments. This study sought to better understand how peer review and revision can support student learning for writing-to-learn specifically, using the lenses of cognitive perspectives of writing and engagement with written corrective feedback. Using a case study approach, we provide a detailed analysis of six students’ written artifacts in response to a writing-to-learn assignment that incorporated peer review and revision implemented in an organic chemistry course. Students demonstrated a range in the types of revisions they made and the extent to which the peer review process informed their revisions. Additionally, students exhibited surface, mid-level, and active engagement with the peer review and revision process. Considering the different engagement levels can inform how we frame peer review to students when using it as an instructional practice.</p></div>","PeriodicalId":46865,"journal":{"name":"Assessing Writing","volume":"59 ","pages":"Article 100808"},"PeriodicalIF":3.9,"publicationDate":"2024-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S1075293524000011/pdfft?md5=9db33da6b3006af1a4e320b31a5c7c6d&pid=1-s2.0-S1075293524000011-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139505460","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-12-26. DOI: 10.1016/j.asw.2023.100806
Nikola Dobrić
The study reported in this paper starts from the hypothesis that errors observable in writing performances can account for much of the variability in the ratings awarded to them. The assertion is that this may be the case even when prescribed rating criteria explicitly direct rater focus towards successfully performed aspects of a writing performance rather than towards errors. The hypothesis is tested on a sample of texts rated independently of the study, using a five-point analytic rating scale with ‘Can do’-like descriptors. The relationship between errors and ratings is estimated using ordinal logistic regression, yielding an overall pseudo-R² of 0.51. Thus, with roughly 50% of score variability explainable by error occurrences, the stated hypothesis is considered confirmed. The study goes on to discuss the consequences of the findings and their potential use in the assessment of writing beyond the local assessment context.
{"title":"Effects of errors on ratings of writing performances – Evidence from a high-stakes exam","authors":"Nikola Dobrić","doi":"10.1016/j.asw.2023.100806","DOIUrl":"https://doi.org/10.1016/j.asw.2023.100806","url":null,"abstract":"<div><p>The study reported in the paper starts with a hypothesis that errors observable in writing performances can account for much of the variability of the ratings awarded to them. The assertion is that this may be the case even when prescribed rating criteria explicitly direct rater focus towards successfully performed aspects of a writing performance rather than towards errors. The hypothesis is tested on a sample of texts rated independently of the study, using a five-point analytic rating scale involving ‘Can do’-like descriptors. The correlation between errors and ratings is ascertained using ordinal logistic regression, with Pseudo R<sup>2</sup> of 0.51 discerned overall. Thus, with roughly 50% of score variability explainable by error occurrences, the stated hypothesis is considered confirmed. The study goes on to discuss the consequences of the findings and their potential employ in assessment of writing beyond the local assessment context.</p></div>","PeriodicalId":46865,"journal":{"name":"Assessing Writing","volume":"59 ","pages":"Article 100806"},"PeriodicalIF":3.9,"publicationDate":"2023-12-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S1075293523001149/pdfft?md5=9940df3b488b638b23d71e6a3eee3a37&pid=1-s2.0-S1075293523001149-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139050140","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-12-17. DOI: 10.1016/j.asw.2023.100804
Hanieh Shafiee Rad, Rasoul Alipour
As a prerequisite for effective teaching and learning outcomes, assessment literacy (AL) is imperative for both writing teachers and students. Although previous research has stressed the importance of AL in effective writing instruction, few studies have designed or explored interventions that can improve teachers' and students' AL. Such interventions play a crucial role in equipping writing teachers with the skills needed to comprehend, apply, interpret, and critique assessments effectively. Likewise, AL interventions are essential for writing students, as they enhance their knowledge, attitudes, actions, and critique of assessments, enabling them to understand assessment purposes, utilize feedback constructively, and critically evaluate assessment practices. To address this gap in the literature, the researchers employed a mixed-methods approach, which included pre- and post-tests as well as semi-structured interviews, to design and investigate effective interventions for enhancing AL. For the purpose of this study, we incorporated a four-dimensional conceptual framework for the teachers' intervention and a four-phase AL framework for the students' intervention as the basis for our investigation. According to the study findings, there were significant disparities between students' writing skills and their writing abilities, as well as substantial improvements in teachers' writing abilities following the intervention. A positive perception of the intervention was also reported by both students and teachers. Thus, these interventions were able to assist students in better understanding assessment criteria and to help teachers provide feedback effectively. By implementing effective interventions, students as well as teachers will be equipped with the tools necessary to support writing instruction and ultimately reach higher levels of achievement.
{"title":"Unlocking writing success: Building assessment literacy for students and teachers through effective interventions","authors":"Hanieh Shafiee Rad , Rasoul Alipour","doi":"10.1016/j.asw.2023.100804","DOIUrl":"10.1016/j.asw.2023.100804","url":null,"abstract":"<div><p>As a prerequisite for effective teaching and learning outcomes, assessment literacy (AL) is imperative for both writing teachers and students. Although previous research has stressed the importance of AL in effective writing instruction, few studies have designed or explored interventions that can improve teachers' and students' AL. These interventions play a crucial role in equipping writing teachers with the necessary skills to effectively comprehend, apply, interpret, and critique assessments. Likewise, AL interventions are essential for writing students, as they enhance their knowledge, attitudes, actions, and critique of assessments, enabling them to understand assessment purposes, utilize feedback constructively, and critically evaluate assessment practices. In order to address this gap in the literature, the researchers employed a mixed-method approach, which included pre- and post-tests, as well as semistructured interviews, to design and investigate effective interventions for enhancing AL. For the purpose of this study, we incorporated a four-dimensional conceptual framework for teachers' intervention and a four-phase AL framework for students' intervention as the basis for our investigation. According to the study findings, there were significant disparities between students' writing skills and their writing abilities while also substantial improvements in teachers' writing abilities following intervention. A positive perception of the intervention was also reported by both students and teachers. Thus, these interventions were able to assist students in better understanding assessment criteria and help teachers provide feedback in an effective manner. By implementing effective interventions, students, as well as teachers, will be equipped with the necessary tools to support writing instruction and ultimately achieve higher levels of achievement.</p></div>","PeriodicalId":46865,"journal":{"name":"Assessing Writing","volume":"59 ","pages":"Article 100804"},"PeriodicalIF":3.9,"publicationDate":"2023-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S1075293523001125/pdfft?md5=af49e9a93aef41309bcf0e49d8949672&pid=1-s2.0-S1075293523001125-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138689637","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-12-15. DOI: 10.1016/j.asw.2023.100803
Farshad Effatpanah, Purya Baghaei, Mohammad N. Karimi
The present study used the Mixed Rasch Model (MRM) to identify multiple profiles in L2 students’ writing with regard to several linguistic features, including content, organization, grammar, vocabulary, and mechanics. To this end, a pool of 500 essays written by English as a foreign language (EFL) students was rated by four experienced EFL teachers using the Empirically-derived Descriptor-based Diagnostic (EDD) checklist. The ratings were subjected to MRM analysis. Two distinct profiles of L2 writers emerged from the analyzed sample: (a) Sentence-Oriented and (b) Paragraph-Oriented L2 Writers. Sentence-Oriented L2 Writers tend to focus more on linguistic features, such as grammar, vocabulary, and mechanics, at the sentence level and try to utilize these subskills to generate a written text. Paragraph-Oriented Writers, in contrast, are inclined to move beyond the boundaries of a sentence and attend to the structure of a whole paragraph using higher-order features such as content and organization subskills. The two profiles were further examined to capture their unique features. Finally, the theoretical and pedagogical implications of identifying L2 writing profiles and suggestions for further research are discussed.
{"title":"A mixed Rasch model analysis of multiple profiles in L2 writing","authors":"Farshad Effatpanah , Purya Baghaei , Mohammad N. Karimi","doi":"10.1016/j.asw.2023.100803","DOIUrl":"10.1016/j.asw.2023.100803","url":null,"abstract":"<div><p>The present study used the Mixed Rasch Model (MRM) to identify multiple profiles in L2 students’ writing with regard to several linguistic features, including content, organization, grammar, vocabulary, and mechanics. To this end, a pool of 500 essays written by English as a foreign language (EFL) students were rated by four experienced EFL teachers using the Empirically-derived Descriptor-based Diagnostic (EDD) checklist. The ratings were subjected to MRM analysis. Two distinct profiles of L2 writers emerged from the sample analyzed including: (a) Sentence-Oriented and (b) Paragraph-Oriented L2 Writers. Sentence-Oriented L2 Writers tend to focus more on linguistic features, such as grammar, vocabulary, and mechanics, at the sentence level and try to utilize these subskills to generate a written text. However, Paragraph-Oriented Writers are inclined to move beyond the boundaries of a sentence and attend to the structure of a whole paragraph using higher-order features such as content and organization subskills. The two profiles were further examined to capture their unique features. Finally, the theoretical and pedagogical implications of the identification of L2 writing profiles and suggestions for further research are discussed.</p></div>","PeriodicalId":46865,"journal":{"name":"Assessing Writing","volume":"59 ","pages":"Article 100803"},"PeriodicalIF":3.9,"publicationDate":"2023-12-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S1075293523001113/pdfft?md5=b03b7936ec5c972994a207399699d5c3&pid=1-s2.0-S1075293523001113-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138690118","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-12-14. DOI: 10.1016/j.asw.2023.100802
Yiwen Cen, Yao Zheng
With the growing body of research on writing feedback and the recognition of the critical role of writing motivation, controversy has emerged over the motivational function of feedback in second language (L2) writing contexts. To provide further evidence for the impact of different feedback practices on L2 learners’ writing motivation, the present meta-analysis synthesizes the results of 13 quantitative studies on the relationship between feedback and L2 writing motivation. It examines the effect of different feedback practices on L2 learners’ writing motivation and the variables moderating the effectiveness of those feedback practices. The results show that feedback generated from multiple sources has the greatest motivational function in L2 writing, followed by single-source feedback, including peer feedback, teacher feedback, and automated feedback. Moderator analysis indicates that feedback type is a statistically significant variable moderating the effectiveness of feedback. In light of the findings, implications for L2 writing instruction and future L2 writing research are discussed.
{"title":"The motivational aspect of feedback: A meta-analysis on the effect of different feedback practices on L2 learners’ writing motivation","authors":"Yiwen Cen , Yao Zheng","doi":"10.1016/j.asw.2023.100802","DOIUrl":"10.1016/j.asw.2023.100802","url":null,"abstract":"<div><p>With the growing body of research on writing feedback and the recognition of the critical role of writing motivation, controversy has emerged over the motivational function of feedback in second language (L2) writing contexts. To provide further evidence for the impact of different feedback practices on L2 learners’ writing motivation, the present meta-analysis synthesizes the results of 13 quantitative studies on the relationship between feedback and L2 writing motivation. It examines the effect of different feedback practices on L2 learners’ writing motivation and the variables moderating the effectiveness of those feedback practices. The results show that feedback generated from multiple sources has the greatest motivational function in L2 writing, followed by single-source feedback, including peer feedback, teacher feedback, and automated feedback. Moderator analysis indicates that feedback type is a statistically significant variable moderating the effectiveness of feedback. In light of the findings, implications for L2 writing instruction and future L2 writing research are discussed.</p></div>","PeriodicalId":46865,"journal":{"name":"Assessing Writing","volume":"59 ","pages":"Article 100802"},"PeriodicalIF":3.9,"publicationDate":"2023-12-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S1075293523001101/pdfft?md5=dc98b760619994752f4428f324d18a78&pid=1-s2.0-S1075293523001101-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138689915","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}