Pub Date: 2015-07-17 | DOI: 10.1558/WAP.V7I2-3.26381
Elena Cotos
This article aims to engage specialists in writing pedagogy, assessment, genre study, and educational technologies in a constructive dialog and joint exploration of automated writing analysis as a potent instantiation of computer-enhanced assessment for learning. It recounts the values of writing pedagogy and, from this perspective, examines legitimate concerns with automated writing analysis. Emphasis is placed on the need to substantiate the construct-driven debate with systematic empirical evidence that would corroborate or refute interpretations, uses, and consequences of automated scoring and feedback tools intended for specific contexts. Such evidence can be obtained by adopting a validity argument framework. To demonstrate an application of this framework, the article presents a novel genre-based approach to automated analysis configured to support research writing and provides examples of validity evidence for using it with novice scholarly writers.
Title: Automated Writing Analysis for writing pedagogy: From healthy tension to tangible prospects (Writing & Pedagogy, pp. 197-231)
Pub Date: 2015-07-17 | DOI: 10.1558/WAP.V7I2-3.18449
Jennifer I. Berne, Susan I. McMahon
New standards for writing provide the opportunity to rethink definitions of what writing is in schools. While traditional assessment methods align with many of the new standards and offer an important tool for gauging the success of some elements of writing, they often neglect other elements. In traditional assessment, the elements that are quantifiable become those that are valued. Teachers can promote consideration of other elements, those intangibles that change a text from an assignment to be completed into a powerful communicative act, by intervening in the prewriting or planning stage of the writing process. This article discusses one possible form of intervention in which the teacher has a conversation with a student that centers on the student’s investment of interest in her/his topic and helps the student plan a paper that will make a unique contribution and not just fulfill a task. By using a prewriting rubric to focus the conversation, the teacher is able to track student progress in understanding and enacting this important component of writing.
Title: New Standards and Opportunities: Rethinking Good Writing in Schools (Writing & Pedagogy, pp. 377-394)
Pub Date: 2015-07-17 | DOI: 10.1558/WAP.V7I2-3.26045
Heike Neumann
Information management of discourse – the ability of a writer to use linguistic forms to organize and present information in a written text – is a key component of second language (L2) ability models in the language assessment literature (e.g., Canale & Swain, 1980; Weigle, 2002), but Purpura’s (2004) language ability model developed specifically for assessment purposes is the only one that considers it to be part of the ability to use grammar accurately and meaningfully when producing a text in an L2. The current study investigated whether L2 academic writing teachers consider information management of discourse as an assessment criterion when assessing grammar in L2 academic texts. Fourteen students in an academic English as a second language writing course at an English-medium university in Canada and their teacher participated in this case study. Students’ essay exam scripts were collected, and the Theme-Rheme progression (TRP) patterns and links (Daneš, 1974) as well as the distribution of new and given information (Halliday & Matthiessen, 2004) in these essays were analyzed. Pearson correlation coefficients between the teacher-assigned grammar grade and the results from the TRP and information distribution analyses were calculated. The findings indicate that information management of discourse indeed forms part of the assessment criteria for grammar in academic writing for the teacher in this study. The implications of this finding for L2 writing pedagogy are discussed.
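The correlation analysis described above can be sketched in a few lines. This is a hypothetical illustration only: the grade and TRP-link figures below are invented, not the study's data; only the method (Pearson's r between teacher-assigned grammar grades and a TRP measure per essay) follows the abstract.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Invented per-essay values for seven students (not from the study):
grammar_grades = [72, 85, 64, 90, 78, 69, 88]   # teacher-assigned grammar grade
trp_links      = [10, 15,  8, 18, 12,  9, 16]   # count of Theme-Rheme progression links
r = pearson_r(grammar_grades, trp_links)
print(round(r, 3))  # a strong positive r would suggest TRP informs the grade
```

A positive coefficient on real scripts would be the kind of evidence the study reports for information management forming part of the teacher's grammar criteria.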
Title: The Role of Information Management in the Assessment of Grammar in L2 Academic Writing: An Exploratory Case Study (Writing & Pedagogy, pp. 329-354)
Pub Date: 2015-07-17 | DOI: 10.1558/WAP.V7I2-3.27587
Dan Melzer
Reviews of:
Very like a whale: The assessment of writing programs, by Edward M. White, Norbert Elliot, and Irvin Peckham (2015). ISBN-13: 978-0-87421-985-2. Pp. 202.
Assessing and improving student writing in college, by Barbara E. Walvoord (2014). ISBN-13: 978-1-118-55736-5. Pp. xiii + 119.
Title: Approaches to Assessing Student Writing and Writing Programs in the Age of Accountability (Writing & Pedagogy, pp. 423-428)
Pub Date: 2015-07-17 | DOI: 10.1558/WAP.V7I2-3.26227
Y. Moore
Drawing on Shaw and Weir’s (2007) theoretical framework for validating writing tests, this paper highlights issues with the writing constructs measured in the English writing tests of university entrance examinations and recommends improvements. The paper analysed the writing response formats of 66 English tests used by Japanese universities, as well as one English test of the National Centre Exams (NCE), for 2013 entry. Translation was the most commonly used response format, accounting for 45% of the total. Translation was also the most common format at the state universities, whereas word-reordering predominated at the private universities and on the NCE. Because word-reordering and translation tasks assess only a narrow range of discrete grammatical and lexical skills, there is no conclusive evidence that they can assess the skills applicants need to write cohesive texts in English. Two practical constraints help explain why indirect writing assessment has nonetheless remained a key method in the design of English tests for Japanese university admission: the number of applicants and time constraints. Taking these factors into account, alternative English tests should be introduced into Japanese university entrance examinations.
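The frequency analysis behind the 45% figure amounts to tallying the response format of each test and converting counts to shares. The sketch below is a hypothetical reconstruction: only the translation share comes from the paper; the other counts and the category names beyond "translation" and "word-reordering" are invented for illustration.

```python
from collections import Counter

# One (invented) response-format label per analysed test; 30 of 66 is ~45%.
formats = (["translation"] * 30 + ["word-reordering"] * 20 +
           ["gap-filling"] * 10 + ["free composition"] * 6)

counts = Counter(formats)
shares = {fmt: round(100 * n / len(formats)) for fmt, n in counts.items()}
print(shares["translation"])  # 45
```

On the real corpus each of the 66 tests would be coded for its dominant response format before tallying.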
Title: An Evaluation of English Writing Assessment in Japanese University Entrance Examinations (Writing & Pedagogy, pp. 233-260)
Pub Date: 2015-07-17 | DOI: 10.1558/wap.v7i2-3.26461
A. Mahboob
This paper, building on results from a large online embedded language and literacy development project, introduces the notions of ‘cohesion’ and ‘coherence’ in feedback and outlines steps that instructors can take to provide such feedback in their own contexts. Cohesion in feedback can be defined in terms of its goals, audience, and organisation; coherence, in terms of how instances of feedback work together to scaffold a student into developing a deeper understanding of issues in their writing. The paper argues that feedback which is cohesive and coherent is not a collection of reactions to a student’s errors and mistakes, but a thoughtfully and carefully drafted text that responds to the student’s writing based on an assessment of their needs. The paper includes an evaluation of how students respond to such feedback, sharing examples of students’ drafts, the feedback they received, and their responses to it. The paper thus helps us understand the nature of feedback as well as how to apply it, with the goal of making our students stronger, more independent, and self-regulating writers.
Title: Understanding and Providing ‘Cohesive’ and ‘Coherent’ Feedback on Writing (Writing & Pedagogy, pp. 355-376)
Pub Date: 2015-01-07 | DOI: 10.1558/WAP.V7I2-3.26457
Dina Tsagari, Eleni Meletiadou
Peer assessment (PA), a process by which students' work (oral or written) is assessed by other students of equal status, has received a lot of attention recently (Assessment Reform Group, 1999, 2002). Using data collected from secondary schools in Cyprus, the current study investigates whether PA can improve the writing skills of adolescent students of English as a foreign language (EFL). The results showed that PA did have a positive impact on students’ writing performance, especially for students who provided peer assessment. The article discusses the important role of PA in the development of students’ writing skills and offers recommendations for the implementation of PA in EFL contexts.
Title: Peer Assessment of Adolescent Learners’ Writing Performance (Writing & Pedagogy, pp. 305-328)
This article is based on the idea that proposals already contain latent storytelling. It explores the various ways in which storytelling functions as a pedagogical model for teaching the writing of proposals in business and technical writing courses. The central premise is that stories, like proposals, are forms of discourse that place events sequentially from beginning to end, with meaningful and graspable connections in between. Stories take (identified) audiences into account by selecting events that are carefully rearranged and described through composites of scenarios and characters. This article explores those storytelling patterns in theory and in practice. It aims to enhance the teaching of proposal writing by calling attention to a seemingly inconsequential or unrelated notion – storytelling.
Josephine N. Walwema, "The Art of Storytelling: A Pedagogy for Proposal Writing" (Writing & Pedagogy, pp. 15-38). Pub Date: 2015-01-06 | DOI: 10.1558/WAP.V7I1.26246
Genre analysis has become an important tool for teaching writing across the disciplines to non-native English-speaking (EL2) and native English-speaking (EL1) graduate students alike. Since the pressing needs of EL2 graduate students have meant that educators often teach them in separate classes, and since genre-based research into teaching higher-level writing has been largely generated in fields such as English for Academic Purposes, we have an insufficient understanding of whether this instructional mode plays out similarly in EL1 and EL2 classrooms. Launching a genre-based course on writing research articles in parallel sections for EL1 and EL2 graduate students provided an opportunity to address this knowledge shortfall. This article qualitatively examines the different classroom behaviors observed in each version of the course when a common curriculum was used and specifically explores three key themes: initial receptivity, nature of student engagement, and overall assessment. Our study shows that although EL2 and EL1 learners have similar needs, the obstacles to their benefitting from genre-based instruction are different; EL2 students must learn to identify themselves as needing writing support that transcends linguistic matters, while EL1 students must learn to identify themselves as needing writing support despite their linguistic competence. Providing the same mode of instruction can benefit both populations as long as educators are sensitive to the specific challenges each population presents in the classroom. The insights gained contribute to the scholarship on genre-based teaching and offer ways of better meeting the needs of EL1 and EL2 students alike.
Peter F. Grav and R. Cayley, "Graduate Student Writers: Assessing Needs across the “Linguistic Divide”" (Writing & Pedagogy, pp. 69-93). Pub Date: 2015-01-06 | DOI: 10.1558/WAP.V7I1.17236
Writing scholars often note the heterogeneity of the second language (L2) student population in higher education writing courses, but only recently have researchers begun to carefully examine differences in the writing ability of international L2 learners and U.S. resident L2 learners. Most of the empirical research to date focuses on the two groups’ grammatical accuracy to the exclusion of other dimensions of writing ability. Such a limited focus not only underrepresents the multifaceted construct of writing ability, but also overlooks potential areas where noticeable differences across the two groups’ writing ability might surface. Although arguably less salient than grammatical (in)accuracy, and not as prevalent in scoring rubrics, students’ use of sociopragmatic features in writing offers an alternative approach for comparing the two groups of learners beyond their use of grammatical forms. Thus, the current study describes and compares how international and U.S. resident L2 learners used certain sociopragmatic markers in their writing. By focusing on the meanings associated with these markers, the study suggests that students’ use of such markers reflects their sociopragmatic awareness. Findings indicate that the two groups of writers may be more similar than different, contrary to previous research.
Kristen di Gennaro, "What Do They Mean? Comparing International and U.S. Resident Second Language Students’ Use of Sociopragmatic Markers in Writing" (Writing & Pedagogy, pp. 39-67). Pub Date: 2015-01-06 | DOI: 10.1558/WAP.V7I1.24054