Pub Date: 2021-05-01 | DOI: 10.17239/JOWR-2021.13.01.05
The impact of WhatsApp on Dutch youths’ school writing and spelling
L. Verheijen, W. Spooren
Today’s youths are continuously engaged with social media. The informal language they use in computer-mediated communication (CMC) often deviates from the spelling and grammar rules of the standard language. Parents and teachers therefore fear that social media have a negative impact on youths’ literacy skills. This paper examines whether such worries are justified. An experimental study was conducted with 500 Dutch youths of different educational levels and age groups to find out whether social media affect their productive or perceptive writing skills. We measured whether chatting via WhatsApp directly impacts the writing quality of Dutch youths’ narratives or their ability to detect ‘spelling errors’ (deviations from Standard Dutch) in grammaticality judgement tasks. The use of WhatsApp turned out to have no short-term effects on participants’ performance on either writing task. Thus, the present study gives no cause for great concern about any impact of WhatsApp on youths’ school writing.
Pub Date: 2021-02-22 | DOI: 10.17239/JOWR-2021.13.01.04
Measuring and Assessing Typing Skills in Writing Research
L. Waes, M. Leijten, J. Roeser, T. Olive, J. Grabowski
In keyboard writing, typing skills are considered an important prerequisite of proficient text production. We describe the design, implementation, and application of a standardized copy-typing task for measuring and assessing individual typing fluency. A test-retest analysis indicates the instrument’s reliability. While the task has been developed for eleven different languages and the related keyboard layouts, we here refer to a corpus of Dutch copy tasks (n = 1682). Analyses show that copying speed varies non-linearly with age. Bayesian analyses reveal differences in typing performance and in the underlying distributions of inter-key intervals between the different task components (e.g., lexical vs. non-lexical materials; high-frequency vs. low-frequency bigrams). Based on these findings, we strongly recommend including copy-task measures in the analysis of keystroke logging data in writing studies. This supports better comparability and interpretability of keystroke data from more complex or communicatively embedded writing tasks across individuals. Further potential applications of the copy task for writing research are explained and discussed.
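The inter-key intervals (IKIs) central to the copy-task analyses above can be illustrated with a minimal sketch. The event format and component labels below are invented for illustration; they do not reflect the actual copy-task log schema or the study's Bayesian analysis.

```python
# Sketch: computing inter-key intervals (IKIs) per task component from a
# keystroke log. Each event is (timestamp_ms, key, component); this tuple
# format is a hypothetical stand-in for a real keystroke-logging schema.

def inter_key_intervals(events):
    """Return IKIs (ms) between consecutive keypresses, grouped by component."""
    by_component = {}
    prev = {}  # last timestamp seen per component
    for t, _key, component in events:
        if component in prev:
            by_component.setdefault(component, []).append(t - prev[component])
        prev[component] = t
    return by_component

events = [
    (0, "t", "lexical"), (180, "h", "lexical"), (350, "e", "lexical"),
    (1000, "q", "non-lexical"), (1420, "x", "non-lexical"),
]
ikis = inter_key_intervals(events)
# Median IKI per component as a crude typing-fluency summary
summary = {c: sorted(v)[len(v) // 2] for c, v in ikis.items()}
```

Comparing such per-component IKI distributions (e.g., lexical vs. non-lexical material) is the kind of contrast the abstract describes, though the paper itself models full distributions rather than a single median.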
Pub Date: 2021-01-01 | DOI: 10.17239/jowr-2021.13.02.02
The effect of automated fluency-focused feedback on text production
E. Dux Speltz, E. Chukharev-Hudilainen
This article presents a new intervention for improving first-language writing fluency and reports an empirical study investigating its effects on process and product measures of writing. The intervention explicitly encourages fluent text production by providing automated real-time feedback to the writer. Participants were twenty native-English-speaking undergraduate students at a large Midwestern university in the United States, all of whom were proficient writers. Each participant composed two texts (one in the control condition and one in the intervention condition) in an online text editor with embedded keystroke logging capabilities. Quantitative data consisted of product and process measures obtained from the texts produced in the two conditions, and qualitative data included participants’ responses to an open-ended questionnaire. Linear mixed-effects regression models were fit to the quantitative data to assess differences between conditions. Findings demonstrated significant differences between the intervention and control conditions in terms of both the product and the process of writing. Specifically, participants wrote more text, expressed more ideas, and produced higher-quality texts in the fluency-focused intervention condition. Qualitative findings from the questionnaire responses are also discussed.
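As a toy illustration of the paired design described above (each writer producing one text per condition), a per-participant difference score already captures the within-subject structure that the study's linear mixed-effects models account for more fully. The numbers below are invented illustration data, not the study's results.

```python
# Sketch: per-participant condition differences as a minimal stand-in for the
# paired (within-subject) comparison. A mixed-effects model would additionally
# estimate participant-level random effects; this is only the simplest cut.
from statistics import mean

word_counts = {  # participant -> (control words, intervention words); invented
    "p01": (312, 390),
    "p02": (280, 305),
    "p03": (401, 455),
}

diffs = [iv - ctl for ctl, iv in word_counts.values()]
mean_gain = mean(diffs)  # positive => more text in the intervention condition
```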
Pub Date: 2021-01-01 | DOI: 10.17239/jowr-2021.13.02.04
Teaching models of disciplinary argumentation in middle school social studies: A framework for supporting writing development
Chandra Alston, Chauncey Monte-Sano, Mary J. Schleppegrell, Kimberly Harn
Modeling, by demonstrating and explaining the cognitive processes involved in writing, has been shown to support writing development. Specific disciplinary aspects of teaching with models have been investigated less often. We draw on research in English Language Arts and apply it in social studies inquiry contexts to propose a framework for teaching models of thinking and writing, one that offers teachers and researchers new perspectives on the discipline-specific work of modeling. The framework accounts for three modes of instruction (use of models as a tool or a text, demonstrating and explaining, and co-constructing model texts with students) and describes eleven instructional practices that support instruction across these modes. We analyze data from three years of social studies instruction to show how two teachers enact these practices across the three modes, highlighting the disciplinary thinking and processes that support writing social studies arguments with sources and the ways students can actively participate in teaching writing with models. In addition, we consider the role of the curriculum in this work. We show how writing instruction can address disciplinary ways of thinking in social studies and illustrate the potential of the framework for guiding researchers’ and practitioners’ work on writing instruction across disciplinary contexts.
Pub Date: 2020-06-01 | DOI: 10.17239/jowr-2020.12.01.03
Spellcheck has a positive impact on spelling accuracy and might improve lexical diversity in essays written by students with dyslexia
L. O’Rourke, V. Connelly, A. Barnett, O. Afonso
Pub Date: 2020-06-01 | DOI: 10.17239/jowr-2020.12.01.05
Reporting Writing Process Feedback in the Classroom. Using Keystroke Logging Data to Reflect on Writing Processes
N. Vandermeulen, M. Leijten, L. Waes
Keystroke loggers enable researchers to collect fine-grained process data and offer support in analyzing these data. Keystroke logging has become popular in writing research, and study by study we are paving the way to a better understanding of writing process data. However, few researchers have concentrated on how to bring keystroke logging to the classroom. This is not because they doubt that writing development could benefit from a more process-oriented pedagogy, but because 'translating' complex and large data sets to an educational context is challenging. We have therefore developed a new function in Inputlog that specifically aims to help writing tutors provide process feedback to their students. Based on an XML log file, the so-called 'report' function automatically generates a PDF file addressing different perspectives on the writing process: pausing, revision, source use, and fluency. These perspectives are reported either quantitatively or visually, and brief introductory texts explain the information presented. Inputlog provides a default feedback report, but users can also customize the report. This paper describes the process report and demonstrates its use in an intervention. We also present some additional pedagogical scenarios for actively using this type of feedback in writing classes.
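The kind of pause and fluency measures such a process report summarizes can be sketched as follows. The 2000 ms pause threshold and the flat list-of-timestamps input are illustrative assumptions, not Inputlog's actual report logic or log format.

```python
# Sketch: deriving simple pause and fluency measures of the kind a
# process-feedback report could display, from a list of keypress
# timestamps in milliseconds. Threshold and input format are assumptions.

def process_summary(timestamps_ms, pause_threshold_ms=2000):
    """Count keystrokes, long pauses, and overall typing rate."""
    ikis = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    pauses = [iki for iki in ikis if iki >= pause_threshold_ms]
    minutes = (timestamps_ms[-1] - timestamps_ms[0]) / 60000
    return {
        "keystrokes": len(timestamps_ms),
        "pauses": len(pauses),
        "keystrokes_per_minute": len(timestamps_ms) / minutes if minutes else None,
    }

report = process_summary([0, 150, 300, 2600, 2750, 60000])
```

A tutor-facing report would present such numbers alongside visualizations and explanatory text, as the abstract describes.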
Pub Date: 2020-06-01 | DOI: 10.17239/jowr-2020.12.01.07
Understanding Graduate Writers’ Interaction with and Impact of the Research Writing Tutor during Revision
Elena Cotos, S. Huffman, Stephanie Link
Teaching the craft of written science communication is an arduous task that requires familiarity with disciplinary writing conventions. With burgeoning technological advancements, practitioners preparing novice research writers can begin to augment teaching and learning with activities in digital writing environments attuned to the conventions of scientific writing in the disciplines. The Research Writing Tutor (RWT) is one such technology. Grounded in an integrative theoretical framework, it was designed to help students acquire knowledge about the research article genre and develop research writing competence. One of its modules facilitates revision by providing different forms of automated feedback and scaffolding that are genre-based and discipline-specific. This study explores whether and how the features of the RWT may impact revision while writers use this module. Drawing on cognitive writing models, the study investigates the behaviors of a multidisciplinary group of 11 graduate-student writers, exploring how they interacted with the RWT’s features and how this interaction may create conditions for enhanced revision processes and text modifications. Findings demonstrate the promising potential of this automated feedback tool for fostering writers’ metacognitive processing during revision. This research adds to theory on cognitive writing models by acknowledging the evolving role of digital environments in writing practices and offering insights into the future development of automated tools for genre-based writing instruction.
Pub Date: 2020-06-01 | DOI: 10.17239/jowr-2020.12.01.06
AcaWriter: A Learning Analytics Tool for Formative Feedback on Academic Writing
S. Knight, A. Shibani, S. Abel, A. Gibson, P. Ryan
Written communication is an important skill across academia, the workplace, and civic participation. Effective writing incorporates instantiations of particular text structures, or rhetorical moves, that communicate intent to the reader. These rhetorical moves matter across a range of academic styles of writing, including essays and research abstracts, as well as in writing that reflects on learning gained through experience. However, learning how to effectively instantiate and use these rhetorical moves is a challenge, and educators often struggle to provide feedback supporting this learning, particularly at scale. Where effective support is provided, the techniques can be hard to share beyond single implementation sites. We address these challenges through the open-source AcaWriter tool, which provides feedback on rhetorical moves with a design that allows the feedback to be customized for specific contexts. We introduce three example implementations in which we have customized the tool and evaluated it with regard to user perceptions and its impact on student writing. We discuss the tool's general theoretical background and provide a detailed technical account. We conclude with four recommendations that emphasize the potential of collaborative approaches to building, sharing, and evaluating writing tools in research and practice.
Pub Date: 2020-06-01 | DOI: 10.17239/jowr-2020.12.01.04
Implementing Automated Writing Evaluation in Different Instructional Contexts: A Mixed-Methods Study
C. Palermo, Joshua Wilson
There is increasing evidence that automated writing evaluation (AWE) systems support the teaching and learning of writing in meaningful ways. However, little research has explored how AWE may be integrated within different instructional contexts or examined the associated effects on students’ writing performance. This paper describes the AWE system MI Write and presents the results of a mixed-methods study that investigated the integration and implementation of AWE in middle-school writing instruction, examining AWE within both a traditional process approach to writing instruction and strategy instruction based on the Self-Regulated Strategy Development model. Both instructional contexts were evaluated with respect to fostering growth in students’ first-draft writing quality across successive essays, as well as students’ and teachers’ experiences and perceptions of teaching and learning with AWE. Multilevel model analyses indicated that, during an eight-week intervention, students in both instructional contexts exhibited growth in first-draft writing performance, and at comparable rates. Qualitative analyses of interview data revealed that AWE’s influence on instruction was similar across contexts; specifically, the introduction of AWE resulted in both instructional contexts taking on characteristics consistent with a framework for deliberate practice.
Pub Date: 2020-06-01 | DOI: 10.17239/jowr-2020.12.01.09
Digital authoring support for argumentative writing: what does it change?
Kalliopi Benetos, Mireille Bétrancourt