Writing is integral to educational success at all levels and to success in the workplace. However, low literacy is a global challenge, and many students lack the skills needed to write well. With the rapid advance of technology, computer-based tools that provide automated feedback are increasingly being developed. However, research on the effects of writing feedback has so far yielded mixed results, and little is known about how users who approach feedback with different goals make use of its various types.
Leveraging both process and response data, the present study investigated students' interactions with feedback in Writing Mentor (WM), an app designed to support academic writing through automated feedback driven by natural language processing (NLP) techniques.
We identified 1,857 documents submitted "in the wild" by middle- and high-school students and explored both the activities that students engaged in while using WM and the quality of their submitted written products. Students' interaction processes and behaviours were compared across groups of students with different goals for using WM. In addition, sequential pattern mining was applied to identify sequential patterns of interaction with the writing feedback features that occurred with markedly different frequencies across goal groups.
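At its simplest, the differential sequential pattern mining described above can be sketched as counting contiguous interaction subsequences and comparing their support (the fraction of students whose logs contain a pattern) between goal groups. This is a minimal illustrative sketch, not the study's actual algorithm; the event labels and the support-gap threshold below are hypothetical.

```python
from collections import Counter

def contiguous_patterns(sequence, max_len=3):
    """Yield all contiguous subsequences (n-grams) of length 1..max_len."""
    for n in range(1, max_len + 1):
        for i in range(len(sequence) - n + 1):
            yield tuple(sequence[i:i + n])

def pattern_support(sequences, max_len=3):
    """Fraction of sequences in which each pattern appears at least once."""
    counts = Counter()
    for seq in sequences:
        counts.update(set(contiguous_patterns(seq, max_len)))
    total = len(sequences)
    return {p: c / total for p, c in counts.items()}

def differential_patterns(group_a, group_b, min_gap=0.3, max_len=3):
    """Patterns whose support differs between the two groups by >= min_gap.

    Returns a dict mapping pattern -> (support in A) - (support in B).
    """
    sup_a = pattern_support(group_a, max_len)
    sup_b = pattern_support(group_b, max_len)
    diffs = {}
    for p in set(sup_a) | set(sup_b):
        gap = sup_a.get(p, 0.0) - sup_b.get(p, 0.0)
        if abs(gap) >= min_gap:
            diffs[p] = gap
    return diffs

# Hypothetical event logs: each inner list is one student's ordered
# sequence of visits to feedback features (feature names are invented).
revise_group = [["claims", "sources", "errors"],
                ["claims", "errors"],
                ["claims", "sources"]]
explore_group = [["topic_dev", "flow"],
                 ["flow", "topic_dev"],
                 ["topic_dev"]]

diff = differential_patterns(revise_group, explore_group, min_gap=0.5)
```

A positive gap marks a pattern over-represented in the first group; a negative gap marks one over-represented in the second. Real analyses would typically use an established sequential pattern miner and add statistical tests on the support differences.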
Results indicated that students with different goals for using WM displayed different patterns of interaction. Furthermore, the effects of the various types of feedback on writing quality varied with students' purposes for using the tool. Findings from this study shed light on the design of computer-based writing tools and automated writing feedback for learners with varied needs.