Anderson Pinheiro Cavalcanti, A. Diego, R. F. Mello, Katerina Mangaroska, André C. A. Nascimento, F. Freitas, D. Gašević
Feedback is a crucial element in helping students identify gaps and assess their learning progress. In online courses, feedback becomes even more critical because it is one of the channels through which the teacher interacts directly with the student. However, with the growing number of students enrolled in online learning, it becomes a challenge for instructors to provide good-quality feedback that helps students self-regulate. In this context, this paper proposes a content analysis of feedback text provided by instructors, based on different indicators of good feedback. A random forest classifier was trained and evaluated at different feedback levels, achieving up to 87% accuracy and a Cohen's κ of 0.39. The paper also provides insights into the most influential textual features for predicting feedback quality.
"How good is my feedback?: a content analysis of written feedback." Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, 2020-03-23. DOI: 10.1145/3375462.3375477.
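The abstract reports both accuracy and Cohen's κ; κ corrects raw agreement for chance, which is why the two numbers can diverge so sharply (87% vs. 0.39). A minimal pure-Python sketch of the computation, on made-up feedback-quality labels rather than the paper's data:

```python
from collections import Counter

def cohens_kappa(y_true, y_pred):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    assert len(y_true) == len(y_pred)
    n = len(y_true)
    observed = sum(t == p for t, p in zip(y_true, y_pred)) / n
    # Chance agreement: sum over labels of the product of marginal frequencies.
    freq_true = Counter(y_true)
    freq_pred = Counter(y_pred)
    expected = sum(freq_true[c] * freq_pred[c] for c in freq_true) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical feedback-quality labels (0 = low quality, 1 = high quality).
gold = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
pred = [1, 1, 0, 0, 0, 1, 1, 1, 1, 1]
acc = sum(g == p for g, p in zip(gold, pred)) / len(gold)  # 0.8
kappa = cohens_kappa(gold, pred)  # well below 0.8 after chance correction
```

High accuracy with modest κ, as in the paper, typically indicates imbalanced labels: a classifier can agree with the majority class often by chance alone.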
Rita Prestigiacomo, R. Hadgraft, J. Hunter, Lori Lockyer, Simon Knight, E. V. D. Hoven, Roberto Martínez Maldonado
Teachers are increasingly being encouraged to embrace evidence-based practices. Learning analytics (LA) offer great promise in supporting such practices by providing evidence that teachers and learners can use to make informed decisions and transform the educational experience. However, the limitations of LA and their uptake by educators are coming under critical scrutiny, in part due to the lack of involvement of teachers and learners in the design of LA tools. In this paper, we propose a human-centred approach to understanding teachers' data needs through the lens of three key principles of translucence: visibility, awareness and accountability. We illustrate our approach through a participatory design sprint aimed at identifying how teachers talk about classroom data. We describe teachers' perspectives on the evidence they need to make better-informed decisions and discuss the implications of our approach for the design of human-centred LA in the coming years.
"Learning-centred translucence: an approach to understand how teachers talk about classroom data." Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, 2020-03-23. DOI: 10.1145/3375462.3375475.
Hana Vrzakova, M. J. Amon, Angela E. B. Stewart, Nicholas D. Duran, S. D’Mello
Collaborative problem solving (CPS) in virtual environments is an increasingly important context of 21st century learning. However, our understanding of this complex and dynamic phenomenon is still limited. Here, we examine unimodal primitives (activity on the screen, speech, and body movements) and their multimodal combinations during remote CPS. We analyze two datasets in which 116 triads collaboratively engaged in a challenging visual programming task using video conferencing software. We investigate how UI interactions, behavioral primitives, and multimodal patterns were associated with teams' subjective and objective performance outcomes. We found that idling with limited speech (i.e., silence or backchannel feedback only) and without movement was negatively correlated with task performance and with participants' subjective perceptions of the collaboration. However, being silent and focused during solution execution was positively correlated with task performance. Results illustrate that in some cases the multimodal patterns improved predictive and explanatory power over the unimodal primitives. We discuss how the findings can inform the design of real-time interventions for remote CPS.
"Focused or stuck together: multimodal patterns reveal triads' performance in collaborative problem solving." Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, 2020-03-23. DOI: 10.1145/3375462.3375467.
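The reported links between behavioral measures and performance are correlational. As an illustration of the kind of computation involved, a small Pearson-correlation sketch over invented per-team values (not the paper's data or its exact features):

```python
import math

def pearson(x, y):
    """Pearson correlation between a behavioral feature and a performance score."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-team values: proportion of idle-and-silent time vs. task score.
idle_silent = [0.1, 0.3, 0.5, 0.7]
score = [0.9, 0.7, 0.6, 0.2]
r = pearson(idle_silent, score)  # strongly negative, matching the paper's direction
```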
Kirsty Kitto, Nikhil Sarathy, Aleksandr Gromov, Ming Liu, Katarzyna Musial, S. B. Shum
In an era that will increasingly depend upon lifelong learning, the LA community will need to facilitate the movement and sharing of data and information across institutional and geographic boundaries. This will help us to recognise prior learning (RPL) and to personalise the learner experience. Here, we explore the utility of skills-based curriculum analytics and how it might facilitate the process of awarding RPL between two institutions. We explore the potential utility of combining natural language processing and skills taxonomies to map between subject descriptions for these two different institutions, presenting two algorithms we have developed to facilitate RPL and evaluating their performance. We draw attention to some of the issues that arise, listing areas that we consider ripe for future work in a surprisingly underexplored area.
"Towards skills-based curriculum analytics: can we automate the recognition of prior learning?" Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, 2020-03-13. DOI: 10.1145/3375462.3375526.
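The abstract does not disclose the two mapping algorithms, but a common baseline for matching subject descriptions across institutions is bag-of-words cosine similarity. A hedged sketch with invented subject descriptions (real systems would add a skills taxonomy and more robust text processing):

```python
import math
from collections import Counter

def cosine_similarity(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def best_match(subject, candidates):
    """Map one subject description to its closest counterpart at another institution."""
    vec = Counter(subject.lower().split())
    scored = [(cosine_similarity(vec, Counter(c.lower().split())), c) for c in candidates]
    return max(scored)  # (similarity, matched description)

# Hypothetical subject descriptions from two institutions.
home = "introductory programming in python covering loops functions and data structures"
away = [
    "foundations of programming with python loops functions data structures",
    "organic chemistry reactions and laboratory techniques",
]
score, match = best_match(home, away)  # picks the programming subject
```

A threshold on the similarity score would then decide whether the match is strong enough to recommend awarding RPL.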
This paper describes the design and evaluation of personalized visualizations to support young learners' Self-Regulated Learning (SRL) in Adaptive Learning Technologies (ALTs). Our learning path app combines three Personalized Visualizations (PV) that are designed as an external reference to support learners' internal regulation process. The personalized visualizations are based on three pillars: grounding in SRL theory, the usage of trace data, and the provision of clear, actionable recommendations for learners to improve regulation. This quasi-experimental pre-posttest study finds that learners in the personalized visualization condition improved the regulation of their practice behavior, as indicated by higher accuracy and less complex moment-by-moment learning curves compared to learners in the control group. Learners in the PV condition also showed better transfer of learning. Finally, students in the personalized visualizations condition were more likely to under-estimate than over-estimate their performance. Overall, these findings indicate that the personalized visualizations improved the regulation of practice behavior and transfer of learning, and changed the bias in relative monitoring accuracy.
I. Molenaar, A. Horvers, R. Dijkstra, R. Baker. "Personalized visualizations to promote young learners' SRL: the learning path app." Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, 2020-03-13. DOI: 10.1145/3375462.3375465.
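The under- vs. over-estimation finding concerns the sign of monitoring bias. One common operationalization is the mean signed difference between judged and actual performance, sketched here with invented scores (the paper's exact measure may differ):

```python
def monitoring_bias(judgments, performance):
    """Mean signed difference between self-judged and actual scores.

    Negative = under-estimation, positive = over-estimation.
    """
    assert len(judgments) == len(performance)
    return sum(j - p for j, p in zip(judgments, performance)) / len(judgments)

# Hypothetical per-student judged vs. actual scores on a 0-1 scale.
judged = [0.6, 0.5, 0.7]
actual = [0.8, 0.6, 0.7]
bias = monitoring_bias(judged, actual)  # negative: students under-estimate
```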
Massive Open Online Courses (MOOCs), delivered on platforms such as edX and Coursera, have led to a surge in large-scale learning research. MOOC platforms gather a continuous stream of learner traces, which can amount to several gigabytes per MOOC, that learning analytics researchers use to conduct exploratory analyses as well as to evaluate deployed interventions. edX has proven to be a popular platform for such experiments, as the data each MOOC generates is easily accessible to the institution running the MOOC. One of the issues researchers face is the preprocessing, cleaning and formatting of those large-scale learner traces: a tedious process that requires considerable computational skills. To reduce this burden, a number of tools have been proposed and released with the aim of simplifying this process. However, those tools still have a significant setup cost, are already out of date, or require preprocessed data as a starting point. In contrast, in this paper we introduce ELAT, the edX Log file Analysis Tool, which is browser-based (i.e., no setup costs), keeps the data local (i.e., no server is necessary and the privacy-sensitive learner data is not sent anywhere) and takes edX data dumps as input. ELAT not only processes the raw data, but also generates semantically meaningful units (learner sessions instead of just click events) that are visualized in various ways (learning paths, forum participation, video watching sequences). We report on two evaluations: (i) a technological evaluation and (ii) a user study with potential end users of ELAT. ELAT is open-source and available at https://mvallet91.github.io/ELAT/.
Manuel Valle Torre, Esther Tan, C. Hauff. "edX log data analysis made easy: introducing ELAT: An open-source, privacy-aware and browser-based edX log data analysis tool." Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, 2020-03-13. DOI: 10.1145/3375462.3375510.
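ELAT's key step of turning raw click events into learner sessions is typically done by splitting each learner's event stream at inactivity gaps. A simplified sketch (the tuple format and the 30-minute threshold are illustrative assumptions, not ELAT's actual schema):

```python
from datetime import datetime, timedelta

def sessionize(events, gap_minutes=30):
    """Group per-learner click events into sessions split by an inactivity gap.

    `events` is a list of (learner_id, ISO timestamp) tuples, as might be
    parsed from an edX log dump.
    """
    by_learner = {}
    for learner, ts in sorted(events):  # ISO timestamps sort chronologically
        by_learner.setdefault(learner, []).append(datetime.fromisoformat(ts))
    gap = timedelta(minutes=gap_minutes)
    sessions = {}
    for learner, stamps in by_learner.items():
        runs = [[stamps[0]]]
        for t in stamps[1:]:
            if t - runs[-1][-1] > gap:
                runs.append([t])   # long pause: start a new session
            else:
                runs[-1].append(t)
        sessions[learner] = runs
    return sessions

log = [
    ("u1", "2020-03-01T10:00:00"),
    ("u1", "2020-03-01T10:05:00"),
    ("u1", "2020-03-01T12:00:00"),  # almost two hours later: new session
    ("u2", "2020-03-01T09:00:00"),
]
s = sessionize(log)
```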
Korah J. Wiley, Y. Dimitriadis, Alison Bradford, Marcia C. Linn
The effectiveness of using learning analytics for learning design primarily depends upon two concepts: grounding and alignment. This is the primary conjecture for the study described in this paper. In our design-based research study, we design, test, and evaluate teacher-facing learning analytics for an online inquiry science unit on global climate change. We design our learning analytics in accordance with a socioconstructivism-based pedagogical framework, called Knowledge Integration, and the principles of learning analytics Implementation Design. Our methodology for the design process draws upon the principles of the Orchestrating for Learning Analytics framework to engage stakeholders (i.e., teachers, researchers, and developers). The resulting learning analytics were aligned to unit activities that engaged students in key aspects of the knowledge integration process. They provided teachers with actionable insight into their students' understanding at critical junctures in the learning process. We demonstrate the efficacy of the learning analytics in supporting the optimization of the unit's learning design. We conclude by synthesizing the principles that guided our design process into a framework for developing and evaluating learning analytics for learning design.
"From theory to action: developing and evaluating learning analytics for learning design." Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, 2020-03-13. DOI: 10.1145/3375462.3375540.
Gökhan Akçapınar, Mei-Rong Alice Chen, Rwitajit Majumdar, B. Flanagan, H. Ogata
In this paper, we aim to explore students' study approaches (e.g., deep, strategic, surface) from the logs collected by an electronic textbook (eBook) system. Data were collected from 89 students on their reading activities, both in and out of class, in a Freshman English course. Students were given a task to study reading materials through the eBook system, highlight the text related to the main or supporting ideas, and answer questions prepared to measure their level of comprehension. Students' in-class and out-of-class reading times and their usage of the marker feature were used as a proxy for their study approaches. We combined theory-driven and data-driven approaches to model students' study approaches. Our results showed that three groups of students with different study approaches could be identified. Relationships between students' reading behaviors and their academic performance were also investigated using association rule mining. The results are discussed in terms of monitoring, feedback, predicting learning outcomes, and identifying problems with the content design.
"Exploring student approaches to learning through sequence analysis of reading logs." Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, 2020-03-13. DOI: 10.1145/3375462.3375492.
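Association rule mining, as used in the study above, scores rules such as "deep reading implies high grade" by support and confidence. A tiny pure-Python sketch over invented behavior labels (the study presumably used a full Apriori-style implementation over richer transactions):

```python
from itertools import combinations

def mine_rules(transactions, min_support=0.5, min_confidence=0.7):
    """Mine one-antecedent, one-consequent association rules from label sets."""
    n = len(transactions)
    items = {i for t in transactions for i in t}

    def support(itemset):
        return sum(itemset <= t for t in transactions) / n

    out = []
    for a, b in combinations(sorted(items), 2):
        for ante, cons in ((a, b), (b, a)):
            s = support({ante, cons})
            s_ante = support({ante})
            if s >= min_support and s_ante and s / s_ante >= min_confidence:
                out.append((ante, cons, s, s / s_ante))  # rule, support, confidence
    return out

# Hypothetical behavior/outcome labels per student.
data = [
    {"deep_reading", "high_grade"},
    {"deep_reading", "high_grade"},
    {"surface_reading", "low_grade"},
    {"deep_reading", "high_grade"},
]
found = mine_rules(data)  # links deep_reading and high_grade in both directions
```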
The Additive Factors Model (AFM) is a widely used student model, primarily applied to the refinement of knowledge component models (Q-matrices). We explore the robustness and generalizability of the model. We explicitly formulate the simplifying assumptions the model makes and discuss methods for visualizing learning curves based on it. We also report on an application of the model to data from a learning system for introductory programming; these experiments illustrate potentially misleading interpretations of model results due to differences in item difficulty. Overall, our results show that greater care must be taken in applying the model and in interpreting the results obtained with it.
Tomáš Effenberger, Radek Pelánek, Jaroslav Čechák. "Exploration of the robustness and generalizability of the additive factors model." Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, 2020-03-13. DOI: 10.1145/3375462.3375491.
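For reference, AFM models the probability of a correct response as a logistic function of student proficiency plus, for each knowledge component (KC) an item exercises, an easiness term and a learning-rate term scaled by prior practice opportunities. A sketch with invented parameters (not fitted values from the paper):

```python
import math

def afm_p_correct(theta, betas, gammas, q_row, opportunities):
    """Additive Factors Model: P(correct) for one student on one item.

    theta: student proficiency; betas/gammas: per-KC easiness and learning rate;
    q_row: the item's Q-matrix row (KC indicators);
    opportunities: prior practice counts per KC.
    """
    logit = theta + sum(
        q * (b + g * t)
        for q, b, g, t in zip(q_row, betas, gammas, opportunities)
    )
    return 1 / (1 + math.exp(-logit))

betas = [-1.0, 0.5]   # KC easiness (illustrative)
gammas = [0.3, 0.1]   # KC learning rates (illustrative)
q_row = [1, 0]        # item exercises only the first KC
# Predicted learning curve over five practice opportunities.
curve = [afm_p_correct(0.0, betas, gammas, q_row, [t, 0]) for t in range(5)]
```

Plotting such predicted curves against empirical success rates is the usual way to visualize learning curves with AFM, which is where the paper's item-difficulty caveat bites: items of unequal difficulty mapped to one KC can distort the aggregate curve.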
Yingbin Zhang, L. Paquette, R. Baker, Jaclyn L. Ocumpaugh, Nigel Bosch, Anabil Munshi, G. Biswas
Confusion has been shown to be prevalent during complex learning and has mixed effects on learning. Whether confusion facilitates or hampers learning may depend on whether it is resolved. Confusion resolution, which reflects the resolution of cognitive disequilibrium, requires learners to possess certain skills, but it is unclear what these skills are. One possibility is metacognitive strategies (MS), i.e., strategies for regulating cognition. This study examined the relationship between confusion and MS-related actions in Betty's Brain, a computer-based learning environment. The results revealed that MS behavior differed during and outside episodes of confusion. However, confusion resolution was not related to MS behavior, and MS did not moderate the effect of confusion on learning.
"The relationship between confusion and metacognitive strategies in Betty's Brain." Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, 2020-03-13. DOI: 10.1145/3375462.3375518.