Rebecca L. Edwards, Sarah K. Davis, A. Hadwin, Todd M. Milford
Prior research offers evidence that differing levels of student engagement are associated with different performance outcomes. In this study, we investigated the efficacy of a model of behavioural and agentic engagement for predicting student performance (low, middle, high) at four timepoints in a semester. The model was significant at all four timepoints. Measures of behavioural and agentic engagement predicted membership in the three performance groups differently. With a few exceptions, these differences were consistent across timepoints. Examining variations in student engagement across time can help target interventions to support student success at the undergraduate level.
{"title":"Using predictive analytics in a self-regulated learning university course to promote student success","authors":"Rebecca L. Edwards, Sarah K. Davis, A. Hadwin, Todd M. Milford","doi":"10.1145/3027385.3029455","DOIUrl":"https://doi.org/10.1145/3027385.3029455","url":null,"abstract":"Prior research offers evidence that differing levels of student engagement are associated with different outcomes in terms of performance. In this study, we investigating the efficacy of a model of behavioural and agentic engagement to predict student performance (low, middle, high) at four timepoints in a semester. The model was significant at all four timepoints. Measures of behavioural and agentic engagement predicted membership across the three groups differently. With a few exceptions, these differences were consistent across timepoints. Looking at variations in student engagement across time can be used to target interventions to support student success at the undergraduate level.","PeriodicalId":160897,"journal":{"name":"Proceedings of the Seventh International Learning Analytics & Knowledge Conference","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116210094","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The aim of this paper is to show how multimodal learning analytics (MMLA) can help us understand how elementary students explore the concept of feedback loops while controlling an embodied simulation of a predator-prey ecosystem, using hand movements as an interface with the computer simulation. We represent student motion patterns from fine-grained logs of hand and gaze data, and then map these observed motion patterns against levels of student performance to make inferences about how embodiment plays a role in the learning process. Results show five distinct motion sequences in students' embodied interactions, and these motion patterns are statistically associated with initial and post-tutorial levels of students' understanding of feedback loops. Analysis of student gaze also shows distinctive patterns in how low- and high-performing students attended to information presented in the simulation. Using MMLA, we show how students' explanations of feedback loops differ according to cluster membership, which provides evidence that embodiment interacts with conceptual understanding.
{"title":"Understanding student learning trajectories using multimodal learning analytics within an embodied-interaction learning environment","authors":"Alejandro Andrade","doi":"10.1145/3027385.3027429","DOIUrl":"https://doi.org/10.1145/3027385.3027429","url":null,"abstract":"The aim of this paper is to show how multimodal learning analytics (MMLA) can help understand how elementary students explore the concept of feedback loops while controlling an embodied simulation of a predator-prey ecosystem using hand movements as an interface with the computer simulation. We represent student motion patterns from fine-grained logs of hands and gaze data, and then map these observed motion patterns against levels of student performance to make inferences about how embodiment plays a role in the learning process. Results show five distinct motion sequences in students' embodied interactions, and these motion patterns are statistically associated with initial and post-tutorial levels of students' understanding of feedback loops. Analysis of student gaze also shows distinctive patterns as to how low- and high-performing students attended to information presented in the simulation. Using MMLA, we show how students' explanations of feedback loops look differently according to cluster membership, which provides evidence that embodiment interacts with conceptual understanding.","PeriodicalId":160897,"journal":{"name":"Proceedings of the Seventh International Learning Analytics & Knowledge Conference","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116479847","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Kimberly E. Arnold, Brandon Karcher, Casey V. Wright, J. McKay
The purpose of this paper is to examine the cross-institutional use of a quantified-self application called Pattern, which is designed to promote self-regulation and reflective learning. This paper provides a brief look at how learners report spending their time and how they react to in-app recommendations. Initial data are encouraging; however, Pattern has limitations, and additional research and development must be undertaken.
{"title":"Student empowerment, awareness, and self-regulation through a quantified-self student tool","authors":"Kimberly E. Arnold, Brandon Karcher, Casey V. Wright, J. McKay","doi":"10.1145/3027385.3029434","DOIUrl":"https://doi.org/10.1145/3027385.3029434","url":null,"abstract":"The purpose of this paper is to examine the cross institutional use of a quantified-self application called Pattern, which is designed to promote self-regulation and reflective learning in learners. This paper provides a brief look into how learners report spending their time and react to in-app recommendations. Initial data is encouraging; however, there are limitations of Pattern, and additional research and development must be undertaken.","PeriodicalId":160897,"journal":{"name":"Proceedings of the Seventh International Learning Analytics & Knowledge Conference","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124153705","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
M. Bowe, Weiqin Chen, D. Griffiths, T. Hoel, Jaeho Lee, H. Ogata, Griff Richards, Li Yuan, Jingjing Zhang
The Learning Analytics and Policy (LAP) workshop explores and documents the ways in which policies at national and regional level are shaping the development of learning analytics. It brings together representatives from around the world who report on the circumstances in their own country. The workshop is preceded by an information gathering phase, and followed by the authoring of a report. The aspirations, achievements and constraints in the different countries are contrasted and documented, providing a valuable resource for the future development of learning analytics.
{"title":"Learning analytics and policy (LAP): international aspirations, achievements and constraints","authors":"M. Bowe, Weiqin Chen, D. Griffiths, T. Hoel, Jaeho Lee, H. Ogata, Griff Richards, Li Yuan, Jingjing Zhang","doi":"10.1145/3027385.3029436","DOIUrl":"https://doi.org/10.1145/3027385.3029436","url":null,"abstract":"The Learning Analytics and Policy (LAP) workshop explores and documents the ways in which policies at national and regional level are shaping the development of learning analytics. It brings together representatives from around the world who report on the circumstances in their own country. The workshop is preceded by an information gathering phase, and followed by the authoring of a report. The aspirations, achievements and constraints in the different countries are contrasted and documented, providing a valuable resource for the future development of learning analytics.","PeriodicalId":160897,"journal":{"name":"Proceedings of the Seventh International Learning Analytics & Knowledge Conference","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128063204","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Among all of the learners in Massive Open Online Courses (MOOCs) who intend to complete a course, the majority fail to do so. This intention-action gap is found in many domains of human experience, and research in similar goal-pursuit domains suggests that plan-making is a cheap and effective nudge to encourage follow-through. In a natural field experiment in three HarvardX courses, some students received open-ended planning prompts at the beginning of a course. These prompts increased course completion by 29% and payment for certificates by 40%. The effect was largest for students enrolled in traditional schools. Furthermore, the contents of students' plans could predict which students were least likely to succeed; in particular, students whose plans focused on specific times were unlikely to complete the course. Our results suggest that planning prompts can help learners adopt productive frames of mind at the outset of a learning goal that encourage and forecast student success.
{"title":"Planning prompts increase and forecast course completion in massive open online courses","authors":"M. Yeomans, J. Reich","doi":"10.1145/3027385.3027416","DOIUrl":"https://doi.org/10.1145/3027385.3027416","url":null,"abstract":"Among all of the learners in Massive Open Online Courses (MOOCs) who intend to complete a course, the majority fail to do so. This intention-action gap is found in many domains of human experience, and research in similar goal pursuit domains suggests that plan-making is a cheap and effective nudge to encourage follow-through. In a natural field experiment in three HarvardX courses, some students received open-ended planning prompts at the beginning of a course. These prompts increased course completion by 29%, and payment for certificates by 40%. This effect was largest for students enrolled in traditional schools. Furthermore, the contents of students' plans could predict which students were least likely to succeed - in particular, students whose plans focused on specific times were unlikely to complete the course. Our results suggest that planning prompts can help learners adopted productive frames of mind at the outset of a learning goal that encourage and forecast student success.","PeriodicalId":160897,"journal":{"name":"Proceedings of the Seventh International Learning Analytics & Knowledge Conference","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132179738","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This paper explores the potential of using naïve topic modeling to support instructors in navigating MOOC discussion forums. Categorizing discussion threads into topics can provide an overview of the discussion, improve navigation of the forum, and support replying to a representative sample of content-related posts. We investigate four different approaches to using topic models to organize and present discussion posts, highlighting the strengths and weaknesses of each approach for supporting instructors.
{"title":"Topic models to support instructors in MOOC forums","authors":"Jovita M. Vytasek, A. Wise, Sonya Woloshen","doi":"10.1145/3027385.3029486","DOIUrl":"https://doi.org/10.1145/3027385.3029486","url":null,"abstract":"This paper explores the potential of using naïve topic modeling to support instructors in navigating MOOC discussion forums. Categorizing discussion threads into topics can provide an overview of the discussion, improve navigation of the forum, and support replying to a representative sample of content related posts. We investigate four different approaches to using topic models to organize and present discussion posts, highlighting the strength and weaknesses of each approach to support instructors.","PeriodicalId":160897,"journal":{"name":"Proceedings of the Seventh International Learning Analytics & Knowledge Conference","volume":"8 17","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132272063","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Predicting the decrease of students' engagement in typical MOOC tasks, such as watching lecture videos or submitting assignments, is key to triggering timely interventions that aim to prevent disengagement before it takes place. This paper proposes an approach to build the necessary predictive models using student data that becomes available during a course. The approach was employed in an experimental study to predict the decrease of three different engagement indicators in a MOOC. The results suggest the approach is feasible, with area-under-the-curve values for the different predictors ranging from 0.718 to 0.914.
{"title":"Predicting the decrease of engagement indicators in a MOOC","authors":"Miguel L. Bote-Lorenzo, E. Gómez-Sánchez","doi":"10.1145/3027385.3027387","DOIUrl":"https://doi.org/10.1145/3027385.3027387","url":null,"abstract":"Predicting the decrease of students' engagement in typical MOOC tasks such as watching lecture videos or submitting assignments is key to trigger timely interventions in order to try to avoid the disengagement before it takes place. This paper proposes an approach to build the necessary predictive models using students' data that becomes available during a course. The approach was employed in an experimental study to predict the decrease of three different engagement indicators in a MOOC. The results suggest its feasibility with values of area under the curve for different predictors ranging from 0.718 to 0.914.","PeriodicalId":160897,"journal":{"name":"Proceedings of the Seventh International Learning Analytics & Knowledge Conference","volume":"61 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129254159","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
There have been numerous efforts to increase students' academic success. One data-driven approach is to highlight the important learning objectives in a course. In this paper, we used visualization and three feature selection methods to highlight the important learning objectives in a course. Identifying the important learning objectives, as well as the relations among them, has multiple educational advantages. First, it informs instructors and students of the important topics in the course; students will not be successful without learning these properly. Second, it highlights any inconsistencies among how the learning objectives are defined, how they are assessed, and the design of the course. Thus, this approach can be used as a course design diagnostic tool.
{"title":"Utilizing visualization and feature selection methods to identify important learning objectives in a course","authors":"Farshid Marbouti, H. Diefes‐Dux, K. Madhavan","doi":"10.1145/3027385.3029450","DOIUrl":"https://doi.org/10.1145/3027385.3029450","url":null,"abstract":"There have been numerous efforts to increase students' academic success. One data-driven approach is to highlight the important learning objectives in a course. In this paper, we used visualization and three feature selection methods to highlight the important learning objectives in a course. Identifying important learning objectives as well as the relation among the learning objectives have multiple educational advantages. First, it informs the instructors and students of the important topics in the course; without learning them properly students will not be successful. Second, it highlights any inconsistencies in defining the learning objective, how they are being assessed, and design of the course. Thus, this approach can be used as a course design diagnostic tool.","PeriodicalId":160897,"journal":{"name":"Proceedings of the Seventh International Learning Analytics & Knowledge Conference","volume":"112 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121358041","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
As higher education increasingly moves to online and digital learning spaces, we have access not only to greater volumes of student data, but also to increasingly fine-grained and nuanced data. A significant body of research and existing practice is used to convince key stakeholders within higher education of the potential of the collection, analysis and use of student data to positively impact student experiences in these environments. Much of the recent focus in learning analytics is on predictive modeling and uses of artificial intelligence both to identify learners at risk and to personalize interventions to increase the chance of success. In this paper we explore the moral and legal basis for the obligation to act on our analyses of student data. The obligation to act entails not only the protection of student privacy and the ethical collection, analysis and use of student data, but also the effective allocation of resources to ensure appropriate and effective interventions that increase effective teaching and learning. The obligation to act is, however, tempered by a number of factors, including inter- and intra-departmental operational fragmentation and the constraints imposed by changing funding regimes. Increasingly, higher education institutions allocate resources in areas that promise the greatest return. Choosing (not) to respond to the needs of specific student populations then raises questions regarding the scope and nature of the moral and legal obligation to act. There is also evidence that students who are at risk of failing often do not respond to institutional interventions to assist them. In this paper we build and expand on recent research by, for example, the LACE and EP4LA workshops to conceptually map the obligation to act, which flows from both higher education's mandate to ensure effective and appropriate teaching and learning and its fiduciary duty to provide an ethical and enabling environment for students to achieve success. We examine how the collection and analysis of student data links to the availability of resources, the will to act, and the obligation to act. Further, we examine how that obligation unfolds in two open distance education providers from the perspective of a key set of stakeholders, those in immediate contact with students and their learning journeys: the tutors or adjunct faculty.
{"title":"An elephant in the learning analytics room: the obligation to act","authors":"P. Prinsloo, Sharon Slade","doi":"10.1145/3027385.3027406","DOIUrl":"https://doi.org/10.1145/3027385.3027406","url":null,"abstract":"As higher education increasingly moves to online and digital learning spaces, we have access not only to greater volumes of student data, but also to increasingly fine-grained and nuanced data. A significant body of research and existing practice are used to convince key stakeholders within higher education of the potential of the collection, analysis and use of student data to positively impact on student experiences in these environments. Much of the recent focus in learning analytics is around predictive modeling and uses of artificial intelligence to both identify learners at risk, and to personalize interventions to increase the chance of success. In this paper we explore the moral and legal basis for the obligation to act on our analyses of student data. The obligation to act entails not only the protection of student privacy and the ethical collection, analysis and use of student data, but also, the effective allocation of resources to ensure appropriate and effective interventions to increase effective teaching and learning. The obligation to act is, however tempered by a number of factors, including inter and intra-departmental operational fragmentation and the constraints imposed by changing funding regimes. Increasingly higher education institutions allocate resources in areas that promise the greatest return. Choosing (not) to respond to the needs of specific student populations then raises questions regarding the scope and nature of the moral and legal obligation to act. There is also evidence that students who are at risk of failing often do not respond to institutional interventions to assist them. In this paper we build and expand on recent research by, for example, the LACE and EP4LA workshops to conceptually map the obligation to act which flows from both higher education's mandate to ensure effective and appropriate teaching and learning and its fiduciary duty to provide an ethical and enabling environment for students to achieve success. We examine how the collection and analysis of student data links to both the availability of resources and the will to act and also to the obligation to act. Further, we examine how that obligation unfolds in two open distance education providers from the perspective of a key set of stakeholders - those in immediate contact with students and their learning journeys - the tutors or adjunct faculty.","PeriodicalId":160897,"journal":{"name":"Proceedings of the Seventh International Learning Analytics & Knowledge Conference","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123726313","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Where is the evidence for learning analytics? In particular, where is the evidence that it improves learning in practice? Can we rely on it? Currently, there are vigorous debates about the quality of research evidence in medicine and psychology, with particular issues around statistical good practice, the 'file drawer effect', and ways in which incentives for stakeholders in the research process reward the quantity of research produced rather than its quality. In this paper, we present the Learning Analytics Community Exchange (LACE) project's Evidence Hub, an effort to relate research evidence in learning analytics to four propositions about learning analytics: whether they support learning, support teaching, are deployed widely, and are used ethically. Surprisingly little evidence in this strong, specific sense was found, and very little of it was negative (7%, N=123), suggesting that learning analytics is not immune from the pressures seen in other areas. We explore the evidence in one particular area in detail (whether learning analytics improve teaching and learner support in the university sector), and set out some of the weaknesses of the available evidence. We conclude that there is considerable scope for improving the evidence base for learning analytics, and we set out some suggestions of ways for various stakeholders to achieve this.
{"title":"Where is the evidence?: a call to action for learning analytics","authors":"Rebecca Ferguson, D. Clow","doi":"10.1145/3027385.3027396","DOIUrl":"https://doi.org/10.1145/3027385.3027396","url":null,"abstract":"Where is the evidence for learning analytics? In particular, where is the evidence that it improves learning in practice? Can we rely on it? Currently, there are vigorous debates about the quality of research evidence in medicine and psychology, with particular issues around statistical good practice, the 'file drawer effect', and ways in which incentives for stakeholders in the research process reward the quantity of research produced rather than the quality. In this paper, we present the Learning Analytics Community Exchange (LACE) project's Evidence Hub, an effort to relate research evidence in learning analytics to four propositions about learning analytics: whether they support learning, support teaching, are deployed widely, and are used ethically. Surprisingly little evidence in this strong, specific sense was found, and very little was negative (7%, N=123), suggesting that learning analytics is not immune from the pressures in other areas. We explore the evidence in one particular area in detail (whether learning analytics improve teaching and learners support in the university sector), and set out some of the weaknesses of the evidence available. We conclude that there is considerable scope for improving the evidence base for learning analytics, and set out some suggestions of ways for various stakeholders to achieve this.","PeriodicalId":160897,"journal":{"name":"Proceedings of the Seventh International Learning Analytics & Knowledge Conference","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123728358","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}