{"title":"Student-Facing Learning Analytics Dashboard for Remote Lab Practical Work","authors":"David P. Reid;Timothy D. Drysdale","doi":"10.1109/TLT.2024.3354128","DOIUrl":null,"url":null,"abstract":"The designs of many student-facing learning analytics (SFLA) dashboards are insufficiently informed by educational research and lack rigorous evaluation in authentic learning contexts, including during remote laboratory practical work. In this article, we present and evaluate an SFLA dashboard designed using the principles of formative assessment to provide feedback to students during remote lab activities. Feedback is based upon graphical visualizations of student actions performed during lab tasks and comparison to expected procedures using TaskCompare—our custom, asymmetric graph dissimilarity measure that distinguishes students who miss expected actions from those who perform additional actions, a capability missing in existing graph distance (symmetrical dissimilarity) measures. Using a total of \n<inline-formula><tex-math>$N = 235$</tex-math></inline-formula>\n student graphs collected during authentic learning in two different engineering courses, we describe the validation of TaskCompare and evaluate the impact of the SFLA dashboard on task completion during remote lab activities. In addition, we use components of the motivated strategies for learning questionnaire as covariates for propensity score matching to account for potential bias in self-selection of use of the dashboard. We find that those students who used the SFLA dashboard achieved significantly better task completion rate (nearly double) than those who did not, with a significant difference in TaskCompare score between the two groups (Mann–Whitney \n<inline-formula><tex-math>$U = 453.5$</tex-math></inline-formula>\n, \n<inline-formula><tex-math>$p < 0.01$</tex-math></inline-formula>\n and Cliff's \n<inline-formula><tex-math>$\\delta = 0.43$</tex-math></inline-formula>\n, large effect size). This difference remains after accounting for self-selection. We also report that students' positive rating of the usefulness of the SFLA dashboard for completing lab work is significantly above a neutral response (\n<inline-formula><tex-math>$S = 21.0$</tex-math></inline-formula>\n and \n<inline-formula><tex-math>$p < 0.01$</tex-math></inline-formula>\n). These findings provide evidence that our SFLA dashboard is an effective means of providing formative assessment during remote laboratory activities.","PeriodicalId":49191,"journal":{"name":"IEEE Transactions on Learning Technologies","volume":"17 ","pages":"1037-1050"},"PeriodicalIF":2.9000,"publicationDate":"2024-01-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Learning Technologies","FirstCategoryId":"95","ListUrlMain":"https://ieeexplore.ieee.org/document/10399863/","RegionNum":3,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 0
Abstract
The designs of many student-facing learning analytics (SFLA) dashboards are insufficiently informed by educational research and lack rigorous evaluation in authentic learning contexts, including during remote laboratory practical work. In this article, we present and evaluate an SFLA dashboard designed using the principles of formative assessment to provide feedback to students during remote lab activities. Feedback is based upon graphical visualizations of student actions performed during lab tasks and comparison to expected procedures using TaskCompare, our custom, asymmetric graph dissimilarity measure that distinguishes students who miss expected actions from those who perform additional actions, a capability missing in existing graph distance (symmetric dissimilarity) measures. Using a total of $N = 235$ student graphs collected during authentic learning in two different engineering courses, we describe the validation of TaskCompare and evaluate the impact of the SFLA dashboard on task completion during remote lab activities. In addition, we use components of the Motivated Strategies for Learning Questionnaire as covariates for propensity score matching to account for potential self-selection bias in dashboard use. We find that students who used the SFLA dashboard achieved a significantly better task completion rate (nearly double) than those who did not, with a significant difference in TaskCompare score between the two groups (Mann–Whitney $U = 453.5$, $p < 0.01$, and Cliff's $\delta = 0.43$, a large effect size). This difference remains after accounting for self-selection. We also report that students' positive rating of the usefulness of the SFLA dashboard for completing lab work is significantly above a neutral response ($S = 21.0$, $p < 0.01$). These findings provide evidence that our SFLA dashboard is an effective means of providing formative assessment during remote laboratory activities.
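To illustrate the asymmetry the abstract describes, the following Python sketch scores missed expected actions separately from additional actions when comparing a student's action graph to a reference task graph. The function name, edge representation, and scoring formula are illustrative assumptions, not the published TaskCompare definition.

```python
# Hypothetical sketch of an asymmetric graph dissimilarity in the spirit of
# TaskCompare: missed expected transitions are scored separately from extra
# transitions, so the two error types are not conflated as they would be by a
# symmetric graph edit distance. Names and formula are illustrative only.

def asymmetric_dissimilarity(student_edges, expected_edges):
    """Return (missed, extra) fractions for a student's action graph.

    Both graphs are sets of directed edges, e.g. ("open_valve",
    "read_flowmeter") meaning one action was followed by another.
    """
    student = set(student_edges)
    expected = set(expected_edges)

    # Expected transitions the student never performed.
    missed = len(expected - student) / len(expected) if expected else 0.0
    # Transitions the student performed that are not in the expected procedure.
    extra = len(student - expected) / len(student) if student else 0.0
    return missed, extra


if __name__ == "__main__":
    expected = {("power_on", "calibrate"), ("calibrate", "run_test"),
                ("run_test", "record_data")}
    student = {("power_on", "run_test"),         # skipped calibration
               ("run_test", "record_data"),
               ("record_data", "power_off")}     # additional action
    print(asymmetric_dissimilarity(student, expected))  # (0.667, 0.667)
```

A symmetric measure would assign the same score to a student who skips a step and one who adds an extra step; reporting the two fractions separately preserves that distinction, which is the capability the abstract attributes to TaskCompare.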
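The group comparison statistics reported above (a Mann–Whitney U test and Cliff's delta as a non-parametric effect size) can be computed with standard tools. The sketch below uses made-up scores purely to show the calls; it is not the authors' analysis code or data.

```python
# Illustrative computation of a Mann-Whitney U test and Cliff's delta for two
# groups (dashboard users vs. non-users). Sample values are fabricated.
from scipy.stats import mannwhitneyu


def cliffs_delta(a, b):
    """Cliff's delta: P(a > b) - P(a < b) over all pairs (naive O(n*m))."""
    gt = sum(1 for x in a for y in b if x > y)
    lt = sum(1 for x in a for y in b if x < y)
    return (gt - lt) / (len(a) * len(b))


dashboard_users = [0.90, 0.85, 0.95, 0.80, 0.88]   # hypothetical task scores
non_users = [0.60, 0.70, 0.55, 0.65, 0.75]

u_stat, p_value = mannwhitneyu(dashboard_users, non_users,
                               alternative="two-sided")
print(u_stat, p_value, cliffs_delta(dashboard_users, non_users))
```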
Journal description:
The IEEE Transactions on Learning Technologies covers all advances in learning technologies and their applications, including but not limited to the following topics: innovative online learning systems; intelligent tutors; educational games; simulation systems for education and training; collaborative learning tools; learning with mobile devices; wearable devices and interfaces for learning; personalized and adaptive learning systems; tools for formative and summative assessment; tools for learning analytics and educational data mining; ontologies for learning systems; standards and web services that support learning; authoring tools for learning materials; computer support for peer tutoring; learning via computer-mediated inquiry, field, and lab work; social learning techniques; social networks and infrastructures for learning and knowledge sharing; and creation and management of learning objects.