Student-Facing Learning Analytics Dashboard for Remote Lab Practical Work

IF 2.9 · CAS Region 3 (Education) · JCR Q2 (Computer Science, Interdisciplinary Applications) · IEEE Transactions on Learning Technologies · Pub Date: 2024-01-15 · DOI: 10.1109/TLT.2024.3354128
David P. Reid;Timothy D. Drysdale
IEEE Transactions on Learning Technologies, vol. 17, pp. 1037–1050 (Journal Article).
Citations: 0

Abstract

The designs of many student-facing learning analytics (SFLA) dashboards are insufficiently informed by educational research and lack rigorous evaluation in authentic learning contexts, including during remote laboratory practical work. In this article, we present and evaluate an SFLA dashboard designed using the principles of formative assessment to provide feedback to students during remote lab activities. Feedback is based upon graphical visualizations of student actions performed during lab tasks and comparison to expected procedures using TaskCompare—our custom, asymmetric graph dissimilarity measure that distinguishes students who miss expected actions from those who perform additional actions, a capability missing in existing graph distance (symmetrical dissimilarity) measures. Using a total of $N = 235$ student graphs collected during authentic learning in two different engineering courses, we describe the validation of TaskCompare and evaluate the impact of the SFLA dashboard on task completion during remote lab activities. In addition, we use components of the motivated strategies for learning questionnaire as covariates for propensity score matching to account for potential bias in self-selection of use of the dashboard. We find that those students who used the SFLA dashboard achieved a significantly better task completion rate (nearly double) than those who did not, with a significant difference in TaskCompare score between the two groups (Mann–Whitney $U = 453.5$, $p < 0.01$ and Cliff's $\delta = 0.43$, a large effect size). This difference remains after accounting for self-selection. We also report that students' positive rating of the usefulness of the SFLA dashboard for completing lab work is significantly above a neutral response ($S = 21.0$ and $p < 0.01$). These findings provide evidence that our SFLA dashboard is an effective means of providing formative assessment during remote laboratory activities.
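The abstract describes TaskCompare only at a high level: it is asymmetric, so it can separate "missed an expected action" from "performed an extra action", which a symmetric distance collapses into one number. The paper's actual definition operates on graphs and is not given here; the following is a hypothetical, set-based sketch of the asymmetry idea only, with all names and the simplification to labelled action sets being our own assumptions.

```python
# Hypothetical sketch of an *asymmetric* dissimilarity over action sets.
# This is NOT the paper's TaskCompare definition (which works on graphs);
# it only illustrates why asymmetry matters for formative feedback.

def asymmetric_dissimilarity(student_actions, expected_actions):
    """Return (missing, extra) fractions relative to the expected procedure.

    missing: share of expected actions the student never performed.
    extra:   share of the student's actions not in the expected procedure.
    Keeping these separate tells the student *how* they deviated.
    """
    student = set(student_actions)
    expected = set(expected_actions)
    missing = len(expected - student) / len(expected) if expected else 0.0
    extra = len(student - expected) / len(student) if student else 0.0
    return missing, extra


def jaccard_distance(a, b):
    """Symmetric baseline: cannot tell 'missed steps' from 'added steps'."""
    a, b = set(a), set(b)
    union = a | b
    return 1 - len(a & b) / len(union) if union else 0.0


expected = ["connect", "calibrate", "measure", "record"]
skipped = ["connect", "measure", "record"]                     # missed a step
added = ["connect", "calibrate", "measure", "record", "reset"]  # extra step

# Both students deviate, but in opposite directions; the pair of numbers
# distinguishes them, whereas a single symmetric distance would not:
print(asymmetric_dissimilarity(skipped, expected))  # (0.25, 0.0)
print(asymmetric_dissimilarity(added, expected))    # (0.0, 0.2)
```

A symmetric measure such as the Jaccard baseline above would give both students a nonzero score of the same kind, so the dashboard could not tailor its feedback ("you skipped calibration" vs. "you performed an unneeded reset"); the asymmetric pair can.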
Source Journal
IEEE Transactions on Learning Technologies
CiteScore: 7.50
Self-citation rate: 5.40%
Articles per year: 82
Review time: >12 weeks
Journal description: The IEEE Transactions on Learning Technologies covers all advances in learning technologies and their applications, including but not limited to the following topics: innovative online learning systems; intelligent tutors; educational games; simulation systems for education and training; collaborative learning tools; learning with mobile devices; wearable devices and interfaces for learning; personalized and adaptive learning systems; tools for formative and summative assessment; tools for learning analytics and educational data mining; ontologies for learning systems; standards and web services that support learning; authoring tools for learning materials; computer support for peer tutoring; learning via computer-mediated inquiry, field, and lab work; social learning techniques; social networks and infrastructures for learning and knowledge sharing; and creation and management of learning objects.