Recurrent Neural Network-FitNets: Improving Early Prediction of Student Performance by Time-Series Knowledge Distillation

IF 4.0 | CAS Tier 2 (Education) | JCR Q1 (Education & Educational Research) | Journal of Educational Computing Research | Pub Date: 2022-10-26 | DOI: 10.1177/07356331221129765
Ryusuke Murata, Fumiya Okubo, T. Minematsu, Yuta Taniguchi, Atsushi Shimada
{"title":"Recurrent Neural Network-FitNets: Improving Early Prediction of Student Performanceby Time-Series Knowledge Distillation","authors":"Ryusuke Murata, Fumiya Okubo, T. Minematsu, Yuta Taniguchi, Atsushi Shimada","doi":"10.1177/07356331221129765","DOIUrl":null,"url":null,"abstract":"This study helps improve the early prediction of student performance by RNN-FitNets, which applies knowledge distillation (KD) to the time series direction of the recurrent neural network (RNN) model. The RNN-FitNets replaces the teacher model in KD with “an RNN model with a long-term time-series in which the features during the entire course are inputted” and the student model in KD with “an RNN model with a short-term time-series in which only the features during the early stages are inputted.” As a result, the RNN model in the early stage was trained to output the same results as the more accurate RNN model in the later stages. The experiment compared RNN-FitNets with a normal RNN model on a dataset of 296 university students in total. The results showed that RNN-FitNets can improve early prediction. Moreover, the SHAP value was employed to explain the contribution of the input features to the prediction results by RNN-FitNets. It was shown that RNN-FitNets can consider the future effects of the input features from the early stages of the course.","PeriodicalId":47865,"journal":{"name":"Journal of Educational Computing Research","volume":"61 1","pages":"639 - 670"},"PeriodicalIF":4.0000,"publicationDate":"2022-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Educational Computing Research","FirstCategoryId":"95","ListUrlMain":"https://doi.org/10.1177/07356331221129765","RegionNum":2,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 2

Abstract

This study improves the early prediction of student performance with RNN-FitNets, which applies knowledge distillation (KD) along the time-series direction of a recurrent neural network (RNN) model. RNN-FitNets replaces the teacher model in KD with an RNN over a long-term time series, into which the features from the entire course are inputted, and the student model in KD with an RNN over a short-term time series, into which only the features from the early stages are inputted. As a result, the early-stage RNN model is trained to output the same results as the more accurate RNN model of the later stages. The experiment compared RNN-FitNets with a normal RNN model on a dataset of 296 university students. The results showed that RNN-FitNets can improve early prediction. Moreover, SHAP values were employed to explain the contribution of the input features to the predictions of RNN-FitNets, showing that RNN-FitNets can account for the future effects of the input features from the early stages of the course.
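To make the training scheme concrete, below is a minimal PyTorch sketch of logit-level time-series knowledge distillation: a teacher RNN that sees the full course and a student RNN that sees only the early weeks, with the student fit to both the true labels and the teacher's softened outputs. All module names, hyperparameters, and the GRU/logit-matching choices are illustrative assumptions rather than the authors' implementation (the original FitNets formulation also matches intermediate hidden representations).

```python
# Minimal sketch of time-series knowledge distillation, assuming PyTorch.
# Names and hyperparameters are illustrative, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GRUPredictor(nn.Module):
    """GRU-based performance predictor over weekly activity features."""
    def __init__(self, n_features: int, hidden_size: int = 64, n_classes: int = 2):
        super().__init__()
        self.rnn = nn.GRU(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, weeks, n_features); predict from the last hidden state
        _, h = self.rnn(x)
        return self.head(h[-1])

def distillation_loss(teacher: nn.Module, student: nn.Module,
                      x_full: torch.Tensor, x_early: torch.Tensor,
                      labels: torch.Tensor,
                      alpha: float = 0.5, temperature: float = 2.0) -> torch.Tensor:
    """Early-stage student mimics the full-course teacher's softened outputs."""
    with torch.no_grad():
        teacher_logits = teacher(x_full)   # teacher sees every week of the course
    student_logits = student(x_early)      # student sees only the early weeks
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2                   # standard temperature scaling
    return alpha * hard + (1 - alpha) * soft
```

For the attribution step, a hedged sketch using the shap library's GradientExplainer (the abstract does not specify which explainer the authors used) could look like:

```python
import shap  # the shap package; x_train and x_early are hypothetical tensors

background = x_train[:100]                        # background sample for expectations
explainer = shap.GradientExplainer(student, background)
shap_values = explainer.shap_values(x_early[:10])
# For a multi-class head, shap_values is a list with one array per class, each of
# shape (batch, weeks, n_features): per-week, per-feature contributions showing
# how early-stage inputs drive the predicted final performance.
```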
Source Journal
Journal of Educational Computing Research (Education & Educational Research)
CiteScore: 11.90 | Self-citation rate: 6.20% | Articles published: 69
Journal Introduction: The goal of this Journal is to provide an international scholarly publication forum for peer-reviewed interdisciplinary research into the applications, effects, and implications of computer-based education. The Journal features articles useful for practitioners and theorists alike. The terms "education" and "computing" are viewed broadly. "Education" refers to the use of computer-based technologies at all levels of the formal education system, business and industry, home-schooling, lifelong learning, and unintentional learning environments. "Computing" refers to all forms of computer applications and innovations, both hardware and software. For example, this could range from mobile and ubiquitous computing to immersive 3D simulations and games to computing-enhanced virtual learning environments.