LGS-KT: Integrating logical and grammatical skills for effective programming knowledge tracing
Xinjie Sun, Qi Liu, Kai Zhang, Shuanghong Shen, Yan Zhuang, Yuxiang Guo
Neural Networks, Volume 185, Article 107164 (published 2025-01-18). DOI: 10.1016/j.neunet.2025.107164
Citations: 0
Abstract
Knowledge tracing (KT) estimates students’ mastery of knowledge concepts or skills by analyzing their historical interactions. Although general KT methods effectively assess students’ knowledge states, measurements specific to students’ programming skills remain insufficient: existing studies rely mainly on exercise outcomes and do not fully exploit the behavioral data generated during the programming process. We therefore propose a Logical and Grammar Skills Knowledge Tracing (LGS-KT) model to enhance programming education. The model combines static analysis with dynamic monitoring (e.g., CPU and memory consumption) to evaluate code elements, providing a thorough assessment of code quality. By analyzing students’ multiple iterations on the same programming problem, we construct a reweighted logical skill evolution graph to assess the development of students’ logical skills. Additionally, to strengthen interactions among representations with similar grammatical skills, we build a grammatical skills interaction graph based on the similarity of knowledge concepts, which significantly improves the accuracy of inferring students’ programming grammatical skill states. LGS-KT demonstrates superior performance in predicting student outcomes, highlighting the potential of a KT model that integrates logical and grammatical skills in programming exercises. To support reproducible research, we have published the data and code at https://github.com/xinjiesun-ustc/LGS-KT, encouraging further innovation in this field.
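The authors' actual implementation is available in the linked repository; as an illustration only, the sketch below shows one plausible way to build an exercise-to-exercise interaction graph from knowledge-concept similarity, in the spirit of the grammatical skills interaction graph described above. The function name `build_interaction_graph`, the cosine-similarity measure, and the 0.5 threshold are assumptions for this example, not details taken from the paper.

```python
# Hypothetical sketch: link exercises whose knowledge-concept annotations are
# sufficiently similar, yielding a weighted adjacency matrix that a graph-based
# KT model could consume. Not the authors' implementation.
import numpy as np


def build_interaction_graph(concept_matrix: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """concept_matrix: (num_exercises, num_concepts) binary or weighted concept tags.
    Returns a symmetric adjacency matrix keeping cosine similarities above `threshold`."""
    norms = np.linalg.norm(concept_matrix, axis=1, keepdims=True)
    normalized = concept_matrix / np.clip(norms, 1e-8, None)
    similarity = normalized @ normalized.T          # pairwise cosine similarity
    adjacency = np.where(similarity >= threshold, similarity, 0.0)
    np.fill_diagonal(adjacency, 0.0)                # drop self-loops
    return adjacency


if __name__ == "__main__":
    # Toy example: 4 exercises tagged with 3 grammar-related concepts.
    concepts = np.array([[1, 0, 1],
                         [1, 0, 0],
                         [0, 1, 1],
                         [0, 1, 0]], dtype=float)
    print(build_interaction_graph(concepts))
```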
Journal Introduction:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.