Title: Knowledge Tracing Through Enhanced Questions and Directed Learning Interaction Based on Multigraph Embeddings in Intelligent Tutoring Systems
Authors: Liqing Qiu; Lulu Wang
Journal: IEEE Transactions on Education, vol. 68, no. 1, pp. 43-56 (JCR Q2, Education, Scientific Disciplines; Impact Factor 2.1)
DOI: 10.1109/TE.2024.3448532
Publication date: 2024-09-05
URL: https://ieeexplore.ieee.org/document/10666281/
Citations: 0
Abstract
In recent years, knowledge tracing (KT) within intelligent tutoring systems (ITSs) has seen rapid development. KT aims to assess a student’s knowledge state from past performance and predict whether the student will answer the next question correctly. Traditional KT often assigns identical representations to questions on the same concept that differ in difficulty, limiting the effectiveness of question embedding. Additionally, higher-order semantic relationships between questions are overlooked. Graph models have been employed in KT to enhance question embedding representation, but they rarely consider the directed relationships between learning interactions. Therefore, this article introduces a novel approach, KT through Enhanced Questions and Directed Learning Interaction Based on multigraph embeddings in ITSs (MGEKT), to address these limitations. One channel enhances question embedding representation by capturing relationships among students, concepts, and questions; it defines two meta paths that facilitate learning high-order semantic relationships between questions. The other channel constructs a directed graph of learning interactions, leveraging graph attention convolution to model their intricate relationships. A new gating mechanism is proposed to capture long-term dependencies and emphasize critical information when tracing students’ knowledge states. Notably, MGEKT employs reverse knowledge distillation, transferring knowledge from two small models (student models) to a large model (teacher model). This distillation enhances the model’s generalization performance and improves its perception of crucial information. In comparative evaluations across four datasets, MGEKT outperformed baselines, demonstrating its effectiveness in KT.
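The reverse knowledge distillation described above transfers knowledge from two small student models to a large teacher model. The abstract does not give the loss formulation, so the following is only an illustrative sketch under common assumptions: each model emits logits over {incorrect, correct} for the next question, the two small models' temperature-softened distributions are averaged, and the large model is trained toward that average via a KL-divergence term. Function names, the averaging scheme, and the temperature scaling are assumptions, not the paper's implementation.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax; higher T flattens the distribution."""
    z = np.asarray(logits, dtype=float) / temperature
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def reverse_kd_loss(large_logits, small_logits_a, small_logits_b, temperature=2.0):
    """KL(avg of two small-model distributions || large-model distribution).

    Hypothetical distillation term: the large (teacher) model is pulled
    toward the averaged soft targets of the two small (student) models.
    The T^2 factor is the usual gradient-scale correction in distillation.
    """
    p = 0.5 * (softmax(small_logits_a, temperature)
               + softmax(small_logits_b, temperature))
    q = softmax(large_logits, temperature)
    eps = 1e-12  # avoid log(0)
    kl = np.sum(p * (np.log(p + eps) - np.log(q + eps)))
    return float(kl * temperature * temperature)
```

When the large model's logits already match both small models', the loss is zero; it grows as the large model's distribution drifts away from the averaged soft targets, which is the sense in which knowledge flows "in reverse" from small to large.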
Journal description:
The IEEE Transactions on Education (ToE) publishes significant and original scholarly contributions to education in electrical and electronics engineering, computer engineering, computer science, and other fields within the scope of interest of IEEE. Contributions must address discovery, integration, and/or application of knowledge in education in these fields. Articles must support contributions and assertions with compelling evidence and provide explicit, transparent descriptions of the processes through which the evidence is collected, analyzed, and interpreted. While characteristics of compelling evidence cannot be described to address every conceivable situation, generally assessment of the work being reported must go beyond student self-report and attitudinal data.