Maarten van der Velde, Florian Sense, Jelmer P. Borst, Hedderik van Rijn
User Modeling and User-Adapted Interaction · DOI: 10.1007/s11257-024-09401-5 · Published 2024-06-21
Large-scale evaluation of cold-start mitigation in adaptive fact learning: Knowing “what” matters more than knowing “who”
Adaptive learning systems offer a personalised digital environment that continually adjusts to the learner and the material, with the goal of maximising learning gains. Whenever such a system encounters a new learner, or when a returning learner starts studying new material, the system first has to determine the difficulty of the material for that specific learner. Failing to address this “cold-start” problem leads to suboptimal learning and potential disengagement from the system, as the system may present problems of an inappropriate difficulty or provide unhelpful feedback. In a simulation study conducted on a large educational data set from an adaptive fact learning system (about 100 million trials from almost 140 thousand learners), we predicted individual learning parameters from response data. Using these predicted parameters as starting estimates for the adaptive learning system yielded a more accurate model of learners’ memory performance than using default values. We found that predictions based on the difficulty of the fact (“what”) generally outperformed predictions based on the ability of the learner (“who”), though both contributed to better model estimates. This work extends a previous smaller-scale laboratory-based experiment in which using fact-specific predictions in a cold-start scenario improved learning outcomes. The current findings suggest that similar cold-start alleviation may be possible in real-world educational settings. The improved predictions can be harnessed to increase the efficiency of the learning system, mitigate the negative effects of a cold start, and potentially improve learning outcomes.
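The cold-start scheme described above can be illustrated with a minimal sketch. The fallback order (fact-based "what" prediction first, learner-based "who" prediction second, system default last), the simple mean-based predictors, and the 0.3 default are illustrative assumptions for this sketch, not the paper's actual model, which fits rate-of-forgetting parameters inside an adaptive spaced-repetition system:

```python
from statistics import mean

DEFAULT_RATE = 0.3  # placeholder system default; the real default is model-specific


def predict_cold_start(history, learner, fact, default=DEFAULT_RATE):
    """Predict a starting rate-of-forgetting for an unseen (learner, fact) pair.

    history: dict mapping (learner, fact) -> previously fitted rate.
    Prefers the fact-based mean ("what"), then the learner-based mean
    ("who"), then the system default (a true cold start).
    """
    # "What": how difficult this fact has been for other learners.
    fact_rates = [r for (l, f), r in history.items() if f == fact and l != learner]
    if fact_rates:
        return mean(fact_rates)
    # "Who": how this learner has performed on other facts.
    learner_rates = [r for (l, f), r in history.items() if l == learner]
    if learner_rates:
        return mean(learner_rates)
    return default


# Example with hypothetical fitted rates from earlier sessions.
history = {
    ("ann", "cat-chat"): 0.25,
    ("bob", "cat-chat"): 0.35,
    ("ann", "dog-chien"): 0.40,
}
print(predict_cold_start(history, "eve", "cat-chat"))   # fact-based ("what") mean
print(predict_cold_start(history, "ann", "owl-hibou"))  # learner-based ("who") mean
print(predict_cold_start(history, "eve", "owl-hibou"))  # no information: default
```

In this toy setup the fact-based estimate uses only other learners' history with the same fact, mirroring the paper's finding that "what" information is the stronger predictor when both are available.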
Journal description:
User Modeling and User-Adapted Interaction provides an interdisciplinary forum for the dissemination of novel and significant original research results about interactive computer systems that can adapt themselves to their users, and on the design, use, and evaluation of user models for adaptation. The journal publishes high-quality original papers from, e.g., the following areas: acquisition and formal representation of user models; conceptual models and user stereotypes for personalization; student modeling and adaptive learning; models of groups of users; user-model-driven personalised information discovery and retrieval; recommender systems; adaptive user interfaces and agents; adaptation for accessibility and inclusion; generic user modeling systems and tools; interoperability of user models; personalization in areas such as affective computing, ubiquitous and mobile computing, language-based interactions, multi-modal interactions, virtual and augmented reality, social media and the Web, human-robot interaction, and behaviour change interventions; personalized applications in specific domains; privacy, accountability, and security of information for personalization; responsible adaptation: fairness, accountability, explainability, transparency and control; and methods for the design and evaluation of user models and adaptive systems.