Authors: D. Abbakumov, P. Desmet, W. Van den Noortgate
Journal: Applied Measurement in Education, 33(1), pp. 113–123 (Q3, Education & Educational Research)
Published: 2020-03-03 (Journal Article)
DOI: 10.1080/08957347.2020.1732382
Rasch Model Extensions for Enhanced Formative Assessments in MOOCs
ABSTRACT Formative assessments are an important component of massive open online courses (MOOCs), online courses with open access and unlimited student participation. Drawing accurate conclusions about students’ proficiency from formative assessments, however, faces several challenges: (a) students are typically allowed to make several attempts; and (b) student performance might be affected by other variables, such as interest. Thus, neglecting the effects of attempts and interest in proficiency evaluation might lead to biased conclusions. In this study, we address this limitation and propose two extensions of a common psychometric model, the Rasch model, that include the effects of attempts and interest. We illustrate these extensions using real MOOC data and evaluate them using cross-validation. We found that (a) the effects of attempts and interest on performance are positive on average, but both vary among students; (b) part of the variance in proficiency parameters is due to between-student variation in the effect of interest; and (c) the overall accuracy of predicting students’ item responses using the extensions is 4.3% higher than using the Rasch model.
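The Rasch model referenced in the abstract predicts the probability of a correct response from the difference between a student's proficiency and an item's difficulty on a logit scale. The sketch below shows the standard Rasch probability and a hypothetical extension in the spirit the abstract describes, with additive attempt and interest effects on the logit; the parameter names (`delta_attempt`, `gamma_interest`) and the exact additive form are illustrative assumptions, not the paper's actual specification.

```python
import math

def rasch_prob(theta: float, b: float) -> float:
    """Standard Rasch model: P(correct) = logistic(theta - b),
    where theta is student proficiency and b is item difficulty."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def extended_prob(theta: float, b: float,
                  delta_attempt: float, attempt: int,
                  gamma_interest: float, interest: float) -> float:
    """Hypothetical extension (illustrative only): the logit is shifted by
    an attempt effect (delta_attempt per extra attempt beyond the first)
    and an interest effect (gamma_interest scaled by the student's interest).
    In the paper these effects are estimated and vary across students."""
    logit = (theta - b) + delta_attempt * (attempt - 1) + gamma_interest * interest
    return 1.0 / (1.0 + math.exp(-logit))
```

With equal proficiency and difficulty, the base model gives a 0.5 chance of success; in the extended sketch, additional attempts or higher interest (with positive effects, as the study reports on average) push that probability upward.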
Journal Introduction:
Because interaction between the domains of research and application is critical to the evaluation and improvement of new educational measurement practices, Applied Measurement in Education's prime objective is to improve communication between academicians and practitioners. To help bridge the gap between theory and practice, articles in this journal describe original research studies, innovative strategies for solving educational measurement problems, and integrative reviews of current approaches to contemporary measurement issues. Peer Review Policy: All review papers in this journal have undergone editorial screening and peer review.