{"title":"Modelling student behavior in algorithm simulation exercises with code mutation","authors":"O. Seppälä","doi":"10.1145/1315803.1315822","DOIUrl":null,"url":null,"abstract":"Visual algorithm simulation exercises test student knowledge of different algorithms by making them trace the steps of how a given algorithm would have manipulated a set of input data. When assessing such exercises the main difference between a human assessor and an automated assessment procedure is the human ability to adapt to the possible errors made by the student. A human assessor can continue past the point where the model solution and the student solution deviate and make a hypothesis on the source of the error based on the student's answer. Our goal is to bring some of that ability to automated assessment. We anticipate that providing better feedback on student errors might help reduce persistent misconceptions.\n The method described tries to automatically recreate erroneous student behavior by introducing a set of code mutations on the original algorithm code. The available mutations correspond to different careless errors and misconceptions held by the student.\n The results show that such automatically generated \"misconceived\" algorithms can explain much of the student behavior found in erroneous solutions to the exercise. Non-systematic mutations can also be used to simulate slips which greatly reduces the number of erroneous solutions without explanations.","PeriodicalId":135065,"journal":{"name":"Baltic Sea '06","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2006-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Baltic Sea '06","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/1315803.1315822","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
Visual algorithm simulation exercises test students' knowledge of different algorithms by having them trace the steps a given algorithm would take on a set of input data. When assessing such exercises, the main difference between a human assessor and an automated assessment procedure is the human's ability to adapt to the errors a student may make. A human assessor can continue past the point where the model solution and the student's solution diverge and form a hypothesis about the source of the error based on the student's answer. Our goal is to bring some of that ability to automated assessment. We anticipate that providing better feedback on student errors might help reduce persistent misconceptions.
The method described here attempts to automatically recreate erroneous student behavior by introducing a set of code mutations into the original algorithm code. The available mutations correspond to different careless errors and misconceptions held by students.
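As a rough illustration of this idea, and not the paper's actual implementation, the sketch below mutates the comparison used by an insertion sort and checks which mutant reproduces a student's step-by-step trace. The algorithm choice, the mutation set, and the names insertion_sort, MUTATIONS, and explain are all hypothetical.

```python
# Hypothetical sketch: model a misconception by mutating an algorithm
# and comparing its trace with the student's simulated answer.

def insertion_sort(data, compare):
    """Insertion sort that records the array state after every pass,
    so its trace can be compared with a student's simulated steps."""
    a = list(data)
    trace = [tuple(a)]
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and compare(a[j], key):
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
        trace.append(tuple(a))
    return trace

# Each "mutation" swaps in a different comparison, standing in for a
# careless error or misconception (e.g. sorting in the wrong direction).
MUTATIONS = {
    "correct":            lambda x, key: x > key,
    "reversed_order":     lambda x, key: x < key,   # sorts descending
    "off_by_one_compare": lambda x, key: x >= key,  # shifts equal keys too
}

def explain(student_trace, data):
    """Return the names of mutations whose traces match the student's answer."""
    return [name for name, cmp in MUTATIONS.items()
            if insertion_sort(data, cmp) == student_trace]

# Example: a student who sorted in descending order is "explained"
# by the reversed_order mutant.
data = [3, 1, 2]
student = insertion_sort(data, MUTATIONS["reversed_order"])
print(explain(student, data))   # ['reversed_order']
```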
The results show that such automatically generated "misconceived" algorithms can explain much of the student behavior found in erroneous solutions to the exercise. Non-systematic mutations can also be used to simulate slips, which greatly reduces the number of erroneous solutions left without an explanation.
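A non-systematic mutation can be sketched in the same spirit: instead of altering every comparison, the erroneous behavior is triggered on a single pass only, modelling a momentary slip rather than a held misconception. The function insertion_sort_with_slip and its slip_at parameter below are illustrative assumptions, not the paper's code.

```python
# Hypothetical sketch: a "slip" flips the comparison on one pass only.

def insertion_sort_with_slip(data, slip_at=None):
    """Insertion sort whose comparison is negated only during pass `slip_at`."""
    a = list(data)
    trace = [tuple(a)]
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            greater = a[j] > key
            if i == slip_at:          # one-off slip: comparison misread
                greater = not greater
            if not greater:
                break
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
        trace.append(tuple(a))
    return trace

# Trying a slip at every pass: if some variant reproduces the student's
# answer, the erroneous solution gains an explanation.
data = [4, 2, 3, 1]
student = insertion_sort_with_slip(data, slip_at=2)
matches = [k for k in range(1, len(data))
           if insertion_sort_with_slip(data, slip_at=k) == student]
print(matches)   # [2]  -- a slip on pass 2 explains the answer
```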