{"title":"Clustering Student Programming Assignments to Multiply Instructor Leverage","authors":"Hezheng Yin, J. Moghadam, A. Fox","doi":"10.1145/2724660.2728695","DOIUrl":null,"url":null,"abstract":"A challenge in introductory and intermediate programming courses is understanding how students approached solving a particular programming problem, in order to provide feedback on how they might improve. In both Massive Open Online Courses (MOOCs) and large residential courses, such feedback is difficult to provide for each student individually. To multiply the instructor's leverage, we would like to group student submissions according to the general problem-solving strategy they used, as the first stage of a ``feedback pipeline''. We describe ongoing explorations of a variety of clustering algorithms and similarity metrics using a corpus of over 800 student submissions to a simple programming assignment from a programming MOOC. We find that for a majority of submissions, it is possible to automatically create clusters such that an instructor ``eyeballing'' some representative submissions from each cluster can readily describe qualitatively what the common elements are in student submissions in that cluster. This information can be the basis for feedback to the students or for comparing one group of students' approach with another's.","PeriodicalId":20664,"journal":{"name":"Proceedings of the Second (2015) ACM Conference on Learning @ Scale","volume":"10 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2015-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"21","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the Second (2015) ACM Conference on Learning @ Scale","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2724660.2728695","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 21
Abstract
A challenge in introductory and intermediate programming courses is understanding how students approached solving a particular programming problem, in order to provide feedback on how they might improve. In both Massive Open Online Courses (MOOCs) and large residential courses, such feedback is difficult to provide for each student individually. To multiply the instructor's leverage, we would like to group student submissions according to the general problem-solving strategy they used, as the first stage of a "feedback pipeline". We describe ongoing explorations of a variety of clustering algorithms and similarity metrics using a corpus of over 800 student submissions to a simple programming assignment from a programming MOOC. We find that for a majority of submissions, it is possible to automatically create clusters such that an instructor "eyeballing" some representative submissions from each cluster can readily describe qualitatively what the common elements are in student submissions in that cluster. This information can be the basis for feedback to the students or for comparing one group of students' approach with another's.
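The abstract does not say which similarity metrics or clustering algorithms the authors ultimately used, so the following is only a minimal sketch of the kind of pipeline it describes: reduce each submission to a token set, compute a pairwise distance, and group submissions by hierarchical clustering. The token-set Jaccard distance, the average-linkage choice, the 0.4 distance threshold, and the helper names (tokens, jaccard_distance, cluster_submissions) are all illustrative assumptions, not details from the paper.

import re
from itertools import combinations

import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform


def tokens(source: str) -> set[str]:
    """Reduce a submission to its set of identifier/keyword tokens."""
    return set(re.findall(r"[A-Za-z_]\w*", source))


def jaccard_distance(a: set[str], b: set[str]) -> float:
    """1 - |A intersect B| / |A union B|; 0.0 means identical token sets."""
    union = a | b
    if not union:
        return 0.0
    return 1.0 - len(a & b) / len(union)


def cluster_submissions(sources: list[str], threshold: float = 0.4) -> list[int]:
    """Assign each submission a cluster label via average-linkage
    hierarchical clustering over pairwise token-set distances."""
    toks = [tokens(s) for s in sources]
    n = len(sources)
    dist = np.zeros((n, n))
    for i, j in combinations(range(n), 2):
        dist[i, j] = dist[j, i] = jaccard_distance(toks[i], toks[j])
    # scipy's linkage expects the condensed (upper-triangular) form.
    z = linkage(squareform(dist), method="average")
    return fcluster(z, t=threshold, criterion="distance").tolist()


if __name__ == "__main__":
    # Two iterative solutions and one built-in-based solution to "sum a list":
    # the first two should land in one cluster, the third in another.
    demo = [
        "def f(xs):\n    total = 0\n    for x in xs:\n        total += x\n    return total",
        "def f(nums):\n    s = 0\n    for n in nums:\n        s = s + n\n    return s",
        "def f(xs):\n    return sum(xs)",
    ]
    print(cluster_submissions(demo))

A token-set distance is deliberately crude; it ignores program structure, so approaches like the paper's (clustering by problem-solving strategy) would plausibly use richer representations such as AST-based similarity, with the instructor then "eyeballing" representatives of each resulting cluster.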