{"title":"Assessment management in large courses","authors":"J. Greenslade","doi":"10.1109/FIE.2009.5350575","DOIUrl":null,"url":null,"abstract":"When large engineering courses assess work marked by several markers, issues of consistency often arise. Data was collected from a first year Mathematical Modeling course of 467 students, where 18 markers were employed to mark the 10 weekly assignments. Control scripts were also marked by the lecturer and compared with the results from each marker. An acceptable level of variability was established by correlating the assignment data with final examination marks for each student group. The resulting thresholds will, in future, be used to quickly identify markers who may need additional training or guidance. Lecturers were also asked to mark a small number of student scripts to act as exemplars for the markers. This approach gave the lecturer a previously unavailable opportunity to trial and modify the marking schedule before it was given to the markers. The results of this study have lead to improvements in the way markers are supported and monitored within the local engineering department, but could also be of relevance to other departments and disciplines. The engagement of lecturers in creating exemplar marked assignments has also proved an excellent opportunity to increase the engagement of teaching staff with the assessment process and hence the student learning experience.","PeriodicalId":129330,"journal":{"name":"2009 39th IEEE Frontiers in Education Conference","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2009-10-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2009 39th IEEE Frontiers in Education Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/FIE.2009.5350575","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
When large engineering courses assess work marked by several markers, issues of consistency often arise. Data were collected from a first-year Mathematical Modeling course of 467 students, in which 18 markers were employed to mark the 10 weekly assignments. Control scripts were also marked by the lecturer and compared with the results from each marker. An acceptable level of variability was established by correlating the assignment data with final examination marks for each student group. The resulting thresholds will, in future, be used to quickly identify markers who may need additional training or guidance. Lecturers were also asked to mark a small number of student scripts to act as exemplars for the markers. This approach gave the lecturer a previously unavailable opportunity to trial and modify the marking schedule before it was given to the markers. The results of this study have led to improvements in the way markers are supported and monitored within the local engineering department, but could also be of relevance to other departments and disciplines. Involving lecturers in creating exemplar marked assignments has also proved an excellent opportunity to increase the engagement of teaching staff with the assessment process and, through it, with the student learning experience.
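The abstract does not detail how the correlation and thresholds were computed. As a rough illustration only, the sketch below correlates each marker's assignment totals with the final examination marks of the same student group and flags markers falling below an assumed threshold; all names, data values, and the cut-off are invented and are not taken from the paper.

```python
# Hypothetical sketch: flag markers whose assignment marks correlate weakly
# with their students' final examination marks. Data values are illustrative.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# marker id -> (assignment totals, final exam marks) for that marker's group.
marker_data = {
    "marker_01": ([72, 65, 80, 58, 90], [70, 60, 85, 55, 88]),
    "marker_02": ([75, 74, 76, 73, 77], [40, 85, 62, 91, 55]),
}

CORRELATION_THRESHOLD = 0.5  # assumed acceptable-variability cut-off

for marker, (assignments, exams) in marker_data.items():
    r = pearson(assignments, exams)
    status = "ok" if r >= CORRELATION_THRESHOLD else "review: may need guidance"
    print(f"{marker}: r = {r:.2f} ({status})")
```

In this illustration, a marker whose scores barely vary (or vary independently of exam performance) yields a low correlation and would be flagged for additional training or guidance, which mirrors the monitoring purpose described in the abstract.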