C. Hundhausen, Phill Conrad, A. S. Carter, Olusola O. Adesope
{"title":"Assessing individual contributions to software engineering projects: a replication study","authors":"C. Hundhausen, Phill Conrad, A. S. Carter, Olusola O. Adesope","doi":"10.1080/08993408.2022.2071543","DOIUrl":null,"url":null,"abstract":"ABSTRACT Background and Context Assessing team members’ indivdiual contributions to software development projects poses a key problem for computing instructors. While instructors typically rely on subjective assessments, objective assessments could provide a more robust picture. To explore this possibility, In a 2020 paper, Buffardi presented a correlational analysis of objective metrics and subjective metrics in an advanced software engineering project course (n= 41 students and 10 teams), finding only two significant correlations. Objective To explore the robustness of Buffardi’s findings and gain further insight, we conducted a larger scale replication of the Buffardi study (n = 118 students and 25 teams) in three courses at three institutions. Method We collected the same data as in the Buffardi study and computed the same measures from those data. We replicated Buffardi’s exploratory, correlational and regression analyses of objective and subjective measures. Findings While replicating four of Buffardi’s five significant correlational findings and partially replicating the findings of Buffardi’s regression analyses, our results go beyond those of Buffardi by identifying eight additional significant correlations. Implications In contrast to Buffardi’s study, our larger scale study suggests that subjective and objective measures of individual performance in team software development projects can be fruitfully combined to provide consistent and complementary assessments of individual performance.","PeriodicalId":45844,"journal":{"name":"Computer Science Education","volume":"32 1","pages":"335 - 354"},"PeriodicalIF":3.0000,"publicationDate":"2022-05-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computer Science Education","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/08993408.2022.2071543","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 2
Abstract
Background and Context: Assessing team members' individual contributions to software development projects poses a key problem for computing instructors. While instructors typically rely on subjective assessments, objective assessments could provide a more robust picture. To explore this possibility, in a 2020 paper, Buffardi presented a correlational analysis of objective and subjective metrics in an advanced software engineering project course (n = 41 students, 10 teams), finding only two significant correlations.

Objective: To explore the robustness of Buffardi's findings and gain further insight, we conducted a larger-scale replication of the Buffardi study (n = 118 students, 25 teams) in three courses at three institutions.

Method: We collected the same data as in the Buffardi study and computed the same measures from those data. We replicated Buffardi's exploratory, correlational, and regression analyses of objective and subjective measures.

Findings: While replicating four of Buffardi's five significant correlational findings and partially replicating the findings of Buffardi's regression analyses, our results go beyond those of Buffardi by identifying eight additional significant correlations.

Implications: In contrast to Buffardi's study, our larger-scale study suggests that subjective and objective measures of individual performance in team software development projects can be fruitfully combined to provide consistent and complementary assessments of individual performance.
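As a rough illustration of the kind of correlational analysis the abstract describes, the sketch below correlates a hypothetical objective measure (per-student commit counts) with a hypothetical subjective measure (peer-evaluation scores) using a Pearson correlation. The metric names and values are assumptions chosen for illustration; they are not taken from the Buffardi study or from this replication.

```python
# Minimal sketch of a correlational analysis between one objective and one
# subjective measure of individual contribution. Data values are invented.
from scipy import stats

commit_counts = [12, 34, 7, 21, 18, 40, 5, 27]                # objective measure (hypothetical)
peer_eval_scores = [3.2, 4.5, 2.8, 4.0, 3.6, 4.8, 2.5, 4.1]   # subjective measure (hypothetical)

# Pearson correlation coefficient and two-tailed p-value
r, p = stats.pearsonr(commit_counts, peer_eval_scores)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```

In a study of this kind, the same computation would be repeated across each pair of objective and subjective measures, with regression models used to examine how well the objective measures jointly predict the subjective ones.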
About the journal:
Computer Science Education publishes high-quality papers with a specific focus on teaching and learning within the computing discipline. The journal seeks novel contributions that are accessible and of interest to researchers and practitioners alike. We invite work with learners of all ages and across both classroom and out-of-classroom learning contexts.