{"title":"Rater behaviors in peer evaluation: Patterns and early detection with learner model","authors":"Changhao Liang, Izumi Horikoshi, Rwitajit Majumdar, Hiroaki Ogata","doi":"10.58459/rptel.2025.20012","DOIUrl":null,"url":null,"abstract":"Peer evaluation is a common practice in team-based learning (TBL) designs, which can cover the assessment of individual or group work. However, the integrity of peer evaluation can be compromised by unserious raters—individuals who do not earnestly engage in the evaluation process. These raters may exhibit behaviors like consistently assigning the same score, rushing through evaluations, or evaluating before or long after the target presentations. This study delves into the issue of unserious peer evaluation in group presentations, with a specific focus on understanding the behavior patterns in the digital system. Utilizing evaluation behavior analysis (EBA) indicators, we identify patterns linked to unserious raters during the peer evaluation process. Meanwhile, we also connect these patterns to rating consistency and actual course performance, underscoring the significance of behavior patterns. Further, we conduct a preliminary analysis to explore the application of learner model data available before the peer evaluation starts for the early detection of unserious raters. This finding can assist teachers in providing personalized prompts and interventions before the peer evaluation stage, hence enhancing the evaluation quality through targeted interventions in a timely manner.","PeriodicalId":37055,"journal":{"name":"Research and Practice in Technology Enhanced Learning","volume":null,"pages":null},"PeriodicalIF":3.1000,"publicationDate":"2024-06-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Research and Practice in Technology Enhanced Learning","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.58459/rptel.2025.20012","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Abstract
Peer evaluation is a common practice in team-based learning (TBL) designs and can cover the assessment of individual or group work. However, the integrity of peer evaluation can be compromised by unserious raters: individuals who do not earnestly engage in the evaluation process. Such raters may, for example, assign the same score to every target, rush through evaluations, or submit ratings before or long after the target presentations. This study examines unserious peer evaluation of group presentations, focusing on the behavior patterns these raters leave in the digital evaluation system. Using evaluation behavior analysis (EBA) indicators, we identify patterns linked to unserious raters during the peer evaluation process and connect these patterns to rating consistency and actual course performance, underscoring the significance of behavior patterns. We further conduct a preliminary analysis of learner model data available before peer evaluation begins to explore the early detection of unserious raters. These findings can help teachers provide personalized prompts and interventions before the peer evaluation stage, enhancing evaluation quality through timely, targeted support.
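As an illustration only, and not the authors' EBA implementation, the sketch below shows how behavior indicators of the kind described above (score spread, time spent per evaluation, and submission timing relative to the presentation) might be computed from a peer-evaluation log. All field, class, and function names here are hypothetical assumptions introduced for the example.

```python
"""Minimal sketch of rater-behavior indicators from a hypothetical rating log."""
from dataclasses import dataclass
from datetime import datetime
from statistics import pstdev
from typing import Dict, List


@dataclass
class Rating:
    rater_id: str
    score: int                  # score given to the target group (hypothetical field)
    duration_sec: float         # time spent on the rating form (hypothetical field)
    submitted_at: datetime      # when the rating was submitted (hypothetical field)
    presentation_end: datetime  # when the rated presentation ended (hypothetical field)


def behavior_indicators(ratings: List[Rating]) -> Dict[str, float]:
    """Compute simple indicators for one rater's set of ratings."""
    if not ratings:
        return {}
    scores = [r.score for r in ratings]
    return {
        # Near-zero spread suggests the same score was assigned to every target.
        "score_spread": pstdev(scores) if len(scores) > 1 else 0.0,
        # A very short average form time suggests rushing through evaluations.
        "mean_duration_sec": sum(r.duration_sec for r in ratings) / len(ratings),
        # Large offsets suggest rating before or long after the target presentation.
        "mean_abs_offset_min": sum(
            abs((r.submitted_at - r.presentation_end).total_seconds()) / 60
            for r in ratings
        ) / len(ratings),
    }
```

In practice, indicators like these would be computed per rater and compared against class-level thresholds or, as in the early-detection analysis described above, combined with learner model data collected before the peer evaluation starts.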