Andrew B Speer, Angie Y Delacruz, Lauren J Wegmeyer, James Perrotta
Journal of Applied Psychology, pp. 456-467. Published 2024-03-01 (Epub 2023-10-12). DOI: 10.1037/apl0001146
Meta-analytical estimates of interrater reliability for direct supervisor performance ratings: Optimism under optimal measurement designs.
Performance appraisal (PA) is used for various organizational purposes and is vital to human resources practices. Despite this, current estimates of PA reliability are low, leading to decades of criticism regarding the use of PA in organizational contexts. In this article, we argue that current meta-analytical interrater reliability (IRR) coefficients are underestimates and do not reflect the reliability of interest to most practitioners and researchers: the reliability of an employee's direct supervisor. To establish the reliability of direct supervisor ratings, those making PA ratings must directly supervise employee job performance, rather than relying on nonparallel rater designs (e.g., direct supervisor ratings correlated with ratings from a more senior leader). The current meta-analysis identified 22 independent samples that met this more restrictive study inclusion criterion, finding an average observed IRR of .65. We also report reliability estimates for several important contextual moderators, including whether ratings were completed in operational settings (.60) or for research purposes (.67). In sum, we argue that this study's meta-analytical IRR estimates are the best available estimates of direct supervisor reliability and should be used to guide future research and practice. (PsycInfo Database Record (c) 2024 APA, all rights reserved).
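A single-rater IRR of .65 implies notably higher reliability when ratings from several parallel supervisors are averaged. As a minimal illustration of this standard psychometric step (the Spearman-Brown prophecy formula; the function name and numeric example are illustrative, not taken from the article):

```python
def spearman_brown(single_rater_r: float, k: int) -> float:
    """Project the reliability of the mean of k parallel raters
    from a single-rater interrater reliability coefficient."""
    return k * single_rater_r / (1 + (k - 1) * single_rater_r)

# Using the meta-analytic single-rater IRR of .65 reported above:
print(round(spearman_brown(0.65, 1), 3))  # 0.65
print(round(spearman_brown(0.65, 2), 3))  # 0.788
```

This is why reliability estimates tied to the rater design matter: the same formula applied to an underestimated single-rater coefficient will understate the reliability achievable by aggregating raters.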
About the Journal:
The Journal of Applied Psychology® focuses on publishing original investigations that contribute new knowledge and understanding to fields of applied psychology (excluding clinical and applied experimental or human factors, which are better suited for other APA journals). The journal primarily considers empirical and theoretical investigations that enhance understanding of cognitive, motivational, affective, and behavioral psychological phenomena in work and organizational settings. These phenomena can occur at individual, group, organizational, or cultural levels, and in various work settings such as business, education, training, health, service, government, or military institutions. The journal welcomes submissions from both public and private sector organizations, for-profit or nonprofit. It publishes several types of articles, including:
1. Rigorously conducted empirical investigations that expand conceptual understanding (original investigations or meta-analyses).
2. Theory development articles and integrative conceptual reviews that synthesize literature and generate new theories on psychological phenomena to stimulate novel research.
3. Rigorously conducted qualitative research on phenomena that are challenging to capture with quantitative methods or require inductive theory building.