{"title":"The intersections between DORA, open research and equity","authors":"Stephen Curry","doi":"10.2218/eorc.2022.7051","DOIUrl":null,"url":null,"abstract":"The San Francisco Declaration on Research Assessment (DORA) is a campaigning initiative to improve the ways that we evaluate research and researchers. It aims particularly to help people understand the problems associated with over-reliance on aggregate metrics like the journal impact factor or H-index in assessment processes. Such metrics have enduring appeal because they appear to offer the simplicity and objectivity of numerical analyses. However, we need to be mindful of the subjective nature of decisions that lead to citation counts – the raw material of many performance metrics – and the biases that perturb them. The challenge now, if we are to move to more robust and equitable forms of evaluation, is to ensure that these are as effective and as efficient as possible. Embracing this challenge will also help to clear the way for more open and impactful science, and for a more inclusive academy.","PeriodicalId":244254,"journal":{"name":"Edinburgh Open Research","volume":"3 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-07-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Edinburgh Open Research","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2218/eorc.2022.7051","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The San Francisco Declaration on Research Assessment (DORA) is a campaigning initiative to improve the ways that we evaluate research and researchers. It aims particularly to help people understand the problems associated with over-reliance on aggregate metrics like the journal impact factor or H-index in assessment processes. Such metrics have enduring appeal because they appear to offer the simplicity and objectivity of numerical analyses. However, we need to be mindful of the subjective nature of decisions that lead to citation counts – the raw material of many performance metrics – and the biases that perturb them. The challenge now, if we are to move to more robust and equitable forms of evaluation, is to ensure that these are as effective and as efficient as possible. Embracing this challenge will also help to clear the way for more open and impactful science, and for a more inclusive academy.