Alyssa Shi, Brooke Bier, Carrigan Price, Luke Schwartz, Devan Wainright, Audra Whithaus, Alison Abritis, Ivan Oransky, Misha Angrist
{"title":"收回:关于衡量撤回通知质量的评分标准的试点研究。","authors":"Alyssa Shi, Brooke Bier, Carrigan Price, Luke Schwartz, Devan Wainright, Audra Whithaus, Alison Abritis, Ivan Oransky, Misha Angrist","doi":"10.1080/08989621.2024.2366281","DOIUrl":null,"url":null,"abstract":"<p><p>The frequency of scientific retractions has grown substantially in recent years. However, thus far there is no standardized retraction notice format to which journals and their publishers adhere voluntarily, let alone compulsorily. We developed a rubric specifying seven criteria in order to judge whether retraction notices are easily and freely accessible, informative, and transparent. We mined the Retraction Watch database and evaluated a total of 768 retraction notices from two publishers (Springer and Wiley) over three years (2010, 2015, and 2020). Per our rubric, both publishers tended to score higher on measures of openness/availability, accessibility, and clarity as to why a paper was retracted than they did in: acknowledging institutional investigations; confirming whether there was consensus among authors; and specifying which parts of any given paper warranted retraction. Springer retraction notices appeared to improve over time with respect to the rubric's seven criteria. 
We observed some discrepancies among raters, indicating the difficulty in developing a robust objective rubric for evaluating retraction notices.</p>","PeriodicalId":50927,"journal":{"name":"Accountability in Research-Policies and Quality Assurance","volume":" ","pages":"1-12"},"PeriodicalIF":2.8000,"publicationDate":"2024-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Taking it back: A pilot study of a rubric measuring retraction notice quality.\",\"authors\":\"Alyssa Shi, Brooke Bier, Carrigan Price, Luke Schwartz, Devan Wainright, Audra Whithaus, Alison Abritis, Ivan Oransky, Misha Angrist\",\"doi\":\"10.1080/08989621.2024.2366281\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>The frequency of scientific retractions has grown substantially in recent years. However, thus far there is no standardized retraction notice format to which journals and their publishers adhere voluntarily, let alone compulsorily. We developed a rubric specifying seven criteria in order to judge whether retraction notices are easily and freely accessible, informative, and transparent. We mined the Retraction Watch database and evaluated a total of 768 retraction notices from two publishers (Springer and Wiley) over three years (2010, 2015, and 2020). Per our rubric, both publishers tended to score higher on measures of openness/availability, accessibility, and clarity as to why a paper was retracted than they did in: acknowledging institutional investigations; confirming whether there was consensus among authors; and specifying which parts of any given paper warranted retraction. Springer retraction notices appeared to improve over time with respect to the rubric's seven criteria. 
We observed some discrepancies among raters, indicating the difficulty in developing a robust objective rubric for evaluating retraction notices.</p>\",\"PeriodicalId\":50927,\"journal\":{\"name\":\"Accountability in Research-Policies and Quality Assurance\",\"volume\":\" \",\"pages\":\"1-12\"},\"PeriodicalIF\":2.8000,\"publicationDate\":\"2024-06-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Accountability in Research-Policies and Quality Assurance\",\"FirstCategoryId\":\"98\",\"ListUrlMain\":\"https://doi.org/10.1080/08989621.2024.2366281\",\"RegionNum\":1,\"RegionCategory\":\"哲学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MEDICAL ETHICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Accountability in Research-Policies and Quality Assurance","FirstCategoryId":"98","ListUrlMain":"https://doi.org/10.1080/08989621.2024.2366281","RegionNum":1,"RegionCategory":"哲学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MEDICAL ETHICS","Score":null,"Total":0}
Taking it back: A pilot study of a rubric measuring retraction notice quality.
The frequency of scientific retractions has grown substantially in recent years. However, there is thus far no standardized retraction notice format to which journals and their publishers adhere voluntarily, let alone compulsorily. We developed a rubric specifying seven criteria to judge whether retraction notices are easily and freely accessible, informative, and transparent. We mined the Retraction Watch database and evaluated a total of 768 retraction notices from two publishers (Springer and Wiley) across three years (2010, 2015, and 2020). Per our rubric, both publishers tended to score higher on openness/availability, accessibility, and clarity about why a paper was retracted than on acknowledging institutional investigations, confirming whether there was consensus among authors, and specifying which parts of a given paper warranted retraction. Springer retraction notices appeared to improve over time with respect to the rubric's seven criteria. We observed some discrepancies among raters, indicating the difficulty of developing a robust, objective rubric for evaluating retraction notices.
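The abstract does not give the rubric's actual items, scoring scale, or agreement statistic, so the following is only a hypothetical sketch of how a seven-criterion rubric score and a simple percent-agreement measure between raters might be computed. All criterion names, the binary (satisfied/not satisfied) scoring, and the agreement measure are assumptions for illustration, not the study's actual method.

```python
# Hypothetical sketch: scoring a retraction notice against a
# seven-criterion rubric and measuring agreement between two raters.
# Criterion names and 0/1 scoring are assumptions, not the rubric
# actually used in the study.

CRITERIA = [
    "freely_accessible",
    "reason_stated",
    "institutional_investigation_acknowledged",
    "author_consensus_confirmed",
    "retracted_parts_specified",
    "openness",
    "clarity",
]

def rubric_score(notice: dict) -> int:
    """Total score: one point per rubric criterion the notice satisfies."""
    return sum(1 for c in CRITERIA if notice.get(c, False))

def percent_agreement(rater_a: list, rater_b: list) -> float:
    """Fraction of items on which two raters gave the same rating."""
    assert len(rater_a) == len(rater_b)
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Example: a notice that is accessible and clear but omits the other details.
notice = {"freely_accessible": True, "reason_stated": True, "clarity": True}
print(rubric_score(notice))                            # 3
print(percent_agreement([1, 0, 1, 1], [1, 1, 1, 0]))   # 0.5
```

Percent agreement is the simplest possible measure; a real study would more likely report a chance-corrected statistic such as Cohen's kappa.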
About the journal:
Accountability in Research: Policies and Quality Assurance is devoted to the examination and critical analysis of systems for maximizing integrity in the conduct of research. It provides an interdisciplinary, international forum for the development of ethics, procedures, standards, policies, and concepts to encourage the ethical conduct of research and to enhance the validity of research results.
The journal welcomes views on advancing the integrity of research in the fields of general and multidisciplinary sciences, medicine, law, economics, statistics, management studies, public policy, politics, sociology, history, psychology, philosophy, ethics, and information science.
All submitted manuscripts are subject to initial appraisal by the Editor, and if found suitable for further consideration, to peer review by independent, anonymous expert referees.