An evaluation of the preprints produced at the beginning of the 2022 mpox public health emergency.

Melanie Sterian, Anmol Samra, Kusala Pussegoda, Tricia Corrin, Mavra Qamar, Austyn Baumeister, Izza Israr, Lisa Waddell
{"title":"An evaluation of the preprints produced at the beginning of the 2022 mpox public health emergency.","authors":"Melanie Sterian, Anmol Samra, Kusala Pussegoda, Tricia Corrin, Mavra Qamar, Austyn Baumeister, Izza Israr, Lisa Waddell","doi":"10.1186/s41073-024-00152-w","DOIUrl":null,"url":null,"abstract":"<p><strong>Background: </strong>Preprints are scientific articles that have not undergone the peer-review process. They allow the latest evidence to be rapidly shared, however it is unclear whether they can be confidently used for decision-making during a public health emergency. This study aimed to compare the data and quality of preprints released during the first four months of the 2022 mpox outbreak to their published versions.</p><p><strong>Methods: </strong>Eligible preprints (n = 76) posted between May to August 2022 were identified through an established mpox literature database and followed to July 2024 for changes in publication status. Quality of preprints and published studies was assessed by two independent reviewers to evaluate changes in quality, using validated tools that were available for the study design (n = 33). Tools included the Newcastle-Ottawa Scale; Quality Assessment of Diagnostic Accuracy Studies 2 (QUADAS-2); and JBI Critical Appraisal Checklists. The questions in each tool led to an overall quality assessment of high quality (no concerns with study design, conduct, and/or analysis), moderate quality (minor concerns) or low quality (several concerns). Changes in data (e.g. methods, outcomes, results) for preprint-published pairs (n = 60) were assessed by one reviewer and verified by a second.</p><p><strong>Results: </strong>Preprints and published versions that could be evaluated for quality (n = 25 pairs) were mostly assessed as low quality. Minimal to no change in quality from preprint to published was identified: all observational studies (10/10), most case series (6/7) and all surveillance data analyses (3/3) had no change in overall quality, while some diagnostic test accuracy studies (3/5) improved or worsened their quality assessment scores. Among all pairs (n = 60), outcomes were often added in the published version (58%) and less commonly removed (18%). Numerical results changed from preprint to published in 53% of studies, however most of these studies (22/32) had changes that were minor and did not impact main conclusions of the study.</p><p><strong>Conclusions: </strong>This study suggests the minimal changes in quality, results and main conclusions from preprint to published versions supports the use of preprints, and the use of the same critical evaluation tools on preprints as applied to published studies, in decision-making during a public health emergency.</p>","PeriodicalId":74682,"journal":{"name":"Research integrity and peer review","volume":"9 1","pages":"11"},"PeriodicalIF":7.2000,"publicationDate":"2024-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11457328/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Research integrity and peer review","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1186/s41073-024-00152-w","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ETHICS","Score":null,"Total":0}

Abstract

Background: Preprints are scientific articles that have not undergone the peer-review process. They allow the latest evidence to be shared rapidly; however, it is unclear whether they can be confidently used for decision-making during a public health emergency. This study aimed to compare the data and quality of preprints released during the first four months of the 2022 mpox outbreak to their published versions.

Methods: Eligible preprints (n = 76) posted between May and August 2022 were identified through an established mpox literature database and followed to July 2024 for changes in publication status. The quality of preprints and published studies was assessed by two independent reviewers, using validated tools available for the study design (n = 33), to evaluate changes in quality. Tools included the Newcastle-Ottawa Scale; the Quality Assessment of Diagnostic Accuracy Studies 2 (QUADAS-2); and the JBI Critical Appraisal Checklists. The questions in each tool led to an overall quality rating of high quality (no concerns with study design, conduct, and/or analysis), moderate quality (minor concerns), or low quality (several concerns). Changes in data (e.g., methods, outcomes, results) for preprint-published pairs (n = 60) were assessed by one reviewer and verified by a second.
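The abstract does not state the exact rule used to roll the individual tool questions up into an overall rating. As a rough illustration only, the Python sketch below uses hypothetical concern-count thresholds to map appraisal results to the high/moderate/low categories described above and to flag whether a preprint-published pair changed category; the function name and cut-offs are assumptions, not the study's actual procedure.

```python
# Illustrative sketch only: the study's real roll-up rule is not given in the abstract.
# Hypothetical thresholds map the number of flagged concerns from a critical-appraisal
# tool (e.g. Newcastle-Ottawa Scale, QUADAS-2, JBI checklist) to an overall category.

def overall_quality(concerns: int) -> str:
    """Map a count of design/conduct/analysis concerns to an overall rating."""
    if concerns == 0:
        return "high"      # no concerns with study design, conduct, or analysis
    if concerns <= 2:      # hypothetical cut-off for "minor concerns"
        return "moderate"
    return "low"           # several concerns

# Example: one preprint-published pair, appraised at both time points.
preprint_concerns, published_concerns = 4, 3
changed = overall_quality(preprint_concerns) != overall_quality(published_concerns)
print(overall_quality(preprint_concerns), overall_quality(published_concerns), changed)
# -> low low False  (no change in overall quality for this hypothetical pair)
```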

Results: Preprints and their published versions that could be evaluated for quality (n = 25 pairs) were mostly assessed as low quality. Minimal to no change in quality from preprint to published version was identified: all observational studies (10/10), most case series (6/7), and all surveillance data analyses (3/3) had no change in overall quality, while some diagnostic test accuracy studies (3/5) improved or worsened their quality assessment scores. Among all pairs (n = 60), outcomes were often added in the published version (58%) and less commonly removed (18%). Numerical results changed from preprint to published version in 53% of studies; however, most of these studies (22/32) had minor changes that did not affect the main conclusions.

Conclusions: This study suggests that the minimal changes in quality, results, and main conclusions from preprint to published versions support the use of preprints, and the application of the same critical evaluation tools to preprints as to published studies, in decision-making during a public health emergency.
