Publishing computational research - a review of infrastructures for reproducible and transparent scholarly communication.

Research Integrity and Peer Review · IF 7.2, Q1 (Ethics) · Published: 2020-07-14 · eCollection: 2020-01-01 · DOI: 10.1186/s41073-020-00095-y
Markus Konkol, Daniel Nüst, Laura Goulier
{"title":"Publishing computational research - a review of infrastructures for reproducible and transparent scholarly communication.","authors":"Markus Konkol, Daniel Nüst, Laura Goulier","doi":"10.1186/s41073-020-00095-y","DOIUrl":null,"url":null,"abstract":"<p><strong>Background: </strong>The trend toward open science increases the pressure on authors to provide access to the source code and data they used to compute the results reported in their scientific papers. Since sharing materials reproducibly is challenging, several projects have developed solutions to support the release of executable analyses alongside articles.</p><p><strong>Methods: </strong>We reviewed 11 applications that can assist researchers in adhering to reproducibility principles. The applications were found through a literature search and interactions with the reproducible research community. An application was included in our analysis if it <b>(i)</b> was actively maintained at the time the data for this paper was collected, <b>(ii)</b> supports the publication of executable code and data, <b>(iii)</b> is connected to the scholarly publication process. By investigating the software documentation and published articles, we compared the applications across 19 criteria, such as deployment options and features that support authors in creating and readers in studying executable papers.</p><p><strong>Results: </strong>From the 11 applications, eight allow publishers to self-host the system for free, whereas three provide paid services. Authors can submit an executable analysis using Jupyter Notebooks or R Markdown documents (10 applications support these formats). All approaches provide features to assist readers in studying the materials, e.g., one-click reproducible results or tools for manipulating the analysis parameters. Six applications allow for modifying materials after publication.</p><p><strong>Conclusions: </strong>The applications support authors to publish reproducible research predominantly with literate programming. Concerning readers, most applications provide user interfaces to inspect and manipulate the computational analysis. The next step is to investigate the gaps identified in this review, such as the costs publishers have to expect when hosting an application, the consideration of sensitive data, and impacts on the review process.</p>","PeriodicalId":74682,"journal":{"name":"Research integrity and peer review","volume":null,"pages":null},"PeriodicalIF":7.2000,"publicationDate":"2020-07-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1186/s41073-020-00095-y","citationCount":"27","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Research integrity and peer review","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1186/s41073-020-00095-y","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2020/1/1 0:00:00","PubModel":"eCollection","JCR":"Q1","JCRName":"ETHICS","Score":null,"Total":0}
Citations: 27

Abstract

Background: The trend toward open science increases the pressure on authors to provide access to the source code and data they used to compute the results reported in their scientific papers. Since sharing materials reproducibly is challenging, several projects have developed solutions to support the release of executable analyses alongside articles.

Methods: We reviewed 11 applications that can assist researchers in adhering to reproducibility principles. The applications were found through a literature search and interactions with the reproducible research community. An application was included in our analysis if it (i) was actively maintained at the time the data for this paper was collected, (ii) supported the publication of executable code and data, and (iii) was connected to the scholarly publication process. By investigating the software documentation and published articles, we compared the applications across 19 criteria, such as deployment options and features that support authors in creating, and readers in studying, executable papers.

Results: Of the 11 applications, eight allow publishers to self-host the system for free, whereas three provide paid services. Authors can submit an executable analysis as a Jupyter Notebook or an R Markdown document (10 applications support these formats). All approaches provide features that assist readers in studying the materials, e.g., one-click reproduction of results or tools for manipulating analysis parameters. Six applications allow materials to be modified after publication.
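
To illustrate what such a submission might contain, the snippet below is a minimal sketch of an executable analysis cell as it could appear in a Jupyter Notebook shipped alongside an article; the file name, column name, and parameter are hypothetical and not taken from the review.

```python
# Minimal sketch of an executable analysis cell in a Jupyter Notebook
# (hypothetical data file and column; a real submission would ship the
# data file with the notebook so readers can rerun it with one click).
import pandas as pd

# Load the dataset that accompanies the article.
observations = pd.read_csv("data/observations.csv")

# Recompute a summary statistic reported in the text; a reader could
# change the parameter below and rerun the cell to explore the analysis.
threshold = 10  # analysis parameter a reader might manipulate
subset = observations[observations["value"] > threshold]
print(f"n = {len(subset)}, mean = {subset['value'].mean():.2f}")
```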

Conclusions: The applications support authors in publishing reproducible research, predominantly through literate programming. For readers, most applications provide user interfaces to inspect and manipulate the computational analysis. The next step is to investigate the gaps identified in this review, such as the costs publishers should expect when hosting an application, the handling of sensitive data, and impacts on the review process.
