Reporting results in High Energy Physics publications: A manifesto

Reviews in Physics (Q1, Physics and Astronomy) · Pub Date: 2020-11-01 · DOI: 10.1016/j.revip.2020.100046
Pietro Vischia
Citations: 5

Abstract


The complexity of collider data analyses has dramatically increased from early colliders to the CERN LHC. Reconstruction of the collision products in the particle detectors has reached a point that requires dedicated publications documenting the techniques, and periodic retuning of the algorithms themselves. Analysis methods have evolved to account for the increased complexity of the combination of particles required in each collision event (final states) and for the need to squeeze every last bit of sensitivity from the data; physicists often seek to fully reconstruct the final state, a process that is usually relatively easy at lepton colliders but sometimes exceedingly difficult at hadron colliders, to the point of sometimes requiring advanced statistical techniques such as machine learning. The need to keep the publications documenting results to a reasonable size implies a greater level of compression, or even omission, of information with respect to publications from twenty years ago. The need for compression should, however, not prevent sharing a reasonable amount of information that is essential to understanding a given analysis. Infrastructures like Rivet or HepData have been developed to host additional material, but physicists in the experimental Collaborations often still send an insufficient amount of material to these databases. In this manuscript I advocate for an increase in the information shared by the Collaborations, and try to define a minimum standard for an acceptable level of information when reporting the results of statistical procedures in High Energy Physics publications.
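As a hypothetical illustration of the kind of machine-readable material the abstract argues Collaborations should deposit in databases such as HepData, the sketch below serializes a toy measurement together with its full covariance matrix, rather than only the symmetric per-observable uncertainties usually quoted in print. All names and numbers here are invented for illustration; the point is that the off-diagonal correlations, which reinterpretations need, survive in the shared record.

```python
import json
import numpy as np

# Toy measurement: two correlated cross sections (invented values, pb).
values = np.array([12.3, 4.56])
# Full covariance matrix (pb^2); the off-diagonal term carries the
# correlation that a bare "value +/- error" quotation would lose.
cov = np.array([[0.25, 0.10],
                [0.10, 0.16]])

sigmas = np.sqrt(np.diag(cov))
record = {
    "observables": ["sigma_A_pb", "sigma_B_pb"],   # hypothetical names
    "central_values": values.tolist(),
    "uncertainties": sigmas.tolist(),
    "covariance": cov.tolist(),
    "correlation": (cov / np.outer(sigmas, sigmas)).tolist(),
}

# A machine-readable record like this is what downstream users can
# actually reuse, unlike numbers embedded in a PDF table.
print(json.dumps(record, indent=2))
```

The same information could equally be expressed in the YAML-based submission format that HepData itself accepts; the JSON above is only a minimal stand-in for the idea of publishing the full correlation structure.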

Source journal
Reviews in Physics
Subject area: Physics and Astronomy (all)
CiteScore: 21.30
Self-citation rate: 0.00%
Articles per year: 8
Review time: 98 days
About the journal: Reviews in Physics is a gold open access journal, publishing review papers on topics in all areas of (applied) physics. The journal provides a platform for researchers who wish to summarize a field of physics research and share this work as widely as possible. The published papers provide an overview of the main developments on a particular topic, with an emphasis on recent developments, and sketch an outlook on future developments. The journal focuses on short review papers (max 15 pages) and these are freely available after publication. All submitted manuscripts are fully peer-reviewed and after acceptance a publication fee is charged to cover all editorial, production, and archiving costs.