{"title":"ALICE运行的快速熵编码","authors":"M. Lettrich","doi":"10.22323/1.390.0913","DOIUrl":null,"url":null,"abstract":"In LHC Run 3, the upgraded ALICE detector will record Pb-Pb collisions at a rate of 50 kHz using continuous readout. The resulting stream of raw data at 3.5 TB/s has to be processed with a set of lossy and lossless compression and data reduction techniques to a storage data rate of 90 GB/s while preserving relevant data for physics analysis. This contribution presents a custom lossless data compression scheme based on entropy coding as the final component in the data reduction chain which has to compress the data rate from 300 GB/s to 90 GB/s. A flexible, multi-process architecture for the data compression scheme is proposed that seamlessly interfaces with the data reduction algorithms of earlier stages and allows to use parallel processing in order to keep the required firm real-time guarantees of the system. The data processed inside the compression process have a structure that allows the use of an rANS entropy coder with more resource efficient static distribution tables. Extensions to the rANS entropy coder are introduced to efficiently work with these static distribution tables and large but sparse source alphabets consisting of up to 25 Bit per symbol. Preliminary performance results show compliance with the firm real-time requirements while offering close-to-optimal data compression.","PeriodicalId":20428,"journal":{"name":"Proceedings of 40th International Conference on High Energy physics — PoS(ICHEP2020)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2021-02-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Fast Entropy Coding for ALICE Run 3\",\"authors\":\"M. 
Lettrich\",\"doi\":\"10.22323/1.390.0913\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In LHC Run 3, the upgraded ALICE detector will record Pb-Pb collisions at a rate of 50 kHz using continuous readout. The resulting stream of raw data at 3.5 TB/s has to be processed with a set of lossy and lossless compression and data reduction techniques to a storage data rate of 90 GB/s while preserving relevant data for physics analysis. This contribution presents a custom lossless data compression scheme based on entropy coding as the final component in the data reduction chain which has to compress the data rate from 300 GB/s to 90 GB/s. A flexible, multi-process architecture for the data compression scheme is proposed that seamlessly interfaces with the data reduction algorithms of earlier stages and allows to use parallel processing in order to keep the required firm real-time guarantees of the system. The data processed inside the compression process have a structure that allows the use of an rANS entropy coder with more resource efficient static distribution tables. Extensions to the rANS entropy coder are introduced to efficiently work with these static distribution tables and large but sparse source alphabets consisting of up to 25 Bit per symbol. 
Preliminary performance results show compliance with the firm real-time requirements while offering close-to-optimal data compression.\",\"PeriodicalId\":20428,\"journal\":{\"name\":\"Proceedings of 40th International Conference on High Energy physics — PoS(ICHEP2020)\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-02-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of 40th International Conference on High Energy physics — PoS(ICHEP2020)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.22323/1.390.0913\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of 40th International Conference on High Energy physics — PoS(ICHEP2020)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.22323/1.390.0913","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

In LHC Run 3, the upgraded ALICE detector will record Pb-Pb collisions at a rate of 50 kHz using continuous readout. The resulting stream of raw data at 3.5 TB/s has to be processed with a set of lossy and lossless compression and data reduction techniques to a storage data rate of 90 GB/s while preserving the data relevant for physics analysis. This contribution presents a custom lossless data compression scheme based on entropy coding as the final component in the data reduction chain, which must compress the data rate from 300 GB/s to 90 GB/s. A flexible, multi-process architecture for the compression scheme is proposed that interfaces seamlessly with the data reduction algorithms of the earlier stages and allows the use of parallel processing in order to meet the firm real-time guarantees required of the system. The data processed inside the compression stage are structured in a way that permits the use of an rANS entropy coder with more resource-efficient static distribution tables. Extensions to the rANS entropy coder are introduced to work efficiently with these static distribution tables and with large but sparse source alphabets of up to 25 bits per symbol. Preliminary performance results show compliance with the firm real-time requirements while offering close-to-optimal data compression.
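To illustrate the core idea behind the abstract's entropy coder, the following is a minimal sketch of rANS (range asymmetric numeral system) coding with a static frequency table. It is illustrative only, not the ALICE O2 implementation: the tiny three-symbol alphabet and Python big-integer state stand in for the large sparse alphabets and the renormalized machine-word state a production coder would use.

```python
# Minimal rANS sketch with a static frequency table (illustrative only).
# Python's arbitrary-precision integers replace the renormalised
# fixed-width state of a real coder.

freqs = {"a": 3, "b": 1, "c": 4}        # static distribution table
M = sum(freqs.values())                  # total frequency
cum, acc = {}, 0
for sym in freqs:                        # cumulative frequencies
    cum[sym] = acc
    acc += freqs[sym]

def encode(symbols):
    x = 1                                # initial coder state
    for s in symbols:                    # rANS is LIFO: decoding reverses order
        f, c = freqs[s], cum[s]
        x = (x // f) * M + c + (x % f)   # core rANS state update
    return x

def decode(x, n):
    out = []
    for _ in range(n):
        slot = x % M                     # locate symbol by its cumulative slot
        s = next(t for t in freqs if cum[t] <= slot < cum[t] + freqs[t])
        x = freqs[s] * (x // M) + slot - cum[s]  # inverse of the encode step
        out.append(s)
    return "".join(reversed(out))        # undo the LIFO order

msg = "abacca"
state = encode(msg)
assert decode(state, len(msg)) == msg
```

Because the distribution table is fixed ahead of time, the decoder needs no per-block table transmission, which is what makes static tables attractive for the throughput constraints described above; for sparse 25-bit alphabets, the symbol lookup in `decode` would use a table or alias method rather than a linear scan.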