List decoding with side information

V. Guruswami
DOI: 10.1109/CCC.2003.1214429
Published: 2003-07-07, in 18th IEEE Annual Conference on Computational Complexity, 2003. Proceedings.
Citations: 36

Abstract

Under list decoding of error-correcting codes, the decoding algorithm is allowed to output a small list of codewords that are close to the noisy received word. This relaxation permits recovery even under very high noise thresholds. We consider one possible scenario that would permit disambiguating between the elements of the list, namely where the sender of the message provides some, hopefully small, amount of side information about the transmitted message on a separate auxiliary channel that is noise-free. This setting becomes meaningful and useful when the amount of side information that needs to be communicated is much smaller than the length of the message. We study what kind of side information is necessary and sufficient in the above context. The short, conceptual answer is that the side information must be randomized, and message recovery is then guaranteed only up to a small failure probability. Specifically, we prove that deterministic schemes, which guarantee correct recovery of the message, provide no savings: essentially the entire message has to be sent as side information. However, there exist randomized schemes that only need side information of length logarithmic in the message length. In fact, in the limit of repeated communication of several messages, the amortized amount of side information needed per message can be a constant independent of the message length or the failure probability. Concretely, we can correct up to a fraction (1/2 − γ) of errors for binary codes using only 2 log(1/γ) + O(1) amortized bits of side information per message, and this is in fact the best possible (up to additive constant terms).