{"title":"带副信息的解码列表","authors":"V. Guruswami","doi":"10.1109/CCC.2003.1214429","DOIUrl":null,"url":null,"abstract":"Under list decoding of error-correcting codes, the decoding algorithm is allowed to output a small list of codewords that are close to the noisy received word. This relaxation permits recovery even under very high noise thresholds. We consider one possible scenario that would permit disambiguating between the elements of the list, namely where the sender of the message provides some hopefully small amount of side information about the transmitted message on a separate auxiliary channel that is noise-free. This setting becomes meaningful and useful when the amount of side information that needs to be communicated is much smaller than the length of the message. We study what kind of side information is necessary and sufficient in the above context. The short, conceptual answer is that the side information must be randomized and the message recovery is with a small failure probability. Specifically, we prove that deterministic schemes, which guarantee correct recovery of the message, provide no savings and essentially the entire message has to be sent as side information. However there exist randomized schemes, which only need side information of length logarithmic in the message length. In fact, in the limit of repeated communication of several messages, amortized amount of side information needed per message can be a constant independent of the message length or the failure probability. Concretely, we can correct up to a fraction (1/2-/spl gamma/) of errors for binary codes using only 2log(1//spl gamma/)+O(1) amortized bits of side information per message, and this is in fact the best possible (up to additive constant terms).","PeriodicalId":286846,"journal":{"name":"18th IEEE Annual Conference on Computational Complexity, 2003. 
Proceedings.","volume":"28 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2003-07-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"36","resultStr":"{\"title\":\"List decoding with side information\",\"authors\":\"V. Guruswami\",\"doi\":\"10.1109/CCC.2003.1214429\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Under list decoding of error-correcting codes, the decoding algorithm is allowed to output a small list of codewords that are close to the noisy received word. This relaxation permits recovery even under very high noise thresholds. We consider one possible scenario that would permit disambiguating between the elements of the list, namely where the sender of the message provides some hopefully small amount of side information about the transmitted message on a separate auxiliary channel that is noise-free. This setting becomes meaningful and useful when the amount of side information that needs to be communicated is much smaller than the length of the message. We study what kind of side information is necessary and sufficient in the above context. The short, conceptual answer is that the side information must be randomized and the message recovery is with a small failure probability. Specifically, we prove that deterministic schemes, which guarantee correct recovery of the message, provide no savings and essentially the entire message has to be sent as side information. However there exist randomized schemes, which only need side information of length logarithmic in the message length. In fact, in the limit of repeated communication of several messages, amortized amount of side information needed per message can be a constant independent of the message length or the failure probability. 
Concretely, we can correct up to a fraction (1/2-/spl gamma/) of errors for binary codes using only 2log(1//spl gamma/)+O(1) amortized bits of side information per message, and this is in fact the best possible (up to additive constant terms).\",\"PeriodicalId\":286846,\"journal\":{\"name\":\"18th IEEE Annual Conference on Computational Complexity, 2003. Proceedings.\",\"volume\":\"28 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2003-07-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"36\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"18th IEEE Annual Conference on Computational Complexity, 2003. Proceedings.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CCC.2003.1214429\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"18th IEEE Annual Conference on Computational Complexity, 2003. Proceedings.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CCC.2003.1214429","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract: Under list decoding of error-correcting codes, the decoding algorithm is allowed to output a small list of codewords that are close to the noisy received word. This relaxation permits recovery even at very high noise thresholds. We consider one scenario that permits disambiguating among the elements of the list: the sender provides a (hopefully small) amount of side information about the transmitted message on a separate, noise-free auxiliary channel. This setting is meaningful and useful when the amount of side information that needs to be communicated is much smaller than the length of the message. We study what kind of side information is necessary and sufficient in this context. The short, conceptual answer is that the side information must be randomized, and message recovery then succeeds with a small failure probability. Specifically, we prove that deterministic schemes, which guarantee correct recovery of the message, provide no savings: essentially the entire message has to be sent as side information. However, there exist randomized schemes that need side information of length only logarithmic in the message length. In fact, in the limit of repeated communication of several messages, the amortized amount of side information needed per message can be a constant independent of the message length and the failure probability. Concretely, we can correct up to a fraction (1/2 − γ) of errors for binary codes using only 2 log(1/γ) + O(1) amortized bits of side information per message, and this is in fact the best possible (up to additive constant terms).
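To make the randomized side-information idea concrete, here is a minimal sketch (an illustration of the general hashing approach, not the paper's exact construction): the sender hashes the message under a randomly chosen key and sends the key and hash value over the noise-free auxiliary channel; the receiver uses them to pick out the transmitted message from the decoder's list. The hash here is polynomial evaluation over a fixed prime field, a standard choice; the field size, function names, and bit-vector message representation are all assumptions made for the example.

```python
import random

# A fixed Mersenne prime; large enough that hash collisions between
# distinct short messages are extremely unlikely.
P = (1 << 61) - 1

def side_info(message_bits, rng):
    """Sender: choose a random evaluation point x and hash the message
    by evaluating it as a polynomial at x (Horner's rule) over GF(P)."""
    x = rng.randrange(1, P)
    h = 0
    for b in message_bits:
        h = (h * x + b) % P
    return (x, h)

def disambiguate(candidate_list, info):
    """Receiver: return the list element matching the hash, or None if
    the side information fails to single out a unique candidate."""
    x, h = info
    matches = []
    for m in candidate_list:
        v = 0
        for b in m:
            v = (v * x + b) % P
        if v == h:
            matches.append(m)
    return matches[0] if len(matches) == 1 else None
```

Two distinct messages of length n agree, as polynomials, on fewer than n evaluation points, so a wrong list element collides with the true one with probability under n/(P − 1) over the random choice of x: the scheme is randomized and fails only with small probability, matching the abstract's conceptual answer. The side information (x, h) is a constant number of field elements, independent of how long the decoder's list is.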