{"title":"Efficient universal lossless data compression algorithms based on a greedy context-dependent sequential grammar transform","authors":"E. Yang, Dake He","doi":"10.1109/ISIT.2001.935941","DOIUrl":null,"url":null,"abstract":"In many applications like compression of text files, Web page files, and Java applets, there exists some a-priori knowledge, which often takes form of context models, about the data to be compressed. The challenging problem is then how to efficiently utilize the context models to improve the compression performance. We address this problem by extending results of Yang and Kieffer (see IEEE Trans. Inform. Theory, vol.IT-46, p.755-88, 2000), particularly the greedy context-free sequential grammar transform and the corresponding compression algorithms, to the case of context models.","PeriodicalId":433761,"journal":{"name":"Proceedings. 2001 IEEE International Symposium on Information Theory (IEEE Cat. No.01CH37252)","volume":"13 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2001-06-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings. 2001 IEEE International Symposium on Information Theory (IEEE Cat. No.01CH37252)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISIT.2001.935941","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
In many applications, such as the compression of text files, Web page files, and Java applets, there exists some a priori knowledge about the data to be compressed, which often takes the form of context models. The challenging problem is then how to efficiently utilize these context models to improve compression performance. We address this problem by extending the results of Yang and Kieffer (see IEEE Trans. Inform. Theory, vol. IT-46, pp. 755-88, 2000), particularly the greedy context-free sequential grammar transform and the corresponding compression algorithms, to the case of context models.
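For concreteness, the following is a minimal, simplified sketch (in Python) of a greedy sequential grammar transform in the spirit of Yang and Kieffer's context-free construction. It is not the authors' exact algorithm: the full transform maintains an irreducible grammar through several reduction rules and, in the context-dependent extension, drives the parsing and coding with context models, whereas this sketch applies only one simplified reduction rule and ignores contexts. All names are illustrative.

```python
# A minimal, simplified sketch of a greedy sequential grammar transform in the
# spirit of Yang-Kieffer (context-free case).  NOT the authors' exact algorithm:
# the real transform keeps the grammar irreducible via several reduction rules;
# this sketch applies only one of them (replacing a repeated adjacent pair in
# the start rule with a new variable) and omits context models entirely.

def greedy_grammar_transform(data):
    """Parse `data` greedily and build a small context-free grammar.

    Returns a dict mapping variables (ints) to their right-hand sides;
    variable 0 is the start symbol.  Terminals are the input characters.
    """
    rules = {0: []}            # start rule S -> (empty so far)
    expansions = {}            # variable -> the terminal string it derives
    i = 0
    while i < len(data):
        # Greedy step: take the longest prefix of the remaining input that
        # equals the expansion of an existing variable; otherwise one terminal.
        best_sym, best_len = data[i], 1
        for var, exp in expansions.items():
            n = len(exp)
            if n > best_len and data[i:i + n] == exp:
                best_sym, best_len = var, n
        rules[0].append(best_sym)
        i += best_len

        # Simplified reduction: if the last two symbols of the start rule
        # already occurred as an adjacent pair earlier in it, promote that
        # pair to a new variable and substitute both occurrences.
        s = rules[0]
        if len(s) >= 4:
            pair = tuple(s[-2:])
            for j in range(len(s) - 3):
                if tuple(s[j:j + 2]) == pair:
                    new_var = max(rules) + 1
                    rules[new_var] = list(pair)
                    expansions[new_var] = expand(pair, expansions)
                    rules[0] = s[:j] + [new_var] + s[j + 2:-2] + [new_var]
                    break
    return rules


def expand(symbols, expansions):
    """Terminal string derived from a mix of terminals and variables."""
    out = ""
    for sym in symbols:
        out += expansions[sym] if sym in expansions else sym
    return out


if __name__ == "__main__":
    grammar = greedy_grammar_transform("abababab")
    for var, rhs in grammar.items():
        print(var, "->", rhs)
```

Running the sketch on a string such as "abababab" yields a small grammar whose start rule consists of repeated newly introduced variables. In a full compression algorithm of the kind the paper considers, the symbol stream produced by the transform would then be encoded, for example with arithmetic coding, and the context-dependent extension studied here would condition that parsing and coding on the available context models.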