{"title":"Bayes risk-weighted vector quantization","authors":"R. Gray","doi":"10.1109/WITS.1994.513847","DOIUrl":null,"url":null,"abstract":"Lossy compression and classification algorithms both attempt to reduce a large collection of possible observations into a few representative categories so as to preserve essential information. A framework for combining classification and compression into one or two quantizers is described along with some examples and related to other quantizer-based classification schemes.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"27 4","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of 1994 Workshop on Information Theory and Statistics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/WITS.1994.513847","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 8
Abstract
Lossy compression and classification algorithms both attempt to reduce a large collection of possible observations into a few representative categories so as to preserve essential information. A framework for combining classification and compression into one or two quantizers is described, illustrated with examples, and related to other quantizer-based classification schemes.
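
As a loose illustration of the idea of a single quantizer serving both compression and classification (a minimal sketch, not the paper's algorithm), the code below trains a toy vector quantizer with a generalized Lloyd iteration in which each codeword carries both a reproduction vector and a class label, and the encoding distortion is squared error plus a weighted empirical misclassification cost. The function name `train_brvq`, the weight `lam`, and the 0/1 cost are assumptions made for this example.

```python
import numpy as np

def train_brvq(X, labels, n_codewords=8, lam=1.0, n_iter=20, seed=0):
    """Toy generalized-Lloyd training of a Bayes risk-weighted VQ.

    Each codeword has a reproduction vector and a class label; the
    encoding distortion is squared error plus lam times a 0/1
    misclassification cost (an assumed simplification).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    idx = rng.choice(n, n_codewords, replace=False)
    codebook = X[idx].copy()            # reproduction vectors
    code_labels = labels[idx].copy()    # class label per codeword

    for _ in range(n_iter):
        # Encode: squared error + lam * (label mismatch), per codeword.
        sq_err = ((X[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        mismatch = (labels[:, None] != code_labels[None, :]).astype(float)
        assign = np.argmin(sq_err + lam * mismatch, axis=1)

        # Update: centroid for the reproduction vector, majority class
        # for the codeword label (minimizes the empirical 0/1 risk term).
        for k in range(n_codewords):
            members = assign == k
            if members.any():
                codebook[k] = X[members].mean(axis=0)
                vals, counts = np.unique(labels[members], return_counts=True)
                code_labels[k] = vals[np.argmax(counts)]
    return codebook, code_labels

# Example usage on two synthetic Gaussian classes: the trained codewords
# provide both a compressed representation (nearest codebook vector) and
# a class decision (the codeword's label).
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(3, 1, (200, 2))])
    y = np.array([0] * 200 + [1] * 200)
    cb, cl = train_brvq(X, y, n_codewords=4, lam=2.0)
    print("codebook:\n", cb)
    print("codeword labels:", cl)
```

Larger values of `lam` push the quantizer toward codewords that respect class boundaries at some cost in squared-error distortion; `lam = 0` recovers ordinary squared-error VQ.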