Bounds on universal coding: the next generation
J. Ziv
Proceedings of 1994 Workshop on Information Theory and Statistics
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513855
An important class of universal encoders is the one where the encoder is fed by two inputs: a) the incoming string of data to be compressed, and b) an N-bit description of the source statistics (i.e., a "training sequence"). We consider fixed-to-variable universal encoders that noiselessly compress blocks of length l.
Bandwidth choice and convergence rates in density estimation with long-range dependent data
P. Hall, S. Lahiri, Y. Truong
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513875
The authors discuss the optimal bandwidth choice and optimal convergence rates for density estimation with dependent data, as the amount of information in the sample is altered by adjusting the range of dependence. They assume that data are observed from a stationary stochastic process that may be taken to be an unknown function of a Gaussian process.
Constructing wavelets from desired signal functions
J. Chapa, M. Raghuveer
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513910
A limitation of wavelet design is the inability to construct orthonormal wavelets that match, or are "tuned" to, a desired signal. This paper develops a technique for constructing an orthonormal wavelet that is optimized in the least-squares sense and whose associated scaling function generates an orthonormal multiresolution analysis (OMRA).
Quantization theory and EC-CELP advantages at low bit rates
M. Foodeei, E. Dubois
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513914
The goal of this work is to analyze the advantages of the recently introduced entropy-constrained code-excited linear predictive (EC-CELP) quantization at low rates, in comparison with other entropy-constrained (EC) quantization schemes. Based on the N-th order rate-distortion function (RDF), EC quantization theory, and empirical methods, the RDF memory gain and the empirical space-filling gain (dimensionality N) at low bit rates are defined and calculated. These gains categorize, and help us analyze and compare, the coding gains available to various EC coders for a given rate and delay (N).
Estimation and prediction for (mostly Gaussian) Markov fields in the continuum
L. Pitt
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513878
We present a survey of design problems and results that arise in the prediction and parameter estimation of stochastic partial differential equations (SPDEs). The aim is to better understand some unavoidable errors that occur in the discretization of SPDEs, and the available methods for minimizing these errors.
The entropy strategy for shape recognition
D. Geman
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513852
We consider a computational strategy for shape recognition based on choosing "tests" one at a time in order to remove as much uncertainty as possible about the true hypothesis. The approach is compared with other recognition paradigms in computer vision and illustrated by attempting to classify handwritten digits and track roads from satellite images.
Neural processing of information
R. L. Fry
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513915
A model is proposed in which the neuron serves as an information channel. Applying the Shannon information measures of entropy and mutual information together, in the context of the proposed model, leads to the Hopfield neuron model with a conditionalized Hebbian learning rule and a sigmoidal transfer characteristic.
Jeffreys' prior yields the asymptotic minimax redundancy
B. S. Clarke, A. Barron
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513856
We determine the asymptotic minimax redundancy of universal data compression in a parametric setting and show that it corresponds to the use of Jeffreys' prior. Statistically, this formulation of the coding problem can be interpreted both in a prior-selection context and in an estimation context.
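For context, the standard statement of this result (sketched here from the well-known Clarke-Barron form; the paper's exact regularity conditions are not reproduced): for a smooth family of sources indexed by $\theta \in \Theta \subset \mathbb{R}^d$ with Fisher information matrix $I(\theta)$, the minimax redundancy at block length $n$ satisfies

```latex
R_n^{*} \;=\; \frac{d}{2}\log\frac{n}{2\pi e}
  \;+\; \log \int_{\Theta} \sqrt{\det I(\theta)}\,\mathrm{d}\theta \;+\; o(1),
```

and the prior asymptotically achieving it is Jeffreys' prior, $w(\theta) \propto \sqrt{\det I(\theta)}$, normalized over $\Theta$.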
Testing of composite hypotheses and ID-codes
M. Burnashev, S. Verdú
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513870
A geometrical approach to ID-codes, based on their equivalence to some natural notions from mathematical statistics, is described. This not only enlarges the available analytical apparatus but also enables us to strengthen some known results.
Detecting regularity in point processes generated by humans
D. Lake
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513918
Detecting minefields in the presence of clutter is an important challenge for the Navy. Minefields have point patterns that tend to be regular for a variety of reasons, including strategic doctrine, safety, tactical efficiency, and, perhaps most intriguingly, the human element. For example, humans have a tendency to make lottery number selections, a one-dimensional discrete point process, in a non-uniform manner. In this paper, we introduce several simple procedures to detect regularity in point processes.