Continuously evolving classification of signals corrupted by an abrupt change
T. Robert, J. Tourneret
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513924
Bayes decision theory is based on the assumption that the decision problem is posed in probabilistic terms and that all of the relevant probability values are known. The aim of this paper is to show how blind sliding-window AR modeling is corrupted by an abrupt model change and to derive a statistical characterization of the estimated AR parameters.
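The sliding-window AR estimation the abstract refers to can be sketched as follows. This is a minimal illustration, not the authors' method: the AR(1) model, the window length of 100, the change point at t = 500, and the least-squares estimator are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1_series(n, a, noise_std=0.1, rng=rng):
    """Generate an AR(1) series x[t] = a*x[t-1] + e[t] started from zero."""
    x = np.empty(n)
    prev = 0.0
    for t in range(n):
        prev = a * prev + noise_std * rng.standard_normal()
        x[t] = prev
    return x

def sliding_ar1_estimates(x, win):
    """Least-squares AR(1) coefficient estimated on each sliding window."""
    est = []
    for s in range(len(x) - win):
        seg = x[s:s + win]
        y, z = seg[1:], seg[:-1]
        est.append(float(z @ y / (z @ z)))
    return np.array(est)

# Abrupt model change: the AR coefficient jumps from 0.3 to 0.9 at t = 500
# (the two segments are generated independently for simplicity).
x = np.concatenate([ar1_series(500, 0.3), ar1_series(500, 0.9)])
est = sliding_ar1_estimates(x, win=100)
```

Windows lying entirely before the change track the first coefficient, those entirely after track the second, and windows straddling t = 500 produce the "corrupted" intermediate estimates whose statistics the paper studies.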
Image coding via bintree segmentation and texture VQ
Xiaolin Wu
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513864
Image compression is often approached from an angle of statistical image classification. For instance, VQ-based image coding methods compress image data by classifying image blocks into representative two-dimensional patterns (codewords) that statistically approximate the original data. Another image compression approach that naturally relates to image classification is segmentation-based image coding (SIC). In SIC, we classify pixels into segments of certain uniformity or similarity, and then encode the segmentation geometry and the attributes of the segments. Image segmentation in SIC has to meet more stringent requirements than in other applications such as computer vision and pattern recognition. An efficient SIC coder has to strike a good balance between accurate semantics and succinct syntax of the segmentation. From a pure classification point of view, free-form segmentation by relaxation, region-growing, or split-and-merge techniques offers an accurate boundary representation. But the resulting segmentation geometry is often too complex to have a compact description, defeating the purpose of image compression. Instead, we adopt a bintree-structured segmentation scheme. The bintree is a binary tree created by recursive rectilinear bipartition of an image.
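The recursive rectilinear bipartition that defines the bintree can be sketched as below. The uniformity criterion (block variance under a threshold), the split rule (halve along the longer side), and the minimum block size are assumptions for the illustration, not details from the paper.

```python
import numpy as np

def bintree_segment(img, var_thresh=1.0, min_side=2):
    """Recursively bipartition img along its longer side until each
    leaf block is near-uniform; return leaves as (top, left, h, w)."""
    leaves = []

    def split(top, left, h, w):
        block = img[top:top + h, left:left + w]
        if block.var() <= var_thresh or max(h, w) < 2 * min_side:
            leaves.append((top, left, h, w))   # uniform (or tiny): a leaf
        elif h >= w:                           # horizontal cut of taller block
            split(top, left, h // 2, w)
            split(top + h // 2, left, h - h // 2, w)
        else:                                  # vertical cut of wider block
            split(top, left, h, w // 2)
            split(top, left + w // 2, h, w - w // 2)

    split(0, 0, *img.shape)
    return leaves

# Toy image: flat left half, bright right half.
img = np.zeros((8, 8))
img[:, 4:] = 100.0
leaves = bintree_segment(img)
```

Because every split is a rectilinear halving, each internal node of the tree costs only one bit of split orientation (here even that is implicit), which is what makes the segmentation geometry compact enough for coding.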
Estimation and prediction for (mostly Gaussian) Markov fields in the continuum
L. Pitt
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513878
We present a survey of design problems and results that arise in the prediction and parameter estimation of stochastic partial differential equations (SPDEs). The aim is to better understand some unavoidable errors that occur in the discretization of SPDEs, and available methods for minimizing these errors.
Constructing wavelets from desired signal functions
J. Chapa, M. Raghuveer
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513910
A limitation to wavelet design is the inability to construct orthonormal wavelets that match or are "tuned" to a desired signal. This paper develops a technique for constructing an orthonormal wavelet that is optimized in the least squares sense, and whose associated scaling function generates an orthonormal multiresolution analysis (OMRA).
Quantization theory and EC-CELP advantages at low bit rates
M. Foodeei, E. Dubois
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513914
The goal of this work is to analyze the advantages of the recently introduced entropy-constrained code-excited linear predictive (EC-CELP) quantization. The analysis compares EC-CELP with other EC quantization schemes at low rates. Based on the N-th order rate-distortion function (RDF), EC quantization theory, and empirical methods, the RDF memory gain and the empirical space-filling gain (dimensionality N) at low bit rates are defined and calculated. These gains categorize, and help us analyze and compare, the coding gains available to various EC coders for a given rate and delay (N).
Jeffreys' prior yields the asymptotic minimax redundancy
B. S. Clarke, A. Barron
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513856
We determine the asymptotic minimax redundancy of universal data compression in a parametric setting and show that it corresponds to the use of Jeffreys prior. Statistically, this formulation of the coding problem can be interpreted in a prior selection context and in an estimation context.
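The result the abstract announces is the well-known Clarke–Barron formula: for a smooth family with parameter space $\Theta \subset \mathbb{R}^d$ and Fisher information matrix $I(\theta)$, the minimax redundancy of block length $n$ satisfies

```latex
R_n^{*} \;=\; \frac{d}{2}\log\frac{n}{2\pi e}
\;+\; \log \int_{\Theta} \sqrt{\det I(\theta)}\; d\theta \;+\; o(1),
```

and the prior attaining this value asymptotically is Jeffreys' prior, $w(\theta) \propto \sqrt{\det I(\theta)}$, normalized over $\Theta$.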
Neural processing of information
R. L. Fry
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513915
A model is proposed in which the neuron serves as an information channel. An application of the Shannon information measures of entropy and mutual information, taken together in the context of the proposed model, leads to the Hopfield neuron model with a conditionalized Hebbian learning rule and a sigmoidal transfer characteristic.
The entropy strategy for shape recognition
D. Geman
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513852
We consider a computational strategy for shape recognition based on choosing "tests" one at a time in order to remove as much uncertainty as possible about the true hypothesis. The approach is compared with other recognition paradigms in computer vision and illustrated by attempting to classify handwritten digits and track roads from satellite images.
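The strategy of picking the test that removes the most uncertainty is, in essence, greedy selection by expected posterior entropy. A minimal sketch follows; the digit hypotheses, the deterministic test outcomes, and the uniform prior are invented for illustration and are not from the paper.

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a dict of probabilities."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def expected_entropy_after(prior, test):
    """Expected posterior entropy when `test` partitions hypotheses by outcome."""
    groups = {}
    for h, p in prior.items():
        groups.setdefault(test(h), {})[h] = p
    exp_h = 0.0
    for group in groups.values():
        mass = sum(group.values())
        exp_h += mass * entropy({h: p / mass for h, p in group.items()})
    return exp_h

def best_test(prior, tests):
    """Choose the test with maximal expected uncertainty reduction."""
    return min(tests, key=lambda name: expected_entropy_after(prior, tests[name]))

# Toy digit example: four equally likely hypotheses, two candidate tests.
prior = {h: 0.25 for h in "0178"}
tests = {
    "has_loop": lambda h: h in "08",  # splits {0,8} vs {1,7}: balanced
    "is_zero":  lambda h: h == "0",   # splits {0} vs {1,7,8}: unbalanced
}
```

The balanced test halves the hypothesis set regardless of outcome, so it is preferred over the unbalanced one; iterating this choice yields a decision-tree-like recognition procedure.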
Detecting regularity in point processes generated by humans
D. Lake
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513918
Detecting minefields in the presence of clutter is an important challenge for the Navy. Minefields have point patterns that tend to be regular for a variety of reasons, including strategic doctrine, safety, tactical efficiency, and, perhaps most intriguingly, the human element. For example, humans tend to make lottery number selections, a one-dimensional discrete point process, in a non-uniform manner. In this paper, we introduce several simple procedures to detect regularity in point processes.
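One simple regularity statistic of the kind the abstract mentions is the coefficient of variation (CV) of inter-point spacings in one dimension: a regular pattern has nearly equal spacings (CV near 0), while a homogeneous Poisson pattern has exponential-like spacings (CV near 1). The specific statistic, the jitter level, and the decision thresholds below are assumptions for the sketch, not the paper's procedures.

```python
import random
import statistics

def spacing_cv(points):
    """Coefficient of variation of sorted inter-point spacings.
    Near 0 for a jittered regular grid, near 1 for a Poisson sample."""
    pts = sorted(points)
    gaps = [b - a for a, b in zip(pts, pts[1:])]
    return statistics.stdev(gaps) / statistics.mean(gaps)

random.seed(1)
# Regular pattern: an even grid on [0, 1) with tiny Gaussian jitter.
regular = [i / 100 + random.gauss(0, 0.0005) for i in range(100)]
# Clutter-like pattern: 100 points placed uniformly at random.
poisson = [random.random() for _ in range(100)]
```

A low CV flags the kind of regularity a deliberately laid minefield would exhibit, while clutter stays near the Poisson value; two-dimensional analogues use nearest-neighbor distances instead of spacings.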
Testing of composite hypotheses and ID-codes
M. Burnashev, S. Verdú
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513870
A geometrical approach to ID-codes, based on their equivalence to some natural notions from mathematical statistics, is described. This not only enlarges the available analytical apparatus but also enables us to strengthen some known results.