Markov chain Monte Carlo algorithms
J. Rosenthal
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513879
We briefly describe Markov chain Monte Carlo algorithms, such as the Gibbs sampler and the Metropolis-Hastings (1953, 1970) algorithm, which are frequently used in the statistics literature to explore complicated probability distributions. We present a general method for proving rigorous, a priori bounds on the number of iterations required to achieve convergence of the algorithms.
{"title":"Markov chain Monte Carlo algorithms","authors":"J. Rosenthal","doi":"10.1109/WITS.1994.513879","DOIUrl":"https://doi.org/10.1109/WITS.1994.513879","url":null,"abstract":"We briefly describe Markov chain Monte Carlo algorithms, such as the Gibbs sampler and the Metropolis-Hastings (1953, 1970) algorithm, which are frequently used in the statistics literature to explore complicated probability distributions. We present a general method for proving rigorous, a priori bounds an the number of iterations required to achieve convergence of the algorithms.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"88 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125000091","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Shannon-Hartley entropy ratio under Zipf law
R. Krichevskii, M.P. Scharova
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513895
We find a formula for the Shannon-to-Hartley entropy ratio of a text governed by the Zipf law. The formula is in good agreement with real texts. It is asymptotically 1/2+loglo.
{"title":"Shannon-Hartley entropy ratio under Zipf law","authors":"R. Krichevskii, M.P. Scharova","doi":"10.1109/WITS.1994.513895","DOIUrl":"https://doi.org/10.1109/WITS.1994.513895","url":null,"abstract":"We find a formula for Shannon's-Hartley's entropy Ratio of a text governed by the Zipf law. The formula is in a good agreement with real texts. It is asymptotically 1/2+loglo.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116023824","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A stronger version of the redundancy-capacity theorem of universal coding
N. Merhav, M. Feder
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513854
The capacity of the channel induced by a given class of sources is well known to be an attainable lower bound on the redundancy of universal codes with respect to this class, both in the minimax sense and in the Bayesian (maximin) sense. We show that this capacity is essentially a lower bound also in a stronger sense, that is, for "most" sources in the class. This result extends Rissanen's lower bound for parametric families. We demonstrate its applicability in several examples and discuss its implications in statistical inference.
{"title":"A stronger version of the redundancy-capacity theorem of universal coding","authors":"N. Merhav, M. Feder","doi":"10.1109/WITS.1994.513854","DOIUrl":"https://doi.org/10.1109/WITS.1994.513854","url":null,"abstract":"The capacity of the channel induced by a given class of sources is well known to be an attainable lower bound on the redundancy of universal codes w.r.t this class, both in the minimax sense and in the Bayesian (maximin) sense. We show that this capacity is essentially a lower bound also in a stronger sense, that is, for \"most\" sources in the class. This result extends Rissanen's lower bound for parametric families. We demonstrate its applicability in several examples and discuss its implications in statistical inference.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122933860","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Modeling Gauss Markov random fields at multiple resolutions
S. Krishnamachari, Ramalingam Chellappa
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513917
A multiresolution model for Gauss Markov random fields (GMRF) is presented. Based on information theoretic measures, techniques are presented to estimate the GMRF parameters of a process at coarser resolutions from the parameters at fine resolution.
{"title":"Modeling Gauss Markov random fields at multiple resolutions","authors":"S. Krishnamachari, Ramalingam Chellappa","doi":"10.1109/WITS.1994.513917","DOIUrl":"https://doi.org/10.1109/WITS.1994.513917","url":null,"abstract":"A multiresolution model for Gauss Markov random fields (GMRF) is presented. Based on information theoretic measures, techniques are presented to estimate the GMRF parameters of a process at coarser resolutions from the parameters at fine resolution.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"124 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122478087","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Finite-precision intrinsic randomness and source resolvability
Y. Steinberg, S. Verdú
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513868
Random number generators are important devices in randomized algorithms, Monte-Carlo methods, and in simulation studies of random systems. A random number generator is usually modeled as a random source emitting independent, equally likely random bits. In practice, the random source one has at hand can deviate from this idealized model, and the random number generator operates by applying a deterministic mapping on the output of the (nonideal) random source. The deterministic mapping is chosen so that the resulting process approximates, in some sense, a sequence of independent, equally likely random bits. A prime measure of the intrinsic randomness of a given source X is the maximal rate at which random bits can be extracted from X by suitably mapping its output. This maximal rate depends on the statistics of the source X and on the sense of approximation. We study the problem of finite-precision random bit generation, where the accuracy measure is the variational distance. The relevant information theory is addressed.
{"title":"Finite-precision intrinsic randomness and source resolvability","authors":"Y. Steinberg, S. Verdú","doi":"10.1109/WITS.1994.513868","DOIUrl":"https://doi.org/10.1109/WITS.1994.513868","url":null,"abstract":"Random number generators are important devices in randomized algorithms, Monte-Carlo methods, and in simulation studies of random systems. A random number generator is usually modeled as a random source emitting independent, equally likely random bits. In practice, the random source one has at hand can deviate from this idealized model, and the random number generator operates by applying a deterministic mapping on the output of the (nonideal) random source. The deterministic mapping is chosen so that the resulting process approximates, in some sense, a sequence of independent, equally likely random bits. A prime measure of the intrinsic randomness of a given source X is the maximal rate at which random bits can be extracted from X by suitably mapping its output. This maximal rate depends on the statistics of the source X and on the sense of approximation. We study the problem of finite-precision random bit generation, where the accuracy measure is the variational distance. The relevant information theory is addressed.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"73 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123366433","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Coding for noisy feasible channels
R. Lipton
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513865
Summary form only given. The author proves a constructive version of Shannon's fundamental theorem of information theory. The new theorem holds for any feasible channel. A channel is feasible provided it is computable by a polynomial time computation.
{"title":"Coding for noisy feasible channels","authors":"R. Lipton","doi":"10.1109/WITS.1994.513865","DOIUrl":"https://doi.org/10.1109/WITS.1994.513865","url":null,"abstract":"Summary form only given. The author proves a constructive version of Shannon's fundamental theorem of information theory. The new theorem holds for any feasible channel. A channel is feasible provided it is computable by a polynomial time computation.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"45 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114711489","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Large deviations in information theory and statistics
A. Dembo
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513851
Large deviations theory, a branch of probability theory that deals with estimates of probabilities of very rare events, has close links with topics in information theory and in statistics. Some of these connections are explored.
{"title":"Large deviations in information theory and statistics","authors":"A. Dembo","doi":"10.1109/WITS.1994.513851","DOIUrl":"https://doi.org/10.1109/WITS.1994.513851","url":null,"abstract":"Large deviations theory, a branch of probability theory that deals with estimates of probabilities of very rare events has close links with topics in information theory and in statistics. Some of these connections are explored.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121777646","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Unsupervised medical image analysis by multiscale FNM modeling and MRF relaxation labeling
Yang Wang, T. Adalı, T. Lei
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513928
We derive two types of block-wise FNM model for pixel images by incorporating local context. Self-learning is then formulated as an information-matching problem and solved by first estimating the model parameters to initialize the ML solution and then performing finer segmentation through MRF relaxation.
{"title":"Unsupervised medical image analysis by multiscale FNM modeling and MRF relaxation labeling","authors":"Yang Wang, T. Adalı, T. Lei","doi":"10.1109/WITS.1994.513928","DOIUrl":"https://doi.org/10.1109/WITS.1994.513928","url":null,"abstract":"We derive two types of block-wise FNM model for pixel images by incorporating local context. The self-learning is then formulated as an information match problem and solved by first estimating model parameters to initialize ML solution and then conducting finer segmentation through MRF relaxation.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"60 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130824674","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Application of Markov model in mobile communication channel
Wang Duanyi, Hu Zhengming
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513912
An adaptive Markov model with three states for the mobile communication channel is studied and simulated. The error sequence describing the long burst error characteristics of the model channel is generated on a computer. A test method using a threshold technique is given to verify the accuracy of the adaptive channel model.
{"title":"Application of Markov model in mobile communication channel","authors":"Wang Duanyi, Hu Zhengming","doi":"10.1109/WITS.1994.513912","DOIUrl":"https://doi.org/10.1109/WITS.1994.513912","url":null,"abstract":"An adaptive Markov model with three states for the mobile communication channel is studied and simulated. The error sequence describing the long burst error characteristics of the model channel is generated on a computer. A test method using a threshold technique is given to verify the accuracy of the adaptive channel model.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"229 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116443810","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Adaptive edge detection in compound Gauss-Markov random fields using the minimum description length principle
M. Figueiredo, J. Leitão
Pub Date: 1994-10-27 | DOI: 10.1109/WITS.1994.513891
Edge location in compound Gauss-Markov random fields (CGMRF) is formulated as a parameter estimation problem. Since the number of parameters is unknown, a minimum-description-length (MDL) criterion is proposed for image restoration based on the CGMRF model.
{"title":"Adaptive edge detection in compound Gauss-Markov random fields using the minimum description length principle","authors":"M. Figueiredo, J. Leitão","doi":"10.1109/WITS.1994.513891","DOIUrl":"https://doi.org/10.1109/WITS.1994.513891","url":null,"abstract":"Edge location in compound Gauss-Markov random fields (CGMRF) is formulated as a parameter estimation problem. Since the number of parameters is unknown, a minimum-description-length (MDL) criterion is proposed for image restoration based on the CGMRF model.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"32 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126282616","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}