The Role of SNR in Achieving MIMO Rates in Cooperative Systems
Chris T. K. Ng, J. N. Laneman, A. Goldsmith
2006 IEEE Information Theory Workshop - ITW '06 Punta del Este | Pub Date: 2006-03-13 | DOI: 10.1109/ITW.2006.1633831
Abstract: We compare the rate of a multiple-antenna relay channel to the capacity of multiple-antenna systems in order to characterize the cooperative capacity in different SNR regions. While it is known that in the asymptotic regime, at high SNR or with a large number of cooperating nodes, cooperative systems lack full multiplexing gain, in this paper we consider the cooperative capacity gain at moderate SNR with a fixed number of cooperating antennas. We show that up to a lower bound on an SNR threshold, a cooperative system performs at least as well as a MIMO system with isotropic inputs, whereas beyond an upper bound on the SNR threshold, the cooperative system is limited by its coordination cost and its capacity is strictly less than that of a MIMO orthogonal channel. The SNR threshold depends on the network geometry (the power gain g between the source and relay) and on the number of cooperating antennas M; when the relay is close to the source (g ≫ 1), the lower and upper bounds on the SNR threshold are approximately equal. As the cooperating nodes move closer together, i.e., as g increases, the MIMO-gain region extends to higher SNR, whereas for a populous cluster, i.e., when M is large, the coordination-limited region sets in at lower SNR.
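The MIMO baseline in this comparison can be made concrete numerically. The sketch below (an illustration under assumed parameters, not code from the paper) evaluates the isotropic-input MIMO rate log2 det(I + (SNR/M) H Hᴴ), the quantity the cooperative system is shown to match below the SNR threshold:

```python
import numpy as np

def mimo_isotropic_rate(H, snr):
    """Rate of a MIMO channel with isotropic (equal-power) inputs:
    log2 det(I + (snr/M) H H^H), where M is the number of transmit antennas."""
    M = H.shape[1]
    G = np.eye(H.shape[0]) + (snr / M) * (H @ H.conj().T)
    return float(np.log2(np.linalg.det(G).real))

# Illustrative Rayleigh-fading channel draw (hypothetical parameters).
rng = np.random.default_rng(0)
M = 2
H = (rng.standard_normal((M, M)) + 1j * rng.standard_normal((M, M))) / np.sqrt(2)
for snr_db in (0, 10, 20):
    snr = 10 ** (snr_db / 10)
    print(snr_db, "dB:", mimo_isotropic_rate(H, snr))
```

Isotropic inputs need no channel knowledge at the transmitter, which is the natural baseline when cooperating nodes cannot coordinate their transmit covariance.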
Successive Refinement for Pattern Recognition
J. O’Sullivan, N. Singla, M. Westover
2006 IEEE Information Theory Workshop - ITW '06 Punta del Este | Pub Date: 2006-03-13 | DOI: 10.1109/ITW.2006.1633798
Abstract: In this paper we examine the achievable rate region for the problem of successive refinement of information for pattern recognition systems. The pattern recognition system has two stages, going from coarse to fine recognition as more resources become available for storing internal representations of the patterns. We present an inner and an outer bound on the true achievable rate region. Using these results we derive conditions under which a pattern recognition system is successively refinable. These conditions are similar to the Markov condition for successive refinement in the rate-distortion problem.
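The rate-distortion notion of successive refinement that these conditions parallel can be illustrated with nested scalar quantizers: the fine stage only describes an increment on top of the coarse stage. A toy sketch (an illustrative analogy, not the paper's pattern-recognition setup):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.standard_normal(100_000)  # memoryless Gaussian source

def uniform_quantize(x, step):
    # Uniform scalar quantizer reproducing each sample by its cell midpoint.
    return step * (np.floor(x / step) + 0.5)

coarse = uniform_quantize(x, 1.0)   # stage 1: coarse description
fine = uniform_quantize(x, 0.5)     # stage 2: each coarse cell split in two
refinement = fine - coarse          # stage 2 only needs this +/-0.25 increment
print(np.mean((x - coarse) ** 2), np.mean((x - fine) ** 2))
```

The fine description strictly lowers the distortion, and because the quantizers are nested, the second stage costs only the rate of the increment; successive refinability asks when such staged descriptions lose nothing against a one-shot description.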
Bayesian Model Selection for Independent Factor Analysis
Omolabake A. Adenle, W. Fitzgerald
2006 IEEE Information Theory Workshop - ITW '06 Punta del Este | Pub Date: 2006-03-13 | DOI: 10.1109/ITW.2006.1633841
Abstract: We present a stochastic algorithm for Independent Factor Analysis (IFA), incorporating a scheme for performing model selection over the latent data. IFA is a method for learning locally non-linear subspaces in data. It uses a hierarchical generative model with factors modeled as independent Mixtures of Gaussians (MoGs), each mixture component representing a factor state. We incorporate Birth-Death MCMC (BDMCMC) to simulate samples from the posterior distribution of the factor model, with a Gibbs sampler simulating from the posterior over the model parameters. Although it is common practice to model factors with a fixed number of mixture components, it may be difficult to blindly determine an optimal minimal number of components without prior knowledge of the structure of the hidden data. Moreover, in pattern recognition applications where the source model order has an intrinsic interpretation, estimating it along with the other model parameters is useful. Our algorithm addresses both issues: model selection and parameter estimation.
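For intuition, the IFA generative model can be sketched directly: each factor is drawn from its own Mixture of Gaussians, the factors are mixed linearly, and sensor noise is added. All parameter values below are hypothetical placeholders chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical model: 2 factors, each a 2-state MoG, observed in 3 dimensions.
weights = np.array([[0.3, 0.7], [0.5, 0.5]])   # state probabilities per factor
means   = np.array([[-2.0, 2.0], [0.0, 3.0]])  # component means per factor
stds    = np.array([[0.5, 0.5], [1.0, 0.3]])   # component std devs per factor
A = np.array([[1.0, 0.2], [0.3, 1.0], [0.5, 0.5]])  # mixing matrix (3 x 2)
noise_std = 0.1

def sample_ifa(n):
    # Draw each factor independently from its MoG (state, then Gaussian),
    # then mix linearly and add sensor noise: x = A s + eps.
    n_factors, n_states = weights.shape
    s = np.empty((n, n_factors))
    for k in range(n_factors):
        states = rng.choice(n_states, size=n, p=weights[k])
        s[:, k] = rng.normal(means[k, states], stds[k, states])
    return s @ A.T + noise_std * rng.standard_normal((n, A.shape[0]))

X = sample_ifa(1000)
print(X.shape)  # (1000, 3)
```

The model-selection question the paper addresses is recovering the number of states per factor (here 2) from X alone, which BDMCMC handles by proposing component births and deaths during sampling.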
Bounds on the Threshold of Linear Programming Decoding
P. Vontobel, R. Koetter
2006 IEEE Information Theory Workshop - ITW '06 Punta del Este | Pub Date: 2006-02-25 | DOI: 10.1109/ITW.2006.1633805
Abstract: Whereas many results are known about thresholds for ensembles of low-density parity-check codes under message-passing iterative decoding, this is not the case for linear programming decoding. Towards closing this knowledge gap, this paper presents some bounds on the thresholds of low-density parity-check code ensembles under linear programming decoding.
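For readers unfamiliar with LP decoding, the sketch below implements Feldman-style linear programming decoding on a small code (the (7,4) Hamming code, chosen purely for illustration; the paper concerns LDPC ensembles and their thresholds): minimize the LLR-weighted sum of bit values over the fundamental polytope cut out by the odd-subset check inequalities.

```python
import itertools
import numpy as np
from scipy.optimize import linprog

# Parity-check matrix of the (7,4) Hamming code (illustrative example).
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def lp_decode(llr):
    """LP decoding: minimize sum(llr_i * f_i) over the fundamental polytope.
    For each check with neighborhood N and each odd-sized subset S of N:
        sum_{i in S} f_i - sum_{i in N\\S} f_i <= |S| - 1."""
    n = H.shape[1]
    A_ub, b_ub = [], []
    for row in H:
        nbr = np.flatnonzero(row)
        for size in range(1, len(nbr) + 1, 2):   # odd-sized subsets only
            for S in itertools.combinations(nbr, size):
                a = np.zeros(n)
                a[nbr] = -1.0
                a[list(S)] = 1.0
                A_ub.append(a)
                b_ub.append(len(S) - 1)
    res = linprog(llr, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0.0, 1.0)] * n)
    return res.x

# All-zeros codeword sent; positive LLRs favour 0, one bit is unreliable.
llr = np.array([1.2, 0.8, -0.3, 1.0, 0.9, 1.1, 0.7])
x_hat = lp_decode(llr)
print(np.round(x_hat, 3))  # decodes to the all-zeros codeword
```

The threshold question studied in the paper is the largest channel noise level at which this relaxation still succeeds with high probability over a code ensemble.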
Analysis of Belief Propagation for Non-Linear Problems: The Example of CDMA (or: How to Prove Tanaka's Formula)
A. Montanari, David Tse
2006 IEEE Information Theory Workshop - ITW '06 Punta del Este | Pub Date: 2006-02-07 | DOI: 10.1109/ITW.2006.1633802
Abstract: We consider the CDMA (code-division multiple-access) multi-user detection problem for binary signals and additive white Gaussian noise. We propose a spreading-sequence scheme based on random sparse signatures, and a detection algorithm based on belief propagation (BP) with linear time complexity. In the new scheme, each user conveys its power onto a finite number of chips l̄, in the large-system limit. We analyze the performance of BP detection and prove that it coincides with that of optimal (symbol MAP) detection in the l̄ → ∞ limit. In the same limit, we prove that the information capacity of the system converges to Tanaka's formula for random 'dense' signatures, thus providing the first rigorous justification of this formula. Apart from being computationally convenient, the new scheme allows for optimization in close analogy with irregular low-density parity-check code ensembles.
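The sparse-signature construction can be sketched directly: each user's signature is supported on l̄ randomly chosen chips, so the bipartite user-chip factor graph has bounded degree and BP messages can be updated in time linear in the number of users. Parameter values below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def sparse_signatures(n_chips, n_users, l_bar):
    """Each user places +/-1/sqrt(l_bar) entries on l_bar randomly chosen
    chips and zeros elsewhere, so every signature has unit power and every
    variable node in the factor graph has degree l_bar."""
    S = np.zeros((n_chips, n_users))
    for k in range(n_users):
        chips = rng.choice(n_chips, size=l_bar, replace=False)
        S[chips, k] = rng.choice([-1.0, 1.0], size=l_bar) / np.sqrt(l_bar)
    return S

S = sparse_signatures(n_chips=8, n_users=6, l_bar=3)
# Received signal for binary symbols b over an AWGN channel: y = S b + noise.
b = np.array([1, -1, 1, 1, -1, 1])
y = S @ b + 0.1 * rng.standard_normal(8)
print((np.count_nonzero(S, axis=0) == 3).all())  # prints True
```

The paper's analysis works in the regime where l̄ stays finite as the system grows, then lets l̄ → ∞ to recover the dense-signature (Tanaka) limit.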
Oblivious Transfer and Quantum Channels
N. Gisin, S. Popescu, V. Scarani, S. Wolf, Jürg Wullschleger
2006 IEEE Information Theory Workshop - ITW '06 Punta del Este | Pub Date: 2006-01-23 | DOI: 10.1109/ITW.2006.1633774
Abstract: We show that oblivious transfer can be seen as the classical analogue to a quantum channel in the same sense as non-local boxes are for maximally entangled qubits.
Distributed Kernel Regression: An Algorithm for Training Collaboratively
Joel B. Predd, S. Kulkarni, H. Poor
2006 IEEE Information Theory Workshop - ITW '06 Punta del Este | Pub Date: 2006-01-20 | DOI: 10.1109/ITW.2006.1633840
Abstract: This paper addresses the problem of distributed learning under communication constraints, motivated by distributed signal processing in wireless sensor networks and data mining with distributed databases. After formalizing a general model for distributed learning, an algorithm for collaboratively training regularized kernel least-squares regression estimators is derived. Noting that the algorithm can be viewed as an application of successive orthogonal projection algorithms, its convergence properties are investigated and the statistical behavior of the estimator is discussed in a simplified theoretical setting.
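The centralized building block, regularized kernel least-squares regression, can be sketched as follows (the collaborative message-passing and projection scheduling of the paper are not reproduced here; the RBF kernel and all parameter values are illustrative assumptions):

```python
import numpy as np

def rbf_kernel(X, Z, gamma):
    # Gaussian RBF kernel matrix: K[i, j] = exp(-gamma * ||X_i - Z_j||^2).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_ridge_fit(X, y, lam, gamma):
    # Regularized kernel least squares: alpha = (K + lam I)^{-1} y,
    # giving the estimator f(x) = sum_i alpha_i k(x, X_i).
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, alpha, X_new, gamma):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Toy 1-D regression problem (hypothetical data).
rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(40, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.standard_normal(40)
alpha = kernel_ridge_fit(X, y, lam=0.01, gamma=5.0)
err = np.abs(predict(X, alpha, X, gamma=5.0) - y).mean()
print("mean training error:", err)
```

In the distributed setting each sensor holds only part of (X, y); the paper's algorithm lets the nodes reach a joint estimator by cycling local updates of this form, interpreted as successive orthogonal projections in the reproducing-kernel Hilbert space.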