{"title":"Channel equalization with perceptrons: an information-theoretic approach","authors":"T. Adalı, M. Sönmez","doi":"10.1109/ICASSP.1994.390039","DOIUrl":null,"url":null,"abstract":"We formulate the adaptive channel equalization as a conditional probability distribution learning problem. Conditional probability density function of the transmitted signal given the received signal is parametrized by a sigmoidal perceptron. In this framework, we use relative entropy (Kullback-Leibler distance) between the true and the estimated distributions as the cost function to be minimized. The true probabilities are approximated by their stochastic estimators resulting in a stochastic relative entropy cost function. This function is well-formed in the sense of Wittner and Denker (1988), therefore gradient descent on this cost function is guaranteed to find a solution. The consistency and asymptotic normality of this learning scheme are shown via maximum partial likelihood estimation of logistic models. As a practical example, we demonstrate that the resulting algorithm successfully equalizes multipath channels.<<ETX>>","PeriodicalId":290798,"journal":{"name":"Proceedings of ICASSP '94. IEEE International Conference on Acoustics, Speech and Signal Processing","volume":"9923 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1994-04-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of ICASSP '94. IEEE International Conference on Acoustics, Speech and Signal Processing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICASSP.1994.390039","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 9
Abstract
We formulate adaptive channel equalization as a conditional probability distribution learning problem. The conditional probability density function of the transmitted signal given the received signal is parametrized by a sigmoidal perceptron. In this framework, we use the relative entropy (Kullback-Leibler distance) between the true and the estimated distributions as the cost function to be minimized. The true probabilities are approximated by their stochastic estimators, resulting in a stochastic relative entropy cost function. This function is well-formed in the sense of Wittner and Denker (1988); therefore, gradient descent on this cost function is guaranteed to find a solution. The consistency and asymptotic normality of this learning scheme are shown via maximum partial likelihood estimation of logistic models. As a practical example, we demonstrate that the resulting algorithm successfully equalizes multipath channels.
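
The following is a minimal sketch, not the authors' code, of the general idea the abstract describes: a single sigmoidal perceptron models the conditional probability of the transmitted bit given a window of received samples, and stochastic gradient descent minimizes the relative-entropy cost, which for binary transmitted symbols with indicator-valued stochastic targets reduces to a cross-entropy. The channel taps, noise level, equalizer order, decision delay, and learning rate below are illustrative assumptions, not values from the paper.

# Minimal sketch (assumed parameters): sigmoidal perceptron equalizer trained by
# stochastic gradient descent on a relative-entropy (cross-entropy) cost,
# for BPSK symbols sent through a simulated multipath channel.
import numpy as np

rng = np.random.default_rng(0)

# --- simulate a multipath channel with additive noise (illustrative values) ---
n_symbols = 20000
bits = rng.integers(0, 2, n_symbols)           # transmitted bits s(k) in {0, 1}
symbols = 2.0 * bits - 1.0                     # BPSK mapping to {-1, +1}
channel = np.array([0.3482, 0.8704, 0.3482])   # example multipath impulse response
received = np.convolve(symbols, channel, mode="full")[:n_symbols]
received += 0.1 * rng.standard_normal(n_symbols)

# --- sigmoidal perceptron: P(s(k)=1 | received window) = sigmoid(w.x + b) ---
order, delay = 7, 1                            # equalizer length and decision delay (assumed)
w = np.zeros(order)
b = 0.0
lr = 0.05

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# --- stochastic gradient descent on the cross-entropy (relative-entropy) cost ---
for k in range(order, n_symbols):
    x = received[k - order:k][::-1]            # tapped-delay-line input vector
    target = bits[k - delay]                   # training bit (with decision delay)
    p = sigmoid(w @ x + b)                     # estimated conditional probability
    err = p - target                           # gradient of -[t*log p + (1-t)*log(1-p)] w.r.t. w.x+b
    w -= lr * err * x
    b -= lr * err

# --- evaluate hard decisions on the second half of the sequence ---
correct = 0
for k in range(n_symbols // 2, n_symbols):
    x = received[k - order:k][::-1]
    correct += (sigmoid(w @ x + b) > 0.5) == bits[k - delay]
print("decision accuracy:", correct / (n_symbols // 2))

Because the cross-entropy cost is well-formed in the sense cited in the abstract, the per-sample gradient step above keeps driving the perceptron output toward the correct decision region; a practical run would, as in the paper's experiments, compare decision-directed performance across channel conditions rather than report training accuracy alone.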