Noise Variance Estimation in DS-CDMA and its Effects on the Individually Optimum Receiver
R. Gaudel, F. Bonnet, J.B. Domelevo-Entfellner, A. Roumy
2006 IEEE 7th Workshop on Signal Processing Advances in Wireless Communications, 2 July 2006
DOI: 10.1109/SPAWC.2006.346355
Citations: 4
Abstract
In the context of synchronous random DS-CDMA (direct sequence code division multiple access) communications over a mobile network, the receiver that minimizes the per-user bit error rate (BER) is the symbol maximum a posteriori (MAP) detector. This receiver is derived under the assumption of perfect channel state information at the receiver. In this paper we consider the case where the channel noise variance is estimated, and we analyze the effect of the resulting mismatch. We show that the BER is piecewise monotonic with respect to the estimated noise variance, reaching its minimum at the true noise variance. We also provide an upper bound on the performance of the individually optimum receiver under noise variance mismatch. This gives a theoretical justification for the bias towards noise variance underestimation commonly adopted by the community.
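To make the mismatch effect concrete, the following is a minimal Monte-Carlo sketch (not the authors' code) of the individually optimum, i.e. symbol-MAP, receiver for synchronous random DS-CDMA, evaluated with an assumed noise variance that may differ from the true one. All parameters (number of users K, spreading factor N, the true variance, the mismatch grid, frame count) are illustrative assumptions, and enumerating all 2^K bit vectors is only feasible for small K.

```python
# Hedged sketch: symbol-MAP (individually optimum) DS-CDMA detection with a
# mismatched noise-variance estimate. Parameters below are illustrative only.
import itertools
import numpy as np

rng = np.random.default_rng(0)

K, N = 3, 8              # users, spreading factor (assumed values)
sigma2_true = 0.5        # true channel noise variance
n_frames = 20000         # Monte-Carlo frames

# Random binary spreading codes, each column normalized to unit energy.
S = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)

# All 2^K candidate bit vectors, enumerated once (small K only).
B = np.array(list(itertools.product([-1.0, 1.0], repeat=K)))  # (2^K, K)
X = B @ S.T                                                    # noiseless receive points

def map_detect(r, sigma2_hat):
    """Per-user symbol-MAP decision under an assumed noise variance."""
    # Log-likelihood of each candidate bit vector under the assumed variance.
    ll = -np.sum((r - X) ** 2, axis=1) / (2.0 * sigma2_hat)
    ll -= ll.max()                       # numerical stabilization
    w = np.exp(ll)                       # unnormalized posteriors (uniform prior)
    # sum_b w(b) * b_k equals (posterior of b_k=+1) minus (posterior of b_k=-1),
    # up to a positive constant, so its sign is the per-user MAP decision.
    return np.sign(w @ B)                # (K,) decisions

# Sweep the assumed variance around the true value.
sigma2_grid = sigma2_true * np.array([0.25, 0.5, 1.0, 2.0, 4.0])
errors = np.zeros(len(sigma2_grid))
for _ in range(n_frames):
    b = rng.choice([-1.0, 1.0], size=K)
    r = S @ b + rng.normal(scale=np.sqrt(sigma2_true), size=N)
    for i, s2 in enumerate(sigma2_grid):
        errors[i] += np.count_nonzero(map_detect(r, s2) != b)

for s2, e in zip(sigma2_grid, errors):
    print(f"sigma2_hat = {s2:.3f}  BER = {e / (n_frames * K):.4f}")
```

Sweeping sigma2_hat in this way should reproduce the qualitative behavior the abstract describes: the empirical BER is smallest when the assumed variance matches sigma2_true and increases monotonically as the estimate moves away from it on either side.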