Probabilistic model distortion measure and its application to model-set design of multiple model approach
Zhanlue Zhao, X.R. Li
Conference Record of the Thirty-Eighth Asilomar Conference on Signals, Systems and Computers, 2004. Published 2004-11-07.
DOI: 10.1109/ACSSC.2004.1399546
Citations: 6
Abstract
In parameter estimation and filtering, model approximation is common in engineering research and development. These approximations distort the original relation between the parameter of interest and the observation, causing performance deterioration. It is therefore crucial to have a measure for appraising such approximations. In this paper, we analyze the structure of parameter inference and clarify its inherent ambiguity. Accordingly, we establish the correspondence between model distortion and the difference between two probability density functions. We derive a distortion measure and show that the Kullback-Leibler (K-L) divergence serves this purpose. We then apply the K-L divergence as a distortion measure to model-set design for multiple model estimation. We demonstrate that the K-L divergence is a meaningful measure of estimation performance deterioration and has high potential for the development of highly adaptive algorithms.
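To make the abstract's idea concrete, the sketch below illustrates how a K-L divergence could quantify the distortion incurred by replacing a "true" model with an approximate one. It uses the standard closed form for the K-L divergence between two univariate Gaussians; the specific candidate models are hypothetical examples, not taken from the paper.

```python
import numpy as np

def kl_gaussian(mu_p, var_p, mu_q, var_q):
    """Closed-form K-L divergence KL(p || q) between univariate
    Gaussians p = N(mu_p, var_p) and q = N(mu_q, var_q)."""
    return 0.5 * (np.log(var_q / var_p)
                  + (var_p + (mu_p - mu_q) ** 2) / var_q
                  - 1.0)

# Hypothetical example: distortion of approximating a N(0, 1)
# "true" model by each candidate model in a model set.
true_mu, true_var = 0.0, 1.0
candidates = [(0.0, 1.0),   # exact model: zero distortion
              (0.5, 1.0),   # mean mismatch
              (0.0, 2.0)]   # variance mismatch
distortions = [kl_gaussian(true_mu, true_var, m, v) for m, v in candidates]
```

In a model-set design setting, such distortion values could rank candidate models by how much estimation performance they are expected to sacrifice relative to the true model.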