{"title":"A comparison of information functions and search strategies for sensor planning in target classification.","authors":"Guoxian Zhang, Silvia Ferrari, Chenghui Cai","doi":"10.1109/TSMCB.2011.2165336","DOIUrl":null,"url":null,"abstract":"<p><p>This paper investigates the comparative performance of several information-driven search strategies and decision rules using a canonical target classification problem. Five sensor models are considered: one obtained from classical estimation theory and four obtained from Bernoulli, Poisson, binomial, and mixture-of-binomial distributions. A systematic approach is presented for deriving information functions that represent the expected utility of future sensor measurements from mutual information, Rènyi divergence, Kullback-Leibler divergence, information potential, quadratic entropy, and the Cauchy-Schwarz distance. The resulting information-driven strategies are compared to direct-search, alert-confirm, task-driven (TS), and log-likelihood-ratio (LLR) search strategies. Extensive numerical simulations show that quadratic entropy typically leads to the most effective search strategy with respect to correct-classification rates. In the presence of prior information, the quadratic-entropy-driven strategy also displays the lowest rate of false alarms. However, when prior information is absent or very noisy, TS and LLR strategies achieve the lowest false-alarm rates for the Bernoulli, mixture-of-binomial, and classical sensor models.</p>","PeriodicalId":55006,"journal":{"name":"IEEE Transactions on Systems Man and Cybernetics Part B-Cybernetics","volume":" ","pages":"2-16"},"PeriodicalIF":0.0000,"publicationDate":"2012-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/TSMCB.2011.2165336","citationCount":"33","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Systems Man and Cybernetics Part B-Cybernetics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/TSMCB.2011.2165336","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2011/10/31 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 33
Abstract
This paper investigates the comparative performance of several information-driven search strategies and decision rules using a canonical target classification problem. Five sensor models are considered: one obtained from classical estimation theory and four obtained from Bernoulli, Poisson, binomial, and mixture-of-binomial distributions. A systematic approach is presented for deriving information functions that represent the expected utility of future sensor measurements from mutual information, Rényi divergence, Kullback-Leibler divergence, information potential, quadratic entropy, and the Cauchy-Schwarz distance. The resulting information-driven strategies are compared with direct-search, alert-confirm, task-driven (TS), and log-likelihood-ratio (LLR) search strategies. Extensive numerical simulations show that quadratic entropy typically leads to the most effective search strategy with respect to correct-classification rates. In the presence of prior information, the quadratic-entropy-driven strategy also displays the lowest rate of false alarms. However, when prior information is absent or very noisy, TS and LLR strategies achieve the lowest false-alarm rates for the Bernoulli, mixture-of-binomial, and classical sensor models.
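To make the idea of an information function concrete, the sketch below (not taken from the paper) scores the expected utility of a single Bernoulli sensor measurement about a binary target class, using either Shannon entropy (yielding mutual information) or Rényi quadratic entropy as the uncertainty measure. The function name `expected_info_gain` and the example sensor likelihoods are hypothetical, and the quadratic-entropy reduction shown here is only an illustrative stand-in for the information functions derived in the paper.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in nats) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def quadratic_entropy(p):
    """Rényi quadratic entropy H_2(p) = -log sum_i p_i^2."""
    p = np.asarray(p, dtype=float)
    return -np.log(np.sum(p ** 2))

def expected_info_gain(prior, like, info=entropy):
    """Expected reduction of an entropy-type measure about a binary
    target class y from one Bernoulli measurement z in {0, 1}.

    prior : P(y = 1), the prior probability that a target is present
    like  : (P(z=1 | y=0), P(z=1 | y=1)), a hypothetical sensor model
    info  : uncertainty functional used to score the distribution over y
    """
    p_y = np.array([1.0 - prior, prior])
    p_z1_given_y = np.asarray(like, dtype=float)   # P(z=1 | y) for y = 0, 1
    h_prior = info(p_y)
    gain = 0.0
    for p_z_given_y in (p_z1_given_y, 1.0 - p_z1_given_y):
        p_z = np.sum(p_z_given_y * p_y)            # marginal P(z)
        post = p_z_given_y * p_y / p_z             # Bayes posterior P(y | z)
        gain += p_z * (h_prior - info(post))       # expected uncertainty reduction
    return gain

if __name__ == "__main__":
    # Hypothetical Bernoulli sensor: P(z=1 | no target) = 0.2, P(z=1 | target) = 0.8
    print("Mutual-information gain:", expected_info_gain(0.3, (0.2, 0.8), entropy))
    print("Quadratic-entropy gain :", expected_info_gain(0.3, (0.2, 0.8), quadratic_entropy))
```

With Shannon entropy this quantity is exactly the mutual information I(y; z); substituting another functional (quadratic entropy here) changes how candidate measurements are ranked, which is the kind of comparison the paper carries out across its sensor models and search strategies.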