{"title":"模拟源信息测度的通用估计","authors":"Qing Wang, Sanjeev R. Kulkarni, Sergio Verdú","doi":"10.1561/0100000021","DOIUrl":null,"url":null,"abstract":"This monograph presents an overview of universal estimation of information measures for continuous-alphabet sources. Special attention is given to the estimation of mutual information and divergence based on independent and identically distributed (i.i.d.) data. Plug-in methods, partitioning-based algorithms, nearest-neighbor algorithms as well as other approaches are reviewed, with particular focus on consistency, speed of convergence and experimental performance.","PeriodicalId":45236,"journal":{"name":"Foundations and Trends in Communications and Information Theory","volume":"14 1","pages":"265-353"},"PeriodicalIF":2.0000,"publicationDate":"2009-05-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"56","resultStr":"{\"title\":\"Universal Estimation of Information Measures for Analog Sources\",\"authors\":\"Qing Wang, Sanjeev R. Kulkarni, Sergio Verdú\",\"doi\":\"10.1561/0100000021\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This monograph presents an overview of universal estimation of information measures for continuous-alphabet sources. Special attention is given to the estimation of mutual information and divergence based on independent and identically distributed (i.i.d.) data. Plug-in methods, partitioning-based algorithms, nearest-neighbor algorithms as well as other approaches are reviewed, with particular focus on consistency, speed of convergence and experimental performance.\",\"PeriodicalId\":45236,\"journal\":{\"name\":\"Foundations and Trends in Communications and Information Theory\",\"volume\":\"14 1\",\"pages\":\"265-353\"},\"PeriodicalIF\":2.0000,\"publicationDate\":\"2009-05-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"56\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Foundations and Trends in Communications and Information Theory\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1561/0100000021\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Foundations and Trends in Communications and Information Theory","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1561/0100000021","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Universal Estimation of Information Measures for Analog Sources
This monograph presents an overview of universal estimation of information measures for continuous-alphabet sources. Special attention is given to the estimation of mutual information and divergence based on independent and identically distributed (i.i.d.) data. Plug-in methods, partitioning-based algorithms, nearest-neighbor algorithms, and other approaches are reviewed, with particular focus on consistency, speed of convergence, and experimental performance.
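As an illustration of the nearest-neighbor family of estimators surveyed in the monograph, the following is a minimal sketch (not the authors' exact algorithm) of a Kozachenko-Leonenko-style k-NN differential entropy estimator, combined with the identity I(X;Y) = h(X) + h(Y) - h(X,Y) to obtain a plug-in mutual information estimate from i.i.d. samples. The choice of k, the Euclidean metric, the brute-force distance computation, and the function names are illustrative assumptions, not details taken from the text.

```python
# Sketch of nearest-neighbor estimation of differential entropy and mutual
# information from i.i.d. samples; illustrative only, not the monograph's method.
import numpy as np
from scipy.special import digamma, gammaln


def knn_entropy(x, k=3):
    """Kozachenko-Leonenko style k-NN estimate of differential entropy (in nats).

    x: (n, d) array of i.i.d. samples from an unknown continuous density.
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    # Pairwise Euclidean distances (brute force; adequate for a small sketch).
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    np.fill_diagonal(dist, np.inf)          # exclude each point from its own neighbor list
    r_k = np.sort(dist, axis=1)[:, k - 1]   # distance to the k-th nearest neighbor
    # Log-volume of the d-dimensional unit ball.
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(r_k))


def knn_mutual_information(x, y, k=3):
    """Plug-in estimate via I(X;Y) = h(X) + h(Y) - h(X,Y)."""
    xy = np.hstack([x, y])
    return knn_entropy(x, k) + knn_entropy(y, k) - knn_entropy(xy, k)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, rho = 2000, 0.6
    # Correlated Gaussian pair: the true mutual information is -0.5*log(1 - rho^2) nats.
    x = rng.standard_normal((n, 1))
    y = rho * x + np.sqrt(1 - rho ** 2) * rng.standard_normal((n, 1))
    print("estimated I(X;Y):", knn_mutual_information(x, y, k=3))
    print("true I(X;Y):     ", -0.5 * np.log(1 - rho ** 2))
```

Estimators of this type avoid explicit density estimation and partitioning; their consistency and convergence behavior are among the topics the monograph examines.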
Journal Description:
Foundations and Trends® in Communications and Information Theory publishes survey and tutorial articles on the following topics:
- Coded modulation
- Coding theory and practice
- Communication complexity
- Communication system design
- Cryptology and data security
- Data compression
- Data networks
- Demodulation and equalization
- Denoising
- Detection and estimation
- Information theory and statistics
- Information theory and computer science
- Joint source/channel coding
- Modulation and signal design
- Multiuser detection
- Multiuser information theory