{"title":"信息论信号处理及其应用[书架]","authors":"Alexander Haimovich","doi":"10.1109/MCS.2023.3234387","DOIUrl":null,"url":null,"abstract":"The roots of information theory are almost 100 years old and include early works by Fisher <xref ref-type=\"bibr\" rid=\"ref1\">[1]</xref>, Hartley <xref ref-type=\"bibr\" rid=\"ref2\">[2]</xref>, and others. According to a history of information theory <xref ref-type=\"bibr\" rid=\"ref3\">[3]</xref>, motivated to understand how to draw information from experiments, Fisher <xref ref-type=\"bibr\" rid=\"ref1\">[1]</xref> stated “the nature and degree of the uncertainty [must] be capable of rigorous expression.” Subsequently, he defined statistical information as the reciprocal of the variance of a statistical sample. However, it was Shannon’s work <xref ref-type=\"bibr\" rid=\"ref4\">[4]</xref> that laid the mathematical foundations of information theory and revolutionized communications. Shannon developed two fundamental bounds, one on data compression and the other on transmission rate. He proved that even in the presence of noise, an arbitrarily small probability of error may be achieved as long as the transmission rate is below a quantity he defined as channel capacity.","PeriodicalId":55028,"journal":{"name":"IEEE Control Systems Magazine","volume":"43 1","pages":"97-109"},"PeriodicalIF":3.9000,"publicationDate":"2023-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Information Theoretic Signal Processing and Its Applications [Bookshelf]\",\"authors\":\"Alexander Haimovich\",\"doi\":\"10.1109/MCS.2023.3234387\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The roots of information theory are almost 100 years old and include early works by Fisher <xref ref-type=\\\"bibr\\\" rid=\\\"ref1\\\">[1]</xref>, Hartley <xref ref-type=\\\"bibr\\\" rid=\\\"ref2\\\">[2]</xref>, and others. 
According to a history of information theory <xref ref-type=\\\"bibr\\\" rid=\\\"ref3\\\">[3]</xref>, motivated to understand how to draw information from experiments, Fisher <xref ref-type=\\\"bibr\\\" rid=\\\"ref1\\\">[1]</xref> stated “the nature and degree of the uncertainty [must] be capable of rigorous expression.” Subsequently, he defined statistical information as the reciprocal of the variance of a statistical sample. However, it was Shannon’s work <xref ref-type=\\\"bibr\\\" rid=\\\"ref4\\\">[4]</xref> that laid the mathematical foundations of information theory and revolutionized communications. Shannon developed two fundamental bounds, one on data compression and the other on transmission rate. He proved that even in the presence of noise, an arbitrarily small probability of error may be achieved as long as the transmission rate is below a quantity he defined as channel capacity.\",\"PeriodicalId\":55028,\"journal\":{\"name\":\"IEEE Control Systems Magazine\",\"volume\":\"43 1\",\"pages\":\"97-109\"},\"PeriodicalIF\":3.9000,\"publicationDate\":\"2023-04-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Control Systems Magazine\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1109/MCS.2023.3234387\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"AUTOMATION & CONTROL SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Control Systems 
Magazine","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1109/MCS.2023.3234387","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"AUTOMATION & CONTROL SYSTEMS","Score":null,"Total":0}
Information Theoretic Signal Processing and Its Applications [Bookshelf]
The roots of information theory are almost 100 years old and include early works by Fisher [1], Hartley [2], and others. According to a history of information theory [3], motivated to understand how to draw information from experiments, Fisher [1] stated “the nature and degree of the uncertainty [must] be capable of rigorous expression.” Subsequently, he defined statistical information as the reciprocal of the variance of a statistical sample. However, it was Shannon’s work [4] that laid the mathematical foundations of information theory and revolutionized communications. Shannon developed two fundamental bounds, one on data compression and the other on transmission rate. He proved that even in the presence of noise, an arbitrarily small probability of error may be achieved as long as the transmission rate is below a quantity he defined as channel capacity.
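The review does not reproduce Shannon's bound itself, but the result it describes is conventionally stated in the Shannon–Hartley form: for a bandlimited channel with additive white Gaussian noise, the capacity is C = B log2(1 + SNR) bits per second, and reliable communication (arbitrarily small error probability) is possible at any rate below C. A minimal sketch, assuming this standard AWGN form (the function name and example figures are illustrative, not from the review):

```python
import math

def awgn_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity of an AWGN channel: C = B * log2(1 + SNR).

    bandwidth_hz: channel bandwidth B in hertz.
    snr_linear:   signal-to-noise ratio as a linear power ratio (not dB).
    """
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Example: a 3-kHz telephone-grade channel at 30 dB SNR (linear SNR = 1000).
# Shannon's theorem says any rate below this capacity is achievable with
# arbitrarily small error probability; rates above it are not.
capacity = awgn_capacity_bps(3000.0, 1000.0)  # roughly 3e4 bits per second
```

The striking content of the theorem, as the abstract notes, is that noise does not impose a floor on error probability, only a ceiling on rate.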
About the journal:
As the official means of communication for the IEEE Control Systems Society, the IEEE Control Systems Magazine publishes interesting, useful, and informative material on all aspects of control system technology for the benefit of control educators, practitioners, and researchers.