{"title":"误差测量参数平均码长的确定","authors":"Arif Habib","doi":"10.15406/BBIJ.2018.07.00219","DOIUrl":null,"url":null,"abstract":"From past three decades, entropy which is branch of statistical sciences has been used to determine the degree of variability, describes how uncertainty should be quantified in a skillful manner for representation. Statistical entropy has some conflicting explanations so that sometimes it measures two complementary conceptions like information and lack of information. Claude Shannon through two outstanding contributions in 1948 and 1949 relates it with positive information. These were followed by a flood of research papers hypothesize upon the possible applications in almost every field such as pure mathematics, semantics, physics, management, thermodynamics, botany, econometrics, operations research, psychology, epidemiological studies, disease management and related disciplines. Information theory has also had an important role in shaping theories of perception, cognition, and neural computation. When the message is readily measurable, we can say that the information is the reduction of uncertainty. But we usually encountered lossy information i.e a part of the transmitted information reaches the destination in a distorted form. In statistical theory of information, certain specialized terms which need to be translated into a measurable form. A source is similar to the space of a random experiment. A finite sequence of characters is called a word in the same way that the sequence of a number of outcomes associated with the repetition of an experiment may be designated as an event. An interesting observation can be made about the entropy of a binary source. Binary coding offers an interesting practical opportunity for encoding.","PeriodicalId":90455,"journal":{"name":"Biometrics & biostatistics international journal","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2018-07-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Determination of parametric average code length of inaccuracy measure\",\"authors\":\"Arif Habib\",\"doi\":\"10.15406/BBIJ.2018.07.00219\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"From past three decades, entropy which is branch of statistical sciences has been used to determine the degree of variability, describes how uncertainty should be quantified in a skillful manner for representation. Statistical entropy has some conflicting explanations so that sometimes it measures two complementary conceptions like information and lack of information. Claude Shannon through two outstanding contributions in 1948 and 1949 relates it with positive information. These were followed by a flood of research papers hypothesize upon the possible applications in almost every field such as pure mathematics, semantics, physics, management, thermodynamics, botany, econometrics, operations research, psychology, epidemiological studies, disease management and related disciplines. Information theory has also had an important role in shaping theories of perception, cognition, and neural computation. When the message is readily measurable, we can say that the information is the reduction of uncertainty. But we usually encountered lossy information i.e a part of the transmitted information reaches the destination in a distorted form. In statistical theory of information, certain specialized terms which need to be translated into a measurable form. 
A source is similar to the space of a random experiment. A finite sequence of characters is called a word in the same way that the sequence of a number of outcomes associated with the repetition of an experiment may be designated as an event. An interesting observation can be made about the entropy of a binary source. Binary coding offers an interesting practical opportunity for encoding.\",\"PeriodicalId\":90455,\"journal\":{\"name\":\"Biometrics & biostatistics international journal\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-07-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Biometrics & biostatistics international journal\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.15406/BBIJ.2018.07.00219\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Biometrics & biostatistics international journal","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.15406/BBIJ.2018.07.00219","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Determination of parametric average code length of inaccuracy measure
For the past three decades, entropy, a branch of the statistical sciences, has been used to determine the degree of variability and to describe how uncertainty should be quantified for representation. Statistical entropy admits some conflicting interpretations, so it sometimes measures two complementary concepts: information and the lack of information. Claude Shannon, through two outstanding contributions in 1948 and 1949, related it to positive information. These were followed by a flood of research papers hypothesizing about possible applications in almost every field, including pure mathematics, semantics, physics, management, thermodynamics, botany, econometrics, operations research, psychology, epidemiological studies, disease management, and related disciplines. Information theory has also had an important role in shaping theories of perception, cognition, and neural computation. When the message is readily measurable, we can say that information is the reduction of uncertainty. But we usually encounter lossy information, i.e., part of the transmitted information reaches the destination in a distorted form. In the statistical theory of information, certain specialized terms need to be translated into a measurable form. A source is similar to the space of a random experiment. A finite sequence of characters is called a word, in the same way that a sequence of outcomes associated with the repetition of an experiment may be designated as an event. An interesting observation can be made about the entropy of a binary source, and binary coding offers an interesting practical opportunity for encoding, as the sketch below illustrates.
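The remarks on the entropy of a binary source and on average code length can be made concrete with a minimal sketch. The Python snippet below is illustrative only: the four-symbol source, its probabilities, and the prefix code 0, 10, 110, 111 are assumptions chosen for the example, not taken from the paper. It computes the Shannon entropy H of a source and the average code length L = sum(p_i * l_i) that the title's "average code length" refers to, and checks them against the noiseless coding bound H <= L < H + 1.

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy (in bits) of a binary source with symbol
    probabilities p and 1 - p; taken to be 0 at p = 0 or p = 1."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def average_code_length(probs, lengths):
    """Average code length L = sum(p_i * l_i), where the i-th codeword
    has length l_i and is used with probability p_i."""
    return sum(p * l for p, l in zip(probs, lengths))

# Hypothetical four-symbol source encoded with the binary prefix code
# 0, 10, 110, 111 (codeword lengths satisfy the Kraft inequality).
probs = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]

H = -sum(p * math.log2(p) for p in probs)  # source entropy in bits
L = average_code_length(probs, lengths)

print(f"entropy H = {H:.3f} bits, average code length L = {L:.3f} bits")
# Shannon's noiseless coding theorem gives H <= L < H + 1; the
# probabilities here are dyadic, so this code attains L = H exactly.
print(f"binary_entropy(0.5) = {binary_entropy(0.5):.3f} bits")  # maximum: 1 bit
```

For this source both quantities evaluate to 1.75 bits, which is the "interesting observation" in miniature: a binary source is most uncertain at p = 0.5 (one full bit per symbol), and a well-matched prefix code can drive the average code length down to the entropy but no further.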