The Journey from Entropy to Generalized Maximum Entropy

Amjad D. Al-Nasser
Journal of Quantitative Methods · Published 13 March 2019 · DOI: 10.29145/2019/JQM/030101

Abstract

We are currently witnessing a re-evaluation of huge data resources that must be analyzed carefully in order to draw the right conclusions about world problems. Such big data are statistically risky because the data are a combination of (useful) signal and (useless) noise, and can be regarded as unorganized facts that need to be filtered and processed. Using only the signal and discarding the noise means that the data are restructured and reorganized into something useful, which is called information. So, for any data set, we need only the information. In the context of information theory, entropy is used as a statistical measure to quantify the maximum amount of information in a random event.
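As a minimal illustration of the idea in the abstract (a sketch, not taken from the paper itself), Shannon entropy quantifies the information content of a discrete random event: it is maximized when all outcomes are equally likely, and shrinks as the distribution becomes more predictable.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), measured in bits.

    Zero-probability outcomes contribute nothing and are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin achieves the maximum entropy for two outcomes: 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so it carries less information.
print(shannon_entropy([0.9, 0.1]))   # about 0.469 bits
```

The uniform case giving the maximum is exactly the sense in which entropy measures "the maximum amount of information in a random event" mentioned above.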