{"title":"The entropy corrected geometric Brownian motion","authors":"Rishabh Gupta, Ewa Drzazga-Szczȩśniak, Sabre Kais, Dominik Szczȩśniak","doi":"arxiv-2403.06253","DOIUrl":null,"url":null,"abstract":"The geometric Brownian motion (GBM) is widely employed for modeling\nstochastic processes, yet its solutions are characterized by the log-normal\ndistribution. This comprises predictive capabilities of GBM mainly in terms of\nforecasting applications. Here, entropy corrections to GBM are proposed to go\nbeyond log-normality restrictions and better account for intricacies of real\nsystems. It is shown that GBM solutions can be effectively refined by arguing\nthat entropy is reduced when deterministic content of considered data\nincreases. Notable improvements over conventional GBM are observed for several\ncases of non-log-normal distributions, ranging from a dice roll experiment to\nreal world data.","PeriodicalId":501139,"journal":{"name":"arXiv - QuantFin - Statistical Finance","volume":"7 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-03-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - QuantFin - Statistical Finance","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2403.06253","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The geometric Brownian motion (GBM) is widely employed for modeling stochastic processes, yet its solutions are characterized by the log-normal distribution. This compromises the predictive capabilities of GBM, mainly in forecasting applications. Here, entropy corrections to GBM are proposed to go beyond the log-normality restriction and to better account for the intricacies of real systems. It is shown that GBM solutions can be effectively refined by arguing that entropy is reduced when the deterministic content of the considered data increases. Notable improvements over conventional GBM are observed for several cases of non-log-normal distributions, ranging from a dice-roll experiment to real-world data.
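
As a point of reference for the log-normality restriction the abstract refers to, the following minimal Python sketch (not taken from the paper) simulates conventional GBM via its exact solution and checks that log(S_T) has the normal moments predicted by the standard model. The entropy-correction procedure itself is not specified in the abstract and is therefore not shown; all parameter values (S0, mu, sigma, horizon, path counts) are illustrative assumptions.

```python
# Minimal sketch of the conventional GBM baseline (not the paper's corrected model).
# GBM: dS_t = mu * S_t dt + sigma * S_t dW_t, with exact solution
# S_t = S_0 * exp((mu - sigma^2 / 2) t + sigma W_t), so S_t is log-normal.
import numpy as np

rng = np.random.default_rng(0)

S0, mu, sigma = 100.0, 0.05, 0.2      # initial value, drift, volatility (assumed)
T, n_steps, n_paths = 1.0, 252, 10_000
dt = T / n_steps

# Exact per-step update: log S_{t+dt} - log S_t = (mu - sigma^2/2) dt + sigma sqrt(dt) Z
Z = rng.standard_normal((n_paths, n_steps))
log_increments = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * Z
S_T = S0 * np.exp(log_increments.sum(axis=1))

# Under GBM, log(S_T) is normal with the moments below, i.e. S_T is log-normal.
print("sample mean of log(S_T):", np.log(S_T).mean())
print("theoretical mean:       ", np.log(S0) + (mu - 0.5 * sigma**2) * T)
print("sample std of log(S_T): ", np.log(S_T).std())
print("theoretical std:        ", sigma * np.sqrt(T))
```

Data whose distribution departs from this log-normal form (such as the dice-roll experiment and the real-world cases mentioned in the abstract) is exactly the regime in which the proposed entropy corrections are reported to improve on the conventional GBM forecast.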