Pub Date: 2024-04-13 | DOI: 10.1080/01966324.2024.2310648
F. Prataviera, G. Cordeiro
{"title":"The Unit Omega Distribution, Properties and Its Application","authors":"F. Prataviera, G. Cordeiro","doi":"10.1080/01966324.2024.2310648","DOIUrl":"https://doi.org/10.1080/01966324.2024.2310648","url":null,"abstract":"","PeriodicalId":35850,"journal":{"name":"American Journal of Mathematical and Management Sciences","volume":"85 10","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-04-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140707680","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-04-05 | DOI: 10.1080/01966324.2024.2311286
S. Dey, R. Al-mosawi
Classical and Bayesian Inference of Unit Gompertz Distribution Based on Progressively Type II Censored Data
In this article, we study estimation methods for the parameters of a unit Gompertz distribution based on two frequentist approaches and a Bayesian approach, using progressively Type II censored data. In the frequentist setting, besides conventional maximum likelihood estimation, the maximum product of spacings method is proposed as an alternative. Maximum likelihood estimates are obtained with both the Newton-Raphson and stochastic expectation-maximization algorithms, while Bayes estimates of the unknown parameters are obtained from both the traditional likelihood function and the product of spacings function. Moreover, approximate confidence intervals for the parameters are derived under the two frequentist approaches, and highest posterior density credible intervals are obtained under the Bayesian approaches using MCMC. In addition, the percentile bootstrap technique is used to compute confidence intervals. The proposed estimators are compared numerically with respect to various criteria using Monte Carlo simulations. Further, an optimal censoring scheme is suggested under different optimality criteria. Besides, one-sample and two-sample prediction problems based on the observed sample, together with the corresponding predictive intervals under the Bayesian framework, are discussed. Finally, maximum flood level data are analyzed to demonstrate the applicability of the proposed methods in a real-life scenario.
Pub Date: 2024-04-03 | DOI: 10.1080/01966324.2024.2309387
Manika Agarwal, P. K. Tripathi
{"title":"Classical and Bayes Analyses of Autoregressive Model with Heavy-Tailed Error","authors":"Manika Agarwal, P. K. Tripathi","doi":"10.1080/01966324.2024.2309387","DOIUrl":"https://doi.org/10.1080/01966324.2024.2309387","url":null,"abstract":"","PeriodicalId":35850,"journal":{"name":"American Journal of Mathematical and Management Sciences","volume":"52 13","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140748837","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-04-03 | DOI: 10.1080/01966324.2024.2311293
A. Barbiero, Asmerilda Hitaj
{"title":"An Alternative Discrete Analogue of the Half-Logistic Distribution Based on Minimization of a Distance between Cumulative Distribution Functions","authors":"A. Barbiero, Asmerilda Hitaj","doi":"10.1080/01966324.2024.2311293","DOIUrl":"https://doi.org/10.1080/01966324.2024.2311293","url":null,"abstract":"","PeriodicalId":35850,"journal":{"name":"American Journal of Mathematical and Management Sciences","volume":"138 2","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140746765","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-01-01 | DOI: 10.1080/01966324.2023.2275080
H. Khatun, M. Tripathy, Nabendu Pal
{"title":"Testing on the Quantiles of a Single Normal Population in the Presence of Several Normal Populations with a Common Variance","authors":"H. Khatun, M. Tripathy, Nabendu Pal","doi":"10.1080/01966324.2023.2275080","DOIUrl":"https://doi.org/10.1080/01966324.2023.2275080","url":null,"abstract":"","PeriodicalId":35850,"journal":{"name":"American Journal of Mathematical and Management Sciences","volume":"22 2","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139395709","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-11-21 | DOI: 10.1080/01966324.2023.2275082
Yuqi Zheng, Tianrui Ye, Wenhao Gui
{"title":"Parameter Estimation of Inverted Exponentiated Half-Logistic Distribution under Progressive Type-II Censored Data with Competing Risks","authors":"Yuqi Zheng, Tianrui Ye, Wenhao Gui","doi":"10.1080/01966324.2023.2275082","DOIUrl":"https://doi.org/10.1080/01966324.2023.2275082","url":null,"abstract":"","PeriodicalId":35850,"journal":{"name":"American Journal of Mathematical and Management Sciences","volume":"64 2","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-11-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139253175","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-10-02 | DOI: 10.1080/01966324.2023.2256436
O. M. Khaled, H. M. Barakat, N. Khalil Rakha
Extreme Value Index Estimation in the Extreme Value Theorem under Non-Linear Normalization
The primary goal of this study is to broaden the application of the extreme value theorem by developing the modeling of extreme values under non-linear normalization. The study addresses the estimation of the (non-zero) extreme value index under power and exponential normalization. Under exponential normalization, counterparts of the Hill estimators of the extreme value index under linear normalization are proposed, based on the characteristics of the extreme value index, the threshold, and the data themselves. In addition, based on the generalized Pareto distributions, more compact and flexible Hill estimators are proposed under power and exponential normalization. The proposed estimators allow the threshold to be chosen more flexibly and reduce the waste of data. A thorough simulation study implemented in R examines the effectiveness of the suggested estimators.
Keywords: extreme value theorem; generalized extreme value distribution; generalized Pareto distributions; Hill estimators; maximum likelihood method; non-linear normalization
Acknowledgements: The authors are immensely grateful to Professor Madhuri S. Mulekar, the Editor-in-Chief of the American Journal of Mathematical and Management Sciences, as well as the anonymous referees, for their careful reading of the manuscript and their constructive, detailed comments.
Disclosure Statement: No potential conflict of interest was reported by the author(s).
Data Availability Statement: The simulated data used to support the findings of this study are included within the article.
Pub Date: 2023-10-02 | DOI: 10.1080/01966324.2023.2255316
Eriky S. Gomes, Frederico R. B. Cruz, Saroja Kumar Singh
Algorithms for Determination of Sample Sizes for Bayesian Estimations in Single-Server Markovian Queues
Although the single-server Markovian queue is one of the simplest models in queueing theory, it has important practical applications. One of the initial steps in its application is the determination of the sample sizes needed for interval estimation of its parameters, including the traffic intensity, defined as the ratio of the arrival rate to the service rate. In this article, we develop Bayesian algorithms to determine the size of the samples that must be collected to guarantee a pre-specified mean amplitude or mean coverage for the traffic intensity. These samples consist of the numbers of arrivals during service times, a practical way to collect data. Monte Carlo simulations attest to the efficiency and effectiveness of the proposed algorithms.
Keywords: Bayesian inference; credible region; Markovian queues; sample size
Acknowledgments: We would like to thank the referees and the Editor-in-Chief for their detailed and insightful comments, which led to a much-improved manuscript.
Authors' Contributions: ESG, FRBC, and SKS contributed equally to the design and implementation of the research, to the analysis of the results, and to the final writing of the manuscript.
Disclosure Statement: No potential conflict of interest was reported by the author(s).
Data Availability Statement: The data used to support the findings of this study are included in the article.
Code Availability Statement: The proposed algorithms can be encoded in the reader's favorite programming language. The R scripts can be obtained from the authors upon request.
Funding: ESG acknowledges CAPES (Coordenação de Aperfeiçoamento de Pessoal de Nível Superior, grant 88887.823719/2023-00 under Programa de Demanda Social at UFMG). FRBC acknowledges FAPEMIG (Fundação de Amparo à Pesquisa do Estado de Minas Gerais, grant CEX-PPM-00564-17) and CNPq (Conselho Nacional de Desenvolvimento Científico e Tecnológico, grant 305442/2022-8) for partial financial support. SKS acknowledges OSHEC (Odisha State Higher Education Council) for financial support under the OURIIP Seed Fund, Govt. of Odisha, India, reference no. 22SF/ST/116 (Sanction Order Number 174/144/OSHEC).
Pub Date: 2023-09-09 | DOI: 10.1080/01966324.2023.2239963
Ashok Kumar Pathak, Mohd. Arshad, Qazi J. Azhad, Mukti Khetan, Arvind Pandey
A Novel Bivariate Generalized Weibull Distribution with Properties and Applications
The univariate Weibull distribution is a well-known lifetime distribution that has been widely used in reliability and survival analysis. In this paper, we introduce a new family of bivariate generalized Weibull (BGW) distributions whose univariate marginals are exponentiated Weibull distributions. Various statistical quantities, such as the marginal and conditional distributions, conditional expectation, product moments, correlation, and a measure of component reliability, are derived. Several measures of dependence and statistical properties, along with aging properties, are examined. Further, the copula associated with the BGW distribution and its important properties are also considered. Maximum likelihood and Bayesian estimation methods are employed to estimate the unknown parameters of the model. A Monte Carlo simulation and a real data study are carried out to demonstrate the performance of the estimators, and the results show the effectiveness of the distribution in real-life situations.
Pub Date: 2023-09-04 | DOI: 10.1080/01966324.2023.2239961
Cenk Çalışkan
The Economic Order Quantity Model under Compound Interest with Planned Backorders
In the classical EOQ model, the annual inventory holding cost per unit is defined as a fixed percentage of the unit price of the item. Part of the inventory holding cost is the opportunity cost of the capital tied up in inventory, based on the best available interest rate or the rate of return of the best alternative investment, and it is assumed to accrue as simple interest. In finance and banking, however, compound interest is the standard and simple interest is very rare, so an opportunity cost based on simple interest is not realistic. To overcome this problem, a number of net present value (NPV)-based approaches have been proposed in the literature, but they all recommend the standard EOQ formula as an approximate optimal solution. In this research, we propose an extension of the basic model that uses compound interest for the opportunity cost and allows planned backorders. A closed-form optimal solution is not possible for this model because of the exponential terms in the total cost function. We therefore develop a reasonable approximate model and derive its optimal solution, which is intuitive and different from the standard EOQ solution. We show that our solution is very close to the solution of the exact model.