Smoothness and Gaussian Density Estimates for Stochastic Functional Differential Equations with Fractional Noise
N. V. Tan
Pub Date: 2020-07-01 | DOI: 10.19139/soic-2310-5070-784 | Statistics, Optimization & Information Computing, 8(1), 822–833
In this paper, we study the density of the solution to a class of stochastic functional differential equations driven by fractional Brownian motion. Based on the techniques of Malliavin calculus, we prove the smoothness and establish upper and lower Gaussian estimates for the density.
A flexible ranked set sampling schemes: Statistical analysis on scale parameter
Abbas Eftekharian, M. Razmkhah, J. Ahmadi
Pub Date: 2020-07-01 | DOI: 10.19139/SOIC-2310-5070-812 | Statistics, Optimization & Information Computing, 9(1), 189–203
A flexible ranked set sampling scheme that encompasses various existing sampling methods is proposed. This scheme may be used to minimize both the ranking error and the sampling cost. Based on data obtained from this scheme, maximum likelihood estimation and the Fisher information are studied for the scale family of distributions. The existence and uniqueness of the maximum likelihood estimator of the scale parameter are investigated for the exponential and normal distributions. Moreover, the optimal scheme is derived via simulation and numerical computations.
Tail distribution of the integrated Jacobi diffusion process
N. Dung, Trinh Nhu Quynh
Pub Date: 2020-07-01 | DOI: 10.19139/soic-2310-5070-760 | Statistics, Optimization & Information Computing, 8(1), 790–800
In this paper, we study the distribution of the integrated Jacobi diffusion processes with Brownian noise and fractional Brownian noise. Based on techniques of Malliavin calculus, we develop a unified method to obtain explicit estimates for the tail distribution of these integrated diffusions.
Comparison of two sampling schemes in estimating the stress-strength reliability under the proportional reversed hazard rate model
A. Sadeghpour, A. Nezakati, M. Salehi
Pub Date: 2020-06-30 | DOI: 10.19139/SOIC-2310-5070-781 | Statistics, Optimization & Information Computing, 9(1), 82–98
In this paper, point and interval estimation of the stress-strength reliability R based on the lower record ranked set sampling (RRSS) scheme under the proportional reversed hazard rate model is considered. Maximum likelihood, uniformly minimum variance unbiased, and Bayesian estimators of R are derived. These point estimators are compared with their counterparts obtained under a well-known sampling scheme for record values, the inverse sampling scheme. Various confidence intervals for the parameter R are constructed and compared in a simulation study. Moreover, the RRSS scheme is compared with ordinary records in the case of interval estimation. We observe that the proposed point and interval estimators perform well in estimating R based on RRSS. We also prove that the calculations do not depend on the baseline distribution in the proportional reversed hazard rate model. Finally, a data set is analyzed for illustrative purposes.
The Weibull Birnbaum-Saunders Distribution And Its Applications
Lazhar Benkhelifa
Pub Date: 2020-06-23 | DOI: 10.19139/SOIC-2310-5070-887 | Statistics, Optimization & Information Computing, 9(1), 61–81
A new lifetime model with four positive parameters, called the Weibull Birnbaum-Saunders distribution, is proposed. The proposed model extends the Birnbaum-Saunders distribution and provides great flexibility in modeling data in practice. Some mathematical properties of the new distribution are obtained, including expansions for the cumulative distribution and density functions, moments, the generating function, mean deviations, order statistics, and reliability. The model parameters are estimated by maximum likelihood. A simulation study is presented to assess the performance of the maximum likelihood estimates. The flexibility of the new model is illustrated by applying it to two real data sets.
CQ-free optimality conditions and strong dual formulations for a special conic optimization problem
O. Kostyukova, T. Tchemisova
Pub Date: 2020-06-19 | DOI: 10.19139/soic-2310-5070-915 | Statistics, Optimization & Information Computing, 8(1), 668–683
In this paper, we consider a special class of conic optimization problems consisting of set-semidefinite (or K-semidefinite) programming problems, where the set K is a polyhedral convex cone. For these problems, we introduce the concept of immobile indices and study the properties of the set of normalized immobile indices and of the feasible set. This study leads to the main result of the paper: new first-order optimality conditions formulated and proved in the form of a criterion. The optimality conditions are explicit and do not use any constraint qualifications. For the case of a linear cost function, we reformulate the K-semidefinite problem in a regularized form and construct its dual. We show that the pair of primal and dual regularized problems satisfies the strong duality relation, that is, the duality gap vanishes.
Inferences for Weibull parameters under progressively first-failure censored data with binomial random removals
S. Ashour, A. El-sheikh, A. Elshahhat
Pub Date: 2020-06-18 | DOI: 10.19139/SOIC-2310-5070-611 | Statistics, Optimization & Information Computing
In this paper, Bayesian and non-Bayesian estimation of a two-parameter Weibull lifetime model in the presence of progressively first-failure censored data with binomial random removals is considered. Based on the normal approximation to the asymptotic distribution of the maximum likelihood estimators, two-sided approximate confidence intervals for the unknown parameters are constructed. Using gamma conjugate priors, several Bayes estimates and associated credible intervals are obtained under the squared error loss function. The proposed estimators cannot be expressed in closed form and are evaluated numerically by a suitable iterative procedure. A Bayesian approach is developed using Markov chain Monte Carlo techniques to generate samples from the posterior distributions and, in turn, to compute the Bayes estimates and associated credible intervals. To analyze the performance of the proposed estimators, a Monte Carlo simulation study is conducted. Finally, a real data set is discussed for illustration purposes.
Overdisp: A Stata (and Mata) Package for Direct Detection of Overdispersion in Poisson and Negative Binomial Regression Models
Luiz Paulo Fávero, P. Belfiore, Marco Aurélio dos Santos, R. F. Souza
Pub Date: 2020-06-14 | DOI: 10.19139/soic-2310-5070-557 | Statistics, Optimization & Information Computing, 8(1), 773–789
Stata has several procedures that can be used to analyze count-data regression models and, more specifically, to study the behavior of the dependent variable conditional on explanatory variables. Identifying overdispersion in count-data models is one of the most important of these procedures, since it allows researchers to choose correctly between estimators such as Poisson and negative binomial, given the distribution of the dependent variable. The main purpose of this paper is to present a new command for the identification of overdispersion in the data as an alternative to the procedure presented by Cameron and Trivedi [5], since it identifies overdispersion directly in the data, without the need to first estimate a specific type of count-data model. When estimating Poisson or negative binomial regression models in which the dependent variable is quantitative, with discrete and non-negative values, the new Stata package overdisp helps researchers to propose more consistent and adequate models directly. As a second contribution, we also present a simulation showing the consistency of the overdispersion test implemented by the overdisp command. The findings show that, if the test indicates equidispersion in the data, there is consistent evidence that the distribution of the dependent variable is in fact Poisson. If, on the other hand, the test indicates overdispersion, researchers should investigate more deeply whether the dependent variable exhibits better adherence to the Poisson-Gamma (negative binomial) distribution.
A Note on CCMV Portfolio Optimization Model with Short Selling and Risk-neutral Interest Rate
T. Khodamoradi, M. Salahi, A. Najafi
Pub Date: 2020-06-14 | DOI: 10.19139/soic-2310-5070-890 | Statistics, Optimization & Information Computing, 8(1), 740–748
In this paper, we first discuss some drawbacks of the cardinality-constrained mean-variance (CCMV) portfolio optimization model with short selling and a risk-neutral interest rate when the lower and upper bounds on the asset contributions are −1/K and 1/K, where K denotes the number of assets in the portfolio. Second, we present an improved variant that uses absolute returns instead of the returns to include short selling in the model. Finally, some numerical results on the S&P 500 index, Information Technology, and the MIBTEL index data sets are provided in terms of returns and Sharpe ratios to compare the proposed models with those in the literature.
A Different Approach for Choosing a Threshold in Peaks over Threshold
A. Verster, L. Raubenheimer (Department of Mathematical Statistics and Actuarial Science, University of the Free State, Bloemfontein, South Africa; School of Mathematical and Statistical Sciences, North-West University, Potchefstroom)
Pub Date: 2020-06-10 | DOI: 10.19139/soic-2310-5070-976 | Statistics, Optimization & Information Computing
In Extreme Value methodology the choice of threshold plays an important role in the efficient modelling of observations exceeding the threshold. The threshold must be chosen high enough to ensure an unbiased estimate of the extreme value index, but choosing the threshold too high results in uncontrolled variances. This paper investigates a generalized model that can assist in choosing optimal threshold values in the positive domain of the extreme value index γ. A Bayesian approach is taken by deriving a posterior distribution for the unknown generalized parameter. The properties of the posterior distribution allow a threshold to be chosen without visual inspection.