Statistical inferences for the Weibull distribution under adaptive progressive type-II censoring plan and their application in wind speed data analysis
Pub Date: 2023-07-09 | DOI: 10.19139/soic-2310-5070-1501
J. Kazempoor, A. Habibirad, Adel Ahmadi Nadi, Gholam Reza Mohtashami Borzadaran
This paper develops four well-known statistical inference approaches for the principal parameters of the two-parameter Weibull distribution, including its hazard, quantile, and survival functions, under an adaptive progressive type-II censoring plan. The approaches comprise likelihood and approximate likelihood methods, the Bayesian approach, the bootstrap procedure, and a new conditional technique. To construct Bayesian point estimators and credible intervals, Markov chain Monte Carlo methods, namely the Metropolis-Hastings and Gibbs sampling algorithms, are used. The Bayesian estimators are developed under conjugate and non-conjugate priors and under symmetric and asymmetric loss functions. In addition, a conditional estimation technique with interesting distributional characteristics is introduced. The methods are compared extensively through a series of simulations, and the comparative study shows the superiority of the conditional approach over the others. Finally, the developed methods are applied to the analysis of well-known wind speed data.
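The MCMC step described in the abstract can be illustrated with a minimal random-walk Metropolis-Hastings sampler. This is a sketch only: it assumes complete (uncensored) Weibull data and flat priors on the log-parameters; the paper's adaptive progressive type-II censored likelihood and its conjugate/non-conjugate priors would replace the pieces marked in the comments.

```python
import numpy as np

rng = np.random.default_rng(42)

def weibull_loglik(shape, scale, x):
    """Complete-sample Weibull log-likelihood. Under the paper's adaptive
    progressive type-II censoring plan this would be replaced by the
    censored-data likelihood."""
    if shape <= 0 or scale <= 0:
        return -np.inf
    z = x / scale
    return np.sum(np.log(shape / scale) + (shape - 1) * np.log(z) - z ** shape)

def metropolis_hastings(x, n_iter=20_000, step=0.1):
    """Random-walk MH on (log shape, log scale), flat priors on the logs."""
    theta = np.zeros(2)                      # log-parameters, i.e. start at (1, 1)
    cur = weibull_loglik(*np.exp(theta), x)
    draws = []
    for _ in range(n_iter):
        prop = theta + step * rng.standard_normal(2)
        new = weibull_loglik(*np.exp(prop), x)
        if np.log(rng.uniform()) < new - cur:   # Metropolis accept/reject
            theta, cur = prop, new
        draws.append(np.exp(theta))
    return np.array(draws[n_iter // 2:])        # discard burn-in

x = 3.0 * rng.weibull(2.0, size=100)            # synthetic data: shape 2, scale 3
draws = metropolis_hastings(x)
print("posterior means (shape, scale):", draws.mean(axis=0))
lo, hi = np.percentile(draws, [2.5, 97.5], axis=0)
print("95% credible intervals:", list(zip(lo, hi)))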
Confidence intervals from local minimums of objective function
Pub Date: 2023-07-08 | DOI: 10.21203/rs.3.rs-2357034/v1
A. Dermoune, Daoud Ounaissi, Yousri Slaoui
The weighted median plays a central role in least absolute deviations (LAD) regression. We propose a nonlinear regression based on LAD. Our objective function $f(a, l, s)$ is non-convex with respect to the parameters $a$, $l$, $s$, and is such that for each fixed $l$, $s$ the minimizer of $a \to f(a, l, s)$ is the weighted median $\mathrm{med}(x(l, s), w(l, s))$ of a sequence $x(l, s)$ endowed with the weights $w(l, s)$ (all depending on $l$ and $s$). We analyse and compare theoretically the minimizers of the function $(a, l, s) \to f(a, l, s)$ and of the surface $(l, s) \to f(\mathrm{med}(x(l, s), w(l, s)), l, s)$. As a numerical application, we fit the daily COVID-19 infections in China using a Gaussian model, and we derive a confidence interval for the daily infections from each local minimum.
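The key fact the abstract relies on — that for fixed $l$, $s$ the inner minimizer over $a$ is a weighted median — can be checked numerically. A minimal sketch with hypothetical data and weights (not the paper's COVID-19 series):

```python
import numpy as np

def weighted_median(x, w):
    """Weighted median: a minimizer of a -> sum_i w_i * |x_i - a|."""
    order = np.argsort(x)
    xs, ws = x[order], w[order]
    cdf = np.cumsum(ws) / ws.sum()
    return xs[np.searchsorted(cdf, 0.5)]    # first point where weighted cdf >= 1/2

rng = np.random.default_rng(0)
x = rng.normal(size=101)
w = rng.uniform(0.1, 1.0, size=101)
a_star = weighted_median(x, w)

# Verify against a brute-force grid search over a.
obj = lambda a: np.sum(w * np.abs(x - a))
grid_best = min(np.linspace(x.min(), x.max(), 2001), key=obj)
print(f"weighted median {a_star:.4f}, grid search {grid_best:.4f}")
print(f"objective: {obj(a_star):.6f} <= {obj(grid_best):.6f}")
```

Because the objective is piecewise linear in $a$, its minimum is attained at a data point, which is why the weighted median never loses to the grid search.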
A New Weighted Half-Logistic Distribution: Properties, Applications and Different Method of Estimations
Pub Date: 2023-06-03 | DOI: 10.19139/soic-2310-5070-1314
M. Hashempour, M. Alizadeh
In this paper, we introduce a new two-parameter lifetime distribution based on the arctan function, called the weighted half-logistic (WHL) distribution. Theoretical properties of this model, including the quantile function, extreme values, linear representations of the pdf and cdf, moments, conditional moments, the moment generating function, and the mean deviation, are derived and studied in detail. The maximum likelihood estimates of the parameters are compared with those from various other estimation methods in a simulation study. Finally, two real data sets show that this model provides a better fit than other competitive known models.
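The WHL density itself is not reproduced in the abstract, so as an illustration of the "different methods of estimation" workflow, here is a hedged sketch that fits scipy's standard half-logistic as a stand-in, by maximum likelihood and by least squares on the cdf; the paper's arctan-based WHL pdf/cdf would be substituted for the scipy distribution.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
# Stand-in model: scipy's half-logistic. Replace with the paper's WHL pdf/cdf.
data = stats.halflogistic.rvs(scale=2.0, size=200, random_state=rng)

# Method 1: maximum likelihood (location fixed at zero, scale free).
_, scale_mle = stats.halflogistic.fit(data, floc=0)

# Method 2: ordinary least squares on the cdf, a common alternative estimator.
def cdf_ls(scale):
    xs = np.sort(data)
    emp = (np.arange(1, len(xs) + 1) - 0.375) / (len(xs) + 0.25)  # plotting positions
    return np.sum((stats.halflogistic.cdf(xs, scale=scale) - emp) ** 2)

scale_ols = minimize_scalar(cdf_ls, bounds=(1e-3, 20.0), method="bounded").x
print(f"MLE scale: {scale_mle:.3f}   OLS scale: {scale_ols:.3f}   (true 2.0)")
```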
On the Use of the Power Transformation Models to Improve the Temperature Time Series
Pub Date: 2023-06-03 | DOI: 10.19139/soic-2310-5070-1333
S. A. Othman, Haithem Taha Mohammed Ali
The aim of this paper is to select an appropriate ARIMA model for a time series after transforming the original responses. Box-Cox and Yeo-Johnson power transformation models were applied to the response variables of two time series datasets of average temperatures, and appropriate ARIMA models were then diagnosed and built for each time series. The authors treat the results of the model fitting as a package when deciding on the best model, diagnosing the effect of the data transformation on the normality of the response, the significance of the estimated model parameters, forecastability, and the behavior of the residuals. The authors conclude that the Yeo-Johnson model was more flexible in smoothing the data and contributed to reaching a simple model with good forecastability.
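A minimal sketch of the transform-then-fit workflow, using synthetic monthly-style temperature data and an illustrative ARIMA(1,0,1) order; the paper's datasets and chosen orders are not reproduced here.

```python
import numpy as np
from scipy import stats
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(7)
t = np.arange(240)
# Synthetic skewed "average temperature" series with an annual cycle.
temps = 20 + 5 * np.sin(2 * np.pi * t / 12) + rng.gamma(2.0, 1.0, size=240)

bc, lam_bc = stats.boxcox(temps)        # requires strictly positive data
yj, lam_yj = stats.yeojohnson(temps)    # defined for any real-valued data
print(f"Box-Cox lambda: {lam_bc:.3f}, Yeo-Johnson lambda: {lam_yj:.3f}")

for name, series in [("Box-Cox", bc), ("Yeo-Johnson", yj)]:
    fit = ARIMA(series, order=(1, 0, 1)).fit()
    # Candidate transformations are compared via information criteria and
    # residual diagnostics, in the spirit of the paper's "package" approach.
    print(f"{name}: AIC = {fit.aic:.2f}")
```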
Self-Scheduling of a Generation Company with Carbon Emission Trading
Pub Date: 2023-05-11 | DOI: 10.19139/soic-2310-5070-1763
Sidong Liu, Handong Cao, Xijian Wang
A carbon emission trading self-scheduling (CETSS) model is proposed that considers not only carbon emission allowance constraints but also carbon emission trading. A new method for solving CETSS problems is presented based on piecewise linearisation and second-order cone linearisation. The effectiveness and validity of the proposed model and method are illustrated on systems of 10 to 100 units over a 24-hour horizon.
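The piecewise linearisation step can be sketched in isolation. This example linearises a hypothetical quadratic fuel-cost curve for one unit; the full CETSS model with allowance constraints, trading terms, and the second-order cone part is in the paper.

```python
import numpy as np

def piecewise_linearize(cost, p_min, p_max, n_seg):
    """Break a convex cost curve on [p_min, p_max] into linear segments.

    In a self-scheduling MILP each segment gets its own continuous generation
    variable bounded by the segment width, so the quadratic fuel cost enters
    the model linearly."""
    bp = np.linspace(p_min, p_max, n_seg + 1)   # segment breakpoints
    slopes = np.diff(cost(bp)) / np.diff(bp)    # secant slope per segment
    return bp, slopes

# Illustrative quadratic fuel cost a + b*p + c*p^2 (coefficients hypothetical).
a, b, c = 100.0, 20.0, 0.05
cost = lambda p: a + b * p + c * p ** 2
bp, slopes = piecewise_linearize(cost, p_min=50, p_max=300, n_seg=5)

# Evaluate the piecewise approximation at an interior operating point.
p = 137.0
k = np.searchsorted(bp, p) - 1
approx = cost(bp[0]) + np.sum(slopes[:k] * np.diff(bp)[:k]) + slopes[k] * (p - bp[k])
print(f"exact {cost(p):.2f}  piecewise {approx:.2f}")
```

Since the cost is convex, the secant approximation overestimates slightly inside each segment; more segments shrink the gap at the price of more variables.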
Estimation problem for continuous time stochastic processes with periodically correlated increments
Pub Date: 2023-04-24 | DOI: 10.19139/soic-2310-5070-1792
M. Luz, M. Moklyachuk
We deal with the problem of optimal estimation of linear functionals constructed from unobserved values of a continuous-time stochastic process with periodically correlated increments, based on past observations of the process. To solve the problem, we construct a sequence of stochastic functions corresponding to the process that forms an infinite-dimensional vector stationary increment sequence. In the case of a known spectral density of the stationary increment sequence, we obtain formulas for calculating the mean square errors and the spectral characteristics of the optimal estimates of the functionals. Formulas determining the least favorable spectral densities and the minimax (robust) spectral characteristics of the optimal linear estimates of the functionals are derived for the case where only sets of admissible spectral densities are given.
On Past Extropy and Negative Cumulative Extropy Properties of Ranked Set Sampling and Maximum Ranked Set Sampling with Unequal Samples
Pub Date: 2023-04-21 | DOI: 10.19139/soic-2310-5070-1453
Irshad M R, Maya R, Archana K, Tahmasebi S
Ranked set sampling is considered an alternative to simple random sampling, and maximum ranked set sampling is a very useful modification of ranked set sampling. In this paper, we focus on the information content of ranked set sampling and maximum ranked set sampling with unequal samples in terms of the past extropy measure, and we also consider the negative cumulative extropy and its dynamic version under maximum ranked set sampling and simple random sampling designs. We compare ranked set sampling data and maximum ranked set sampling data with simple random sampling and with each other. We also obtain a new discrimination information measure among simple random sampling, ranked set sampling, and maximum ranked set sampling data based on the past extropy measure.
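A sketch of two ingredients the abstract builds on: generating a ranked set sample and evaluating the classical extropy $J(X) = -\frac{1}{2}\int f^2(x)\,dx$ numerically. The past extropy and negative cumulative extropy variants used in the paper, and the unequal-samples designs, are not reproduced here.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

def extropy(pdf, lo, hi):
    """Classical extropy J(X) = -0.5 * integral of f(x)^2. The paper's past
    extropy and negative cumulative extropy variants are not reproduced here."""
    return -0.5 * quad(lambda t: pdf(t) ** 2, lo, hi)[0]

def ranked_set_sample(dist, k, cycles, rng):
    """RSS: from the i-th set of k independent draws keep the i-th order statistic."""
    return np.array([np.sort(dist.rvs(size=k, random_state=rng))[i]
                     for _ in range(cycles) for i in range(k)])

rng = np.random.default_rng(3)
print("extropy of Exp(1):", extropy(stats.expon.pdf, 0, np.inf))   # exact: -1/4

# RSS typically carries more information per observation than SRS of the same
# size; compare the variability of the sample mean across replications.
reps = 500
rss_means = [ranked_set_sample(stats.expon, 4, 5, rng).mean() for _ in range(reps)]
srs_means = [stats.expon.rvs(size=20, random_state=rng).mean() for _ in range(reps)]
print("variance of mean  RSS:", np.var(rss_means), " SRS:", np.var(srs_means))
```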
A Primal-Dual Interior-Point Algorithm Based on a Kernel Function with a New Barrier Term
Pub Date: 2023-04-21 | DOI: 10.19139/soic-2310-5070-1381
Safa Guerdouh, W. Chikouche, Imene Touil
In this paper, we propose a path-following interior-point method (IPM) for solving linear optimization (LO) problems based on a new kernel function (KF). The latter differs from other KFs in having an exponential-hyperbolic barrier term of the hyperbolic type recently developed by I. Touil and W. Chikouche \cite{filomat2021,acta2022}. The complexity analysis for large-update primal-dual IPMs based on this KF yields an $\mathcal{O}\left(\sqrt{n}\log^2 n\log\frac{n}{\epsilon}\right)$ iteration bound, which improves the classical iteration bound. For small-update methods, the proposed algorithm enjoys the favorable iteration bound $\mathcal{O}\left(\sqrt{n}\log\frac{n}{\epsilon}\right)$. We back up these results with preliminary numerical tests showing that our algorithm outperforms other algorithms, consistent with its better theoretical convergence complexity. To our knowledge, this is the first feasible primal-dual interior-point algorithm based on an exponential-hyperbolic KF.
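Dropping constants, the improvement of the large-update bound over the classical $\mathcal{O}\left(n\log\frac{n}{\epsilon}\right)$ bound can be seen by comparing growth rates. A rough numeric illustration only: O-bounds hide constants, so individual values (especially at small $n$) are not meaningful, only the asymptotic trend is.

```python
import numpy as np

def classical_bound(n, eps):
    """Classical large-update iteration bound, O(n log(n/eps)), constants dropped."""
    return n * np.log(n / eps)

def new_kf_bound(n, eps):
    """Large-update bound from the new kernel function,
    O(sqrt(n) log^2(n) log(n/eps)), constants dropped."""
    return np.sqrt(n) * np.log(n) ** 2 * np.log(n / eps)

eps = 1e-8
for n in (10**3, 10**5, 10**7):
    print(f"n={n:>9,}: classical {classical_bound(n, eps):.2e}   "
          f"new KF {new_kf_bound(n, eps):.2e}")
# The sqrt(n) factor dominates as n grows, so the new bound falls below the
# classical one for large n despite the extra log^2(n) factor.
```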
Copy-Move Forgery Detection Using an Equilibrium Optimization Algorithm (CMFDEOA)
Pub Date: 2023-04-20 | DOI: 10.19139/soic-2310-5070-1511
Ehsan Amiri, Ahmad Mosallanejad, Amir Sheikhahmadi
Image forgery detection is a relatively new challenge. One type of image forgery is copy-move forgery, in which part of an image is copied and pasted at the most similar point elsewhere in the image. Given existing algorithms and processing software, identifying forged areas is difficult and has created challenges in various applications. The proposed method, based on the Equilibrium Optimization Algorithm (EOA), supports image forgery detection by locating forged areas. It comprises feature extraction, image segmentation, and detection of forged areas using the EOA. First, the image is converted to grayscale. Then, with the help of the discrete cosine transform (DCT), it is taken into the transform domain, and suitable features are extracted with the discrete wavelet transform (DWT). Next, the image is divided into blocks of equal size, and the similarity search is performed using the equilibrium optimization algorithm with a suitable fitness function. Copy-move forgery detection using the Equilibrium Optimization Algorithm (CMFDEOA) finds forged areas with an accuracy of about 86.21% on the IMD dataset and about 83.98% on the MICC-F600 dataset.
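A hedged sketch of the block-matching pipeline: grayscale blocks, 2-D DCT features, and a similarity search. Plain lexicographic sorting stands in for the EOA search, the DWT feature-selection step is omitted, and the block size, feature count, and thresholds are all illustrative.

```python
import numpy as np
from scipy.fft import dct

def block_features(img, bsize=8, n_coef=9):
    """Low-frequency 2-D DCT coefficients of each overlapping block
    (the top-left n_coef coefficients in row order, a zigzag-free shortcut)."""
    h, w = img.shape
    feats, coords = [], []
    for r in range(h - bsize + 1):
        for c in range(w - bsize + 1):
            block = img[r:r + bsize, c:c + bsize].astype(float)
            d = dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")
            feats.append(d.flatten()[:n_coef])
            coords.append((r, c))
    return np.array(feats), coords

def find_duplicates(feats, coords, tol=1e-3, min_shift=8):
    """Sort feature vectors lexicographically and flag near-identical neighbours.

    The paper drives this search with an Equilibrium Optimization Algorithm;
    plain sorted matching stands in for that search here."""
    order = np.lexsort(feats.T[::-1])
    pairs = []
    for i, j in zip(order[:-1], order[1:]):
        if np.allclose(feats[i], feats[j], atol=tol):
            (r1, c1), (r2, c2) = coords[i], coords[j]
            if max(abs(r1 - r2), abs(c1 - c2)) >= min_shift:  # skip near-overlaps
                pairs.append((coords[i], coords[j]))
    return pairs

rng = np.random.default_rng(5)
img = rng.integers(0, 256, size=(48, 48))
img[28:40, 30:42] = img[4:16, 6:18]          # simulate a copy-move forgery
matches = find_duplicates(*block_features(img))
print(f"{len(matches)} suspicious block pairs, e.g. {matches[:2]}")
```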
Comparison of E-Bayesian Estimators in Burr XII Model Using E-PMSE Based on Record Values
Pub Date: 2023-04-20 | DOI: 10.19139/soic-2310-5070-1687
Alla Alhamidah, Mehran Naghizadeh Qmi, A. Kiapour
In this paper, we consider the problem of E-Bayesian estimation and its expected posterior mean squared error (E-PMSE) in a Burr type XII model on the basis of record values. The Bayesian and E-Bayesian estimators are computed under different prior distributions for the hyperparameters. The E-PMSEs of the E-Bayesian estimators are calculated in order to measure the estimation risk. The performances of the E-Bayesian estimators are compared using a Monte Carlo simulation, and a real data set is analyzed to illustrate the estimation results.
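The E-Bayesian construction — averaging a Bayes estimator over a hyperprior on the prior's hyperparameters — can be shown in its simplest conjugate setting. This sketch uses exponential data with a Gamma prior on the rate, not the paper's Burr XII record-value model; the uniform hyperprior on $b$ is one common choice in the E-Bayesian literature.

```python
import numpy as np
from scipy import integrate

rng = np.random.default_rng(11)
x = rng.exponential(scale=1 / 2.0, size=30)   # synthetic data, true rate 2
n, s = len(x), x.sum()

def bayes_estimate(a, b):
    """Posterior mean of the rate under squared-error loss with a Gamma(a, b)
    prior: the posterior is Gamma(a + n, b + s)."""
    return (a + n) / (b + s)

# E-Bayesian estimator: average the Bayes estimator over a hyperprior,
# here b ~ Uniform(0, c) with the hyperparameter a held fixed.
a, c = 2.0, 3.0
e_bayes = integrate.quad(lambda b: bayes_estimate(a, b) / c, 0, c)[0]
print(f"Bayes (b = c/2): {bayes_estimate(a, c / 2):.4f}   E-Bayes: {e_bayes:.4f}")
```

The paper's E-PMSE criterion would then score such estimators by averaging the posterior mean squared error over the same hyperprior.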