{"title":"一对改进和扩展条件MLE的新先验","authors":"Takemi Yanagimoto , Yoichi Miyata","doi":"10.1016/j.jspi.2023.106117","DOIUrl":null,"url":null,"abstract":"<div><p><span>A Bayesian estimator aiming at improving the conditional MLE is proposed by introducing a pair of priors. After explaining the conditional MLE by the posterior mode under a prior, we define a promising estimator by the </span>posterior mean<span> under a corresponding prior. The prior is asymptotically equivalent to the reference prior in familiar models. Advantages of the present approach include two different optimality properties of the induced estimator, the ease of various extensions and the possible treatments for a finite sample size. The existing approaches are discussed and critiqued.</span></p></div>","PeriodicalId":50039,"journal":{"name":"Journal of Statistical Planning and Inference","volume":"231 ","pages":"Article 106117"},"PeriodicalIF":0.8000,"publicationDate":"2023-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A pair of novel priors for improving and extending the conditional MLE\",\"authors\":\"Takemi Yanagimoto , Yoichi Miyata\",\"doi\":\"10.1016/j.jspi.2023.106117\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p><span>A Bayesian estimator aiming at improving the conditional MLE is proposed by introducing a pair of priors. After explaining the conditional MLE by the posterior mode under a prior, we define a promising estimator by the </span>posterior mean<span> under a corresponding prior. The prior is asymptotically equivalent to the reference prior in familiar models. Advantages of the present approach include two different optimality properties of the induced estimator, the ease of various extensions and the possible treatments for a finite sample size. The existing approaches are discussed and critiqued.</span></p></div>\",\"PeriodicalId\":50039,\"journal\":{\"name\":\"Journal of Statistical Planning and Inference\",\"volume\":\"231 \",\"pages\":\"Article 106117\"},\"PeriodicalIF\":0.8000,\"publicationDate\":\"2023-11-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Statistical Planning and Inference\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0378375823000861\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"STATISTICS & PROBABILITY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Statistical Planning and Inference","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0378375823000861","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
A pair of novel priors for improving and extending the conditional MLE
A Bayesian estimator aimed at improving the conditional MLE is proposed by introducing a pair of priors. After interpreting the conditional MLE as the posterior mode under one prior, we define a promising estimator as the posterior mean under a corresponding prior. This prior is asymptotically equivalent to the reference prior in familiar models. Advantages of the present approach include two different optimality properties of the induced estimator, the ease with which it admits various extensions, and possible treatments for a finite sample size. Existing approaches are discussed and critiqued.
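For readers unfamiliar with the distinction the abstract draws between the posterior mode and the posterior mean as point estimators, the following minimal sketch contrasts the two in a conjugate gamma-Poisson model. The gamma prior used here is a hypothetical Jeffreys-type choice for illustration only; it is not the pair of priors constructed in the paper, and the sketch does not reproduce the authors' method.

```python
# Illustrative sketch only: contrasts the posterior mode and the posterior mean
# as point estimators for the same data in a conjugate gamma-Poisson model.
# The prior below is a hypothetical Jeffreys-type choice, not the paper's priors.
import numpy as np

rng = np.random.default_rng(0)
true_rate = 2.5
x = rng.poisson(true_rate, size=20)      # observed Poisson counts

# Hypothetical gamma(shape=a, rate=b) prior on the Poisson rate;
# a = 0.5, b = 0 corresponds to the (improper) Jeffreys prior.
a, b = 0.5, 0.0

# Conjugate update: the posterior is gamma(a + sum(x), b + n).
post_shape = a + x.sum()
post_rate = b + x.size

posterior_mode = (post_shape - 1) / post_rate   # valid since post_shape > 1 here
posterior_mean = post_shape / post_rate

print(f"MLE            : {x.mean():.4f}")
print(f"posterior mode : {posterior_mode:.4f}")
print(f"posterior mean : {posterior_mean:.4f}")
```

Under a flat prior (shape 1, rate 0) the posterior mode reduces exactly to the MLE, while the posterior mean remains a distinct estimator; this is the kind of contrast the abstract exploits when it pairs a mode-recovering prior with a corresponding prior for the posterior mean.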
About the journal:
The Journal of Statistical Planning and Inference offers itself as a multifaceted and all-inclusive bridge between classical aspects of statistics and probability and the emerging interdisciplinary aspects that have the potential to revolutionize the subject. While we maintain our traditional strength in statistical inference, design, classical probability, and large sample methods, we also have a far more inclusive and broadened scope to keep up with the new problems that confront us as statisticians, mathematicians, and scientists.
We publish high-quality articles in all branches of statistics, probability, discrete mathematics, machine learning, and bioinformatics. We also especially welcome well-written and up-to-date review articles on fundamental themes of statistics, probability, machine learning, and general biostatistics. Thoughtful letters to the editors, interesting problems in need of a solution, and short notes carrying an element of elegance or beauty are equally welcome.