Pub Date : 2023-12-06 DOI: 10.3389/fams.2023.1324054
F. Khan, Yonis Gulzar, Shahnawaz Ayoub, Muneer Majid, Mohammad Shuaib Mir, Arjumand Bano Soomro
Radiologists face formidable challenges in the intricate task of classifying brain tumors from MRI images. This manuscript introduces an effective methodology that combines Least Squares Support Vector Machines (LS-SVM) with Multi-Scale Morphological Texture Features (MMTF) extracted from T1-weighted MR images. The methodology was evaluated on a dataset of 139 cases, consisting of 119 abnormal tumor cases and 20 normal brain images. The LS-SVM-based approach outperforms competing classifiers, achieving an accuracy of 98.97%: a 3.97% improvement over alternative methods, accompanied by a 2.48% gain in sensitivity and a 10% increase in specificity. These results surpass the classification accuracy of traditional classifiers such as Support Vector Machines (SVM), Radial Basis Function (RBF) networks, and Artificial Neural Networks (ANN). The strong performance of the model in brain tumor diagnosis represents a substantial step forward, promising more precise and dependable tools for radiologists and healthcare professionals in identifying and classifying brain tumors from MRI.
Title: Least square-support vector machine based brain tumor classification system with multi model texture features (Frontiers in Applied Mathematics and Statistics)
Pub Date : 2023-12-05 DOI: 10.3389/fams.2023.1275588
Zhitao Wang, Nana Li, Quan Zhang, Jin Wei, Lei Zhang, Yuanquan Wang
The active contour model, also known as the snake model, is an elegant approach for image segmentation and motion tracking. The gradient vector flow (GVF) is an effective external force for active contours. However, the GVF model is based on isotropic diffusion and does not take the image structure into account. The GVF snake cannot converge to very deep concavities and blob-like concavities and fails to preserve weak edges neighboring strong ones. To address these limitations, we first propose the directionally weakened diffusion (DWD), which is anisotropic by incorporating the image structure in a subtle way. Using the DWD, a novel external force called directionally weakened gradient vector flow (DWGVF) is proposed for active contours. In addition, two spatiotemporally varying weights are employed to make the DWGVF robust to noise. The DWGVF snake has been assessed on both synthetic and real images. Experimental results show that the DWGVF snake provides much better results in terms of noise robustness, weak-edge preservation, and convergence to various concavities when compared with the well-known GVF snake and the generalized GVF (GGVF) snake.
Title: Directionally weakened diffusion for image segmentation using active contours
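The baseline this paper improves on, the original isotropic GVF, is obtained by diffusing the gradient of an edge map while pinning the field near strong edges. A rough numpy sketch of that classic GVF (the DWGVF adds direction-dependent weakening on top of this; `mu` and the iteration count are illustrative):

```python
import numpy as np

def gvf(f, mu=0.2, iters=200):
    """Classic isotropic gradient vector flow (Xu & Prince, 1998):
    diffuse the edge-map gradient (fx, fy) into homogeneous regions,
    with the |grad f|^2 data term anchoring the field at edges."""
    fx, fy = np.gradient(f)
    u, v = fx.copy(), fy.copy()
    mag2 = fx**2 + fy**2

    def lap(a):
        # 5-point Laplacian with replicated (Neumann-like) borders
        p = np.pad(a, 1, mode="edge")
        return p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] - 4 * a

    for _ in range(iters):
        u = u + mu * lap(u) - mag2 * (u - fx)
        v = v + mu * lap(v) - mag2 * (v - fy)
    return u, v
```

The isotropy the paper criticizes is visible in the update rule: `mu * lap(u)` diffuses equally in all directions, regardless of local image structure, which is exactly what DWD replaces with an anisotropic term.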
Pub Date : 2023-12-05 DOI: 10.3389/fams.2023.1271485
Md. Ahsan Kabir, Liping Yu, Sanjoy Kumar Sarker, Md. Nahiduzzaman, Tanmay Borman
The main goal of this study is to examine the return explanation strengths of the Carhart four-factor, the Fama–French three-factor, and the single-factor models in the context of the Bangladeshi stock market. We, therefore, reveal the risk-adjusted returns, test the valuation capability of multi-factor models, and estimate optimal portfolio weights of stocks listed on the Dhaka Stock Exchange (DSE) under the DSE30 index. Our findings demonstrate that large capitalization firms that have low or medium book-to-market (B/M) ratios produce more concentrated returns than their counterparts, resulting in greater earnings per unit of total, systematic, and downside risks. Furthermore, we discover that each factorial value has an impressive capacity to explain the market excess returns; however, the influence of factor values on the cross-section of stock returns is somewhat contradictory. In particular, the momentum factor is unable to describe the cross-section excess returns, whereas the risk premium, size, and value factors have a significant impact on the cross-section excess returns. Finally, we find that a large-cap firm with a low B/M ratio is suitable for risk-seeking investors; in contrast, a small-cap firm with a low B/M ratio is appropriate for investors with lower risk tolerance. Moreover, our empirical outcomes have noteworthy implications for private companies, investors, and policymakers.
Title: Portfolio optimization and valuation capability of multi-factor models: an observational evidence from Dhaka stock exchange
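The factor models compared in this study are fitted as time-series OLS regressions of a stock's excess returns on the factor returns; a minimal sketch of that fitting step (the construction of the factors themselves, e.g. SMB/HML sorting portfolios, is outside this snippet):

```python
import numpy as np

def factor_regression(excess_returns, factors):
    """OLS time-series regression r_t - rf_t = alpha + beta' f_t + e_t,
    as in the single-factor, Fama-French three-factor, and Carhart
    four-factor models. `factors` is a T x k matrix (e.g. MKT-RF, SMB, HML).
    Returns (alpha, betas)."""
    T = len(excess_returns)
    X = np.column_stack([np.ones(T), factors])  # prepend intercept column
    coef, *_ = np.linalg.lstsq(X, excess_returns, rcond=None)
    return coef[0], coef[1:]
```

The estimated `alpha` is the risk-adjusted return the study examines: a model "explains" returns well when alphas across test portfolios are statistically indistinguishable from zero.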
Pub Date : 2023-11-13 DOI: 10.3389/fams.2023.1267034
Ori Becher, Mira Marcus-Kalish, David M. Steinberg
The age of big data has fueled expectations for accelerating learning. The availability of large data sets enables researchers to achieve more powerful statistical analyses and enhances the reliability of conclusions, which can be based on a broad collection of subjects. Often such data sets can be assembled only with access to diverse sources; for example, medical research that combines data from multiple centers in a federated analysis. However, these hopes must be balanced against data privacy concerns, which hinder sharing raw data among centers. Consequently, federated analyses typically resort to sharing data summaries from each center. The limitation to summaries carries the risk that it will impair the efficiency of statistical analysis procedures. In this work, we take a close look at the effects of federated analysis on two basic problems: non-parametric comparison of two groups and quantile estimation to describe the corresponding distributions. We also propose a specific privacy-preserving data release policy for federated analysis with the K-anonymity criterion, which has been adopted by the Medical Informatics Platform of the European Human Brain Project. Our results show that, for our tasks, there is only a modest loss of statistical efficiency.
Title: Federated statistical analysis: non-parametric testing and quantile estimation
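A K-anonymity-style release rule for center-level summaries can be sketched as follows: a quantile is shared only when enough observations support it on both sides, and suppressed otherwise. This is an illustrative sketch of the general idea, not the exact policy adopted by the Medical Informatics Platform:

```python
import numpy as np

def k_anonymous_quantiles(values, probs, k=5):
    """Release sample quantiles from one center only when at least k
    observations lie at-or-below and at-or-above the released point;
    otherwise suppress (return None). A K-anonymity-flavored sketch."""
    values = np.sort(np.asarray(values, dtype=float))
    n = len(values)
    released = {}
    for p in probs:
        idx = int(np.floor(p * (n - 1)))
        # idx + 1 points at or below, n - idx points at or above
        if idx + 1 >= k and n - idx >= k:
            released[p] = float(values[idx])
        else:
            released[p] = None  # too few supporting observations
    return released
```

The efficiency question the paper studies is then how much statistical power is lost when each center contributes only such censored summaries instead of raw data.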
Pub Date : 2023-11-08 DOI: 10.3389/fams.2023.1265254
Joerg Osterrieder, Michael Seigne
This literature review aims to address the critical knowledge gap in the field of share repurchase executions, a financial activity involving companies repurchasing trillions of dollars' worth of their own shares. The significance of understanding these mechanisms and their impact is underscored by their potential influence on the global economy. The paper employs a comprehensive analysis of existing literature, focusing on share repurchase mechanisms and motivations. It scrutinizes both open-market repurchases and Accelerated Share Repurchase contracts. Methodological approaches in current research, such as the use of partial differential equations and tree methods, are also evaluated. The review reveals that the execution phase of share repurchases remains largely unexplored. Unanswered questions persist about trading schedules, implications, costs, broker and corporate performance, and psychological effects of beating a buyback benchmark. Additionally, the review identifies significant limitations in current research methodologies. The paper advocates for the application and development of more advanced tools like machine learning and artificial intelligence to address these gaps. It also suggests potential areas for future research, including the role of technology in share repurchase execution, psychological factors influencing corporate buybacks, and the development of performance metrics for brokers and corporations. The review serves not only to highlight existing gaps in literature but also to suggest avenues for future research that could fundamentally enhance our understanding of share repurchase executions. JEL classification G1, G12, G14, G02, G4.
Title: Examining share repurchase executions: insights and synthesis from the existing literature
Pub Date : 2023-10-31 DOI: 10.3389/fams.2023.1258961
Aliyu Ismail Ishaq, Ahmad Abubakar Suleiman, Hanita Daud, Narinderjit Singh Sawaran Singh, Mahmod Othman, Rajalingam Sokkalingam, Pitchaya Wiratchotisatian, Abdullahi Garba Usman, Sani Isah Abba
This article presents a new continuous probability density function for a non-negative random variable that serves as an alternative to some bounded-domain distributions. The new distribution, termed the log-Kumaraswamy distribution, can faithfully be employed to model both bounded and unbounded random processes. Some essential features of this distribution were studied, and estimates of its parameters were obtained based on the maximum product of spacings, least squares, and weighted least squares procedures. The new distribution was shown to be better than traditional models in terms of flexibility and applicability to real-life data sets.
Title: Log-Kumaraswamy distribution: its features and applications
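One of the estimation procedures named here, maximum product of spacings (MPS), maximizes the sum of log-differences of the fitted CDF at the ordered sample. A sketch of MPS, illustrated on the exponential CDF for simplicity (substituting the log-Kumaraswamy CDF, which the paper defines, gives the paper's setting; the grid-search optimizer is an assumption of this sketch):

```python
import numpy as np

def mps_fit_exponential(data, rates):
    """Maximum product of spacings: pick the parameter maximizing
    sum(log(F(x_(i)) - F(x_(i-1)))) over the ordered sample, with
    F(x_(0)) = 0 and F(x_(n+1)) = 1. Shown here for F(x) = 1 - exp(-lam*x)."""
    x = np.sort(np.asarray(data, dtype=float))
    best_rate, best_val = None, -np.inf
    for lam in rates:
        F = 1.0 - np.exp(-lam * x)
        spacings = np.diff(np.concatenate(([0.0], F, [1.0])))
        # clamp to avoid log(0) when observations tie
        val = np.sum(np.log(np.maximum(spacings, 1e-300)))
        if val > best_val:
            best_rate, best_val = lam, val
    return best_rate
```

MPS behaves like maximum likelihood for regular models but remains consistent in cases where the likelihood is unbounded, which is one reason it is popular for newly proposed distributions.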
Pub Date : 2023-10-31 DOI: 10.3389/fams.2023.1264201
Damtew Bewket Kitaro, Boka Kumsa Bole, Koya Purnachandra Rao
Tuberculosis is a major health problem that contributes significantly to infectious disease mortality worldwide. A new challenge for society that demands extensive work toward implementing the right control strategies for tuberculosis (TB) is the emergence of drug-resistant TB. In this study, we developed a mathematical model to investigate the effect of chemoprophylaxis treatment on the transmission of tuberculosis with a drug-resistant compartment. A stability analysis is performed, along with an investigation of the endemic and disease-free equilibria. The qualitative outcome of the model analysis shows that the disease-free equilibrium (DFE) is locally asymptotically stable for R0 < 1, but the endemic equilibrium becomes globally asymptotically stable for R0 > 1. A bifurcation analysis was performed using the center manifold theorem, and it was found that the model shows evidence of forward bifurcation. Furthermore, the sensitivity analysis of the model was thoroughly carried out, and numerical simulation was also performed. This study showed that administering chemoprophylaxis treatment to individuals with latent infections significantly reduces the progression of exposed individuals to the infectious and drug-resistant classes, ultimately leading to a reduction in the transmission of the disease at large.
Title: Modeling and bifurcation analysis of tuberculosis with the multidrug-resistant compartment incorporating chemoprophylaxis treatment
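The threshold behavior reported here (DFE stable for R0 < 1, persistence for R0 > 1) can be seen numerically even in a bare SIR skeleton, where R0 = beta/gamma. The paper's model adds latent, treated, and drug-resistant compartments, so this is only a toy illustration of the threshold, not the paper's model:

```python
import numpy as np

def simulate_sir(beta, gamma, days=400, dt=0.01, i0=1e-3):
    """Forward-Euler integration of the SIR model on proportions;
    returns (final infectious fraction, peak infectious fraction).
    R0 = beta / gamma decides whether an introduction grows or dies out."""
    s, i, r = 1.0 - i0, i0, 0.0
    peak = i
    for _ in range(int(days / dt)):
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        dr = gamma * i
        s, i, r = s + dt * ds, i + dt * di, r + dt * dr
        peak = max(peak, i)
    return i, peak
```

Running this with beta/gamma below and above 1 reproduces the two regimes the stability analysis proves: below threshold the infection only decays, above threshold an epidemic takes off.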
Pub Date : 2023-10-30 DOI: 10.3389/fams.2023.1272334
Jennifer L. Delzeit, Devin C. Koestler
Numerous methods and approaches have been developed for generating time-to-event data from the Cox Proportional Hazards (CPH) model; however, they often require specification of a parametric distribution for the baseline hazard even though the CPH model itself makes no assumptions on the distribution of the baseline hazards. In line with the semi-parametric nature of the CPH model, a recently proposed method called the Flexible Hazards Method generates time-to-event data from a CPH model using a non-parametric baseline hazard function. While the initial results of this method are promising, it has not yet been comprehensively assessed with an increasing number of covariates or against data generated under parametric baseline hazards. To fill this gap, we conducted a comprehensive study to benchmark the performance of the Flexible Hazards Method for generating data from a CPH model against parametric methods. Our results showed that with a single covariate and large enough assumed maximum time, the bias in the Flexible Hazards Method is 0.02 (with respect to the log hazard ratio) with a 95% confidence interval having coverage of 84.4%. This bias increases to 0.054 when there are 10 covariates under the same settings, and the coverage of the 95% confidence interval decreases to 46.7%. In this paper, we explain the plausible reasons for this observed increase in bias and decrease in coverage as the number of covariates is increased, both empirically and theoretically, and provide readers and potential users of this method with some suggestions on how to best address these issues. In summary, the Flexible Hazards Method performs well when there are few covariates and the user wishes to simulate data from a non-parametric baseline hazard.
Title: Simulating time-to-event data under the Cox proportional hazards model: assessing the performance of the non-parametric Flexible Hazards Method
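For contrast, the standard parametric comparator, simulating CPH event times from an exponential baseline hazard by inverse transform, fits in a few lines. This sketches the parametric approach the study benchmarks against (the Flexible Hazards Method replaces the closed-form baseline with a non-parametric one; the baseline rate below is arbitrary):

```python
import numpy as np

def simulate_cph_exponential(X, beta, baseline_rate=0.1, rng=None):
    """Simulate event times from a Cox PH model with exponential baseline
    hazard h0(t) = baseline_rate, via the inverse-transform identity
    T = -log(U) / (baseline_rate * exp(X @ beta)), U ~ Uniform(0, 1)."""
    rng = rng if rng is not None else np.random.default_rng()
    linear_predictor = X @ beta
    u = rng.uniform(size=len(X))
    return -np.log(u) / (baseline_rate * np.exp(linear_predictor))
```

Because the per-subject hazard is constant at `baseline_rate * exp(X @ beta)`, a log-2 coefficient on a binary covariate halves the expected event time, which gives a quick sanity check on simulated data.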
Pub Date : 2023-10-24 DOI: 10.3389/fams.2023.1142625
Belthasara Assan, Farai Nyabadza
From the beginning of the outbreak of SARS-CoV-2 (COVID-19), South African data depicted seasonal transmission patterns, with infections rising in summer and winter every year. Seasonality, control measures, and the role of the environment are the most important factors in periodic epidemics. In this study, a deterministic model incorporating the influences of seasonality, vaccination, and the role of the environment is formulated to determine how these factors impact the epidemic. We analyzed the stability of the model, demonstrating that when R0 < 1, the disease-free equilibrium is globally asymptotically stable, whereas when R0 > 1 the disease uniformly persists and at least one positive periodic solution exists. We demonstrate its application using the data reported by the National Institute for Communicable Diseases. We fitted our mathematical model to the data from the third wave to the fifth wave and used a damping effect due to mandatory vaccination in the fifth wave. Our analytical and numerical results indicate that different vaccination efficacies have different influences on epidemic transmission at different seasonal periods. Our findings also indicate that as long as the coronavirus persists in the environment, the epidemic will continue to affect the human population, and disease control should be geared toward the environment.
{"title":"A COVID-19 epidemic model with periodicity in transmission and environmental dynamics","authors":"Belthasara Assan, Farai Nyabadza","doi":"10.3389/fams.2023.1142625","DOIUrl":"https://doi.org/10.3389/fams.2023.1142625","url":null,"abstract":"From the beginning of the outbreak of SARS-CoV-2 (COVID-19), South African data depicted seasonal transmission patterns, with infections rising in summer and winter every year. Seasonality, control measures, and the role of the environment are the most important factors in periodic epidemics. In this study, a deterministic model incorporating the influences of seasonality, vaccination, and the role of the environment is formulated to determine how these factors impact the epidemic. We analyzed the stability of the model, demonstrating that when R 0 &lt; 1, the disease-free equilibrium is globally symptomatically stable, whereas R 0 &gt; 1 indicates that the disease uniformly persists and at least one positive periodic solution exists. We demonstrate its application by using the data reported by the National Institute for Communicable Diseases. We fitted our mathematical model to the data from the third wave to the fifth wave and used a damping effect due to mandatory vaccination in the fifth wave. Our analytical and numerical results indicate that different efficacies for vaccination have a different influence on epidemic transmission at different seasonal periods. 
Our findings also indicate that as long as the coronavirus persists in the environment, the epidemic will continue to affect the human population and disease control should be geared toward the environment.","PeriodicalId":36662,"journal":{"name":"Frontiers in Applied Mathematics and Statistics","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135315657","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
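The abstract above describes a deterministic compartmental model with a seasonally forced transmission rate and an environmental pathogen reservoir. A minimal sketch of that class of model (not the authors' exact system) is an SIR model extended with an environment compartment E, where transmission is driven both by infectious individuals and by environmental contamination, and the transmission rate oscillates with a one-year period. All parameter values and the function name below are illustrative assumptions.

```python
import math

def simulate_sir_env(days=730, dt=0.1):
    """Euler integration of an illustrative SIR model with a seasonally
    forced transmission rate and an environmental pathogen compartment.
    All parameter values are hypothetical, chosen only for demonstration."""
    beta0, eps = 0.3, 0.4      # baseline human-to-human transmission, seasonal amplitude
    beta_env = 0.05            # transmission rate from the environmental reservoir
    gamma = 0.1                # recovery rate (mean infectious period of 10 days)
    shed, decay = 0.01, 0.1    # viral shedding into / decay within the environment
    S, I, R, E = 0.99, 0.01, 0.0, 0.0   # population fractions; E is reservoir load
    history = []
    t = 0.0
    for _ in range(int(days / dt)):
        # seasonal forcing with a period of one year
        beta = beta0 * (1 + eps * math.cos(2 * math.pi * t / 365))
        new_inf = (beta * I + beta_env * E) * S
        dS = -new_inf
        dI = new_inf - gamma * I
        dR = gamma * I
        dE = shed * I - decay * E   # environment is fed by shedding, drained by decay
        S += dS * dt
        I += dI * dt
        R += dR * dt
        E += dE * dt
        t += dt
        history.append((t, S, I, R, E))
    return history

hist = simulate_sir_env()
peak_day = max(hist, key=lambda row: row[2])[0]
```

Because dS + dI + dR = 0 at every step, the human population fractions are conserved, while the environmental load E evolves independently; the qualitative point in the abstract, that infection persists as long as the reservoir is replenished, shows up here as the beta_env * E term continuing to seed new infections.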
Pub Date : 2023-10-19DOI: 10.3389/fams.2023.1279638
Muhammad Aslam, Florentin Smarandache
In this paper, we propose a neutrosophic chi-square test for consistency, incorporating neutrosophic statistics. Our aim is to modify the existing chi-square test for consistency in order to analyze imprecise data. We present a novel test statistic for the neutrosophic chi-square test for consistency, which accounts for the uncertainties inherent in the data. To evaluate the performance of the proposed test, we compare it with the traditional chi-square test for consistency based on classical statistics. Through this comparative analysis, we assess the efficiency and effectiveness of the proposed neutrosophic test. Furthermore, we illustrate the application of the proposed test through a numerical example, demonstrating how it can be utilized in practical scenarios. Through this implementation, we aim to provide empirical evidence of the improved performance of our proposed test compared to the traditional chi-square test based on classical statistics. We anticipate that the proposed neutrosophic chi-square test for consistency will outperform its classical counterpart, offering enhanced accuracy and reliability when dealing with imprecise data. This advancement has the potential to contribute significantly to the field of statistical analysis, particularly in situations where data uncertainty and imprecision are prevalent.
{"title":"Chi-square test for imprecise data in consistency table","authors":"Muhammad Aslam, Florentin Smarandache","doi":"10.3389/fams.2023.1279638","DOIUrl":"https://doi.org/10.3389/fams.2023.1279638","url":null,"abstract":"In this paper, we propose the introduction of a neutrosophic chi-square test for consistency, incorporating neutrosophic statistics. Our aim is to modify the existing chi-square test for consistency in order to analyze imprecise data. We present a novel test statistic for the neutrosophic chi-square test for consistency, which accounts for the uncertainties inherent in the data. To evaluate the performance of the proposed test, we compare it with the traditional chi-square test for consistency based on classical statistics. By conducting a comparative analysis, we assess the efficiency and effectiveness of our proposed neutrosophic chi-square test for consistency. Furthermore, we illustrate the application of the proposed test through a numerical example, demonstrating how it can be utilized in practical scenarios. Through this implementation, we aim to provide empirical evidence of the improved performance of our proposed test when compared to the traditional chi-square test for consistency based on classical statistics. We anticipate that the proposed neutrosophic chi-square test for consistency will outperform its classical counterpart, offering enhanced accuracy and reliability when dealing with imprecise data. 
This advancement has the potential to contribute significantly to the field of statistical analysis, particularly in situations where data uncertainty and imprecision are prevalent.","PeriodicalId":36662,"journal":{"name":"Frontiers in Applied Mathematics and Statistics","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135730104","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
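The abstract above concerns a chi-square test whose cell counts are imprecise. In neutrosophic statistics, such counts are typically represented as intervals [low, high]; one simple way to sketch the idea (a simplification, not the authors' exact statistic) is to evaluate the classical chi-square statistic for a contingency table at both endpoints of the interval data, yielding an interval-valued statistic. The function names below are illustrative assumptions.

```python
def chi_square_interval(observed_low, observed_high):
    """Illustrative interval-valued chi-square statistic for a consistency
    (contingency) table whose cell counts are imprecise, given as two
    tables of lower and upper bounds. Sketch only: the classical statistic
    is evaluated at both endpoints to bound the neutrosophic result."""

    def classical_chi_square(table):
        # Classical Pearson chi-square: expected counts from row/column marginals.
        row_totals = [sum(row) for row in table]
        col_totals = [sum(col) for col in zip(*table)]
        grand_total = sum(row_totals)
        stat = 0.0
        for i, row in enumerate(table):
            for j, obs in enumerate(row):
                exp = row_totals[i] * col_totals[j] / grand_total
                stat += (obs - exp) ** 2 / exp
        return stat

    lo = classical_chi_square(observed_low)
    hi = classical_chi_square(observed_high)
    return min(lo, hi), max(lo, hi)

# Hypothetical imprecise 2x2 consistency table: each cell count is an interval.
low_counts = [[20, 30], [25, 25]]
high_counts = [[22, 32], [27, 28]]
stat_low, stat_high = chi_square_interval(low_counts, high_counts)
```

If the resulting interval [stat_low, stat_high] lies entirely above (or below) the critical value, the decision is unambiguous despite the imprecision; when the critical value falls inside the interval, the data's indeterminacy prevents a firm conclusion, which is precisely the situation neutrosophic statistics is designed to express.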