Pub Date: 2023-11-08. DOI: 10.3389/fams.2023.1265254
Joerg Osterrieder, Michael Seigne
This literature review aims to address the critical knowledge gap in the field of share repurchase executions, a financial activity involving companies repurchasing trillions of dollars' worth of their own shares. The significance of understanding these mechanisms and their impact is underscored by their potential influence on the global economy. The paper employs a comprehensive analysis of existing literature, focusing on share repurchase mechanisms and motivations. It scrutinizes both open-market repurchases and Accelerated Share Repurchase contracts. Methodological approaches in current research, such as the use of partial differential equations and tree methods, are also evaluated. The review reveals that the execution phase of share repurchases remains largely unexplored. Unanswered questions persist about trading schedules, implications, costs, broker and corporate performance, and psychological effects of beating a buyback benchmark. Additionally, the review identifies significant limitations in current research methodologies. The paper advocates for the application and development of more advanced tools like machine learning and artificial intelligence to address these gaps. It also suggests potential areas for future research, including the role of technology in share repurchase execution, psychological factors influencing corporate buybacks, and the development of performance metrics for brokers and corporations. The review serves not only to highlight existing gaps in literature but also to suggest avenues for future research that could fundamentally enhance our understanding of share repurchase executions. JEL classification G1, G12, G14, G02, G4.
Article: "Examining share repurchase executions: insights and synthesis from the existing literature" by Joerg Osterrieder and Michael Seigne. Frontiers in Applied Mathematics and Statistics, 2023-11-08. DOI: 10.3389/fams.2023.1265254.
This article presents a new continuous probability distribution for a non-negative random variable that serves as an alternative to some bounded-domain distributions. The new distribution, termed the log-Kumaraswamy distribution, can compete with distributions for both bounded and unbounded random processes. Some essential features of this distribution are studied, and its parameter estimates are obtained using the maximum product of spacings, least squares, and weighted least squares procedures. The new distribution is shown to outperform traditional models in terms of flexibility and applicability to real-life data sets.
Article: "Log-Kumaraswamy distribution: its features and applications" by Aliyu Ismail Ishaq, Ahmad Abubakar Suleiman, Hanita Daud, Narinderjit Singh Sawaran Singh, Mahmod Othman, Rajalingam Sokkalingam, Pitchaya Wiratchotisatian, Abdullahi Garba Usman, and Sani Isah Abba. Frontiers in Applied Mathematics and Statistics, 2023-10-31. DOI: 10.3389/fams.2023.1258961.
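The abstract above leaves the construction implicit. One standard way to build a "log" version of the Kumaraswamy law, which lives on (0, 1), is the transform Y = -log(X), giving a non-negative variable that can be sampled by inverse transform. The construction and parameter values below are assumptions for illustration and may differ from the paper's exact parameterization.

```python
import numpy as np

def sample_log_kumaraswamy(a, b, size, rng=None):
    """Draw samples from Y = -log(X), where X ~ Kumaraswamy(a, b) on (0, 1).

    The Kumaraswamy quantile function is x = (1 - (1 - u)**(1/b))**(1/a),
    so Y is obtained by inverse-transform sampling followed by a log change
    of variable, yielding a non-negative random variable.
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(size=size)
    x = (1.0 - (1.0 - u) ** (1.0 / b)) ** (1.0 / a)
    return -np.log(x)

# Hypothetical shape parameters for illustration only.
samples = sample_log_kumaraswamy(a=2.0, b=3.0, size=100_000,
                                 rng=np.random.default_rng(0))
print(f"sample mean: {samples.mean():.3f}")
```

The same samples could then be used to check the fitting procedures named in the abstract (e.g., maximum product of spacings) against known parameters.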
Tuberculosis (TB) is a major health problem that contributes significantly to infectious disease mortality worldwide. The emergence of drug-resistant TB is a new challenge that demands extensive work toward implementing the right control strategies. In this study, we developed a mathematical model with a drug-resistant compartment to investigate the effect of chemoprophylaxis treatment on the transmission of tuberculosis. A stability analysis is performed, along with an investigation of the endemic and disease-free equilibria. The qualitative analysis of the model shows that the disease-free equilibrium (DFE) is locally asymptotically stable for R0 < 1, while the endemic equilibrium becomes globally asymptotically stable for R0 > 1. A bifurcation analysis using the center manifold theorem found that the model exhibits forward bifurcation. Furthermore, a sensitivity analysis of the model was carried out, and numerical simulations were performed. This study showed that administering chemoprophylaxis treatment to individuals with latent infections significantly reduces the progression of exposed individuals to the infectious and drug-resistant classes, ultimately reducing transmission of the disease at large.
Article: "Modeling and bifurcation analysis of tuberculosis with the multidrug-resistant compartment incorporating chemoprophylaxis treatment" by Damtew Bewket Kitaro, Boka Kumsa Bole, and Koya Purnachandra Rao. Frontiers in Applied Mathematics and Statistics, 2023-10-31. DOI: 10.3389/fams.2023.1264201.
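The R0 threshold quoted above is conventionally computed with the next-generation-matrix method: R0 is the spectral radius of F V^{-1}, where F collects new-infection terms and V collects transfers between infected compartments at the disease-free equilibrium. The sketch below applies this to a generic exposed/infectious/drug-resistant structure with made-up rates; it is not the paper's calibrated model.

```python
import numpy as np

# Hypothetical parameter values for illustration only; the paper's model
# structure and notation may differ.
beta_i, beta_d = 0.30, 0.15   # transmission from infectious / drug-resistant
k = 0.10                      # progression rate from exposed (E) to infectious (I)
gamma, delta = 0.20, 0.05     # recovery rate; rate of acquiring drug resistance
eta, mu = 0.10, 0.02          # removal from resistant class; natural death rate

# Next-generation matrices at the disease-free equilibrium, compartments (E, I, D):
# F holds new infections, V holds transfers between infected compartments.
F = np.array([[0.0, beta_i, beta_d],
              [0.0, 0.0,    0.0],
              [0.0, 0.0,    0.0]])
V = np.array([[k + mu,  0.0,                 0.0],
              [-k,      gamma + delta + mu,  0.0],
              [0.0,     -delta,              eta + mu]])

# R0 is the spectral radius of F V^{-1}.
R0 = max(abs(np.linalg.eigvals(F @ np.linalg.inv(V))))
print(f"R0 = {R0:.3f}")
```

With these illustrative rates R0 slightly exceeds 1, the regime in which the abstract's endemic equilibrium is globally asymptotically stable.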
Pub Date: 2023-10-30. DOI: 10.3389/fams.2023.1272334
Jennifer L. Delzeit, Devin C. Koestler
Numerous methods and approaches have been developed for generating time-to-event data from the Cox Proportional Hazards (CPH) model; however, they often require specification of a parametric distribution for the baseline hazard even though the CPH model itself makes no assumptions about the distribution of the baseline hazard. In line with the semi-parametric nature of the CPH model, a recently proposed method called the Flexible Hazards Method generates time-to-event data from a CPH model using a non-parametric baseline hazard function. While the initial results of this method are promising, it has not yet been comprehensively assessed with increasing numbers of covariates or against data generated under parametric baseline hazards. To fill this gap, we conducted a comprehensive study benchmarking the Flexible Hazards Method against parametric methods for generating data from a CPH model. Our results showed that with a single covariate and a large enough assumed maximum time, the bias of the Flexible Hazards Method is 0.02 (with respect to the log hazard ratio), with 95% confidence interval coverage of 84.4%. This bias increases to 0.054 when there are 10 covariates under the same settings, and the coverage of the 95% confidence interval decreases to 46.7%. In this paper, we explain, both empirically and theoretically, the plausible reasons for this observed increase in bias and decrease in coverage as the number of covariates is increased, and we provide readers and potential users of this method with suggestions on how best to address these issues. In summary, the Flexible Hazards Method performs well when there are few covariates and the user wishes to simulate data from a non-parametric baseline hazard.
Article: "Simulating time-to-event data under the Cox proportional hazards model: assessing the performance of the non-parametric Flexible Hazards Method" by Jennifer L. Delzeit and Devin C. Koestler. Frontiers in Applied Mathematics and Statistics, 2023-10-30. DOI: 10.3389/fams.2023.1272334.
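The parametric benchmarks mentioned above typically use the inverse-transform recipe of Bender, Augustin, and Blettner: with a Weibull baseline hazard h0(t) = lam * nu * t^(nu-1), the time T = (-log U / (lam * exp(x'beta)))^(1/nu) follows the desired CPH model. A minimal sketch of that parametric benchmark (the Flexible Hazards Method itself is not reproduced here; all parameter values are illustrative):

```python
import numpy as np

def simulate_cph_weibull(beta, X, lam=0.1, nu=1.5, rng=None):
    """Generate event times from a CPH model with a Weibull baseline hazard
    h0(t) = lam * nu * t**(nu - 1), via inverse-transform sampling.
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(size=X.shape[0])
    lin_pred = X @ beta                      # linear predictor x'beta
    return (-np.log(u) / (lam * np.exp(lin_pred))) ** (1.0 / nu)

rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 1))               # single-covariate setting
times = simulate_cph_weibull(beta=np.array([0.5]), X=X, rng=rng)
print(f"simulated {times.size} event times; min = {times.min():.4f}")
```

Fitting a Cox model to such data and comparing the estimated log hazard ratio with the true beta is exactly the bias/coverage exercise the abstract describes.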
Pub Date: 2023-10-24. DOI: 10.3389/fams.2023.1142625
Belthasara Assan, Farai Nyabadza
From the beginning of the outbreak of SARS-CoV-2 (COVID-19), South African data depicted seasonal transmission patterns, with infections rising in summer and winter every year. Seasonality, control measures, and the role of the environment are the most important factors in periodic epidemics. In this study, a deterministic model incorporating the influences of seasonality, vaccination, and the role of the environment is formulated to determine how these factors impact the epidemic. We analyzed the stability of the model, demonstrating that when R0 < 1, the disease-free equilibrium is globally asymptotically stable, whereas when R0 > 1 the disease uniformly persists and at least one positive periodic solution exists. We demonstrate the model's application using the data reported by the National Institute for Communicable Diseases. We fitted the model to the data from the third wave through the fifth wave, incorporating a damping effect due to mandatory vaccination in the fifth wave. Our analytical and numerical results indicate that different vaccination efficacies influence epidemic transmission differently across seasonal periods. Our findings also indicate that as long as the coronavirus persists in the environment, the epidemic will continue to affect the human population, and disease control should also be geared toward the environment.
Article: "A COVID-19 epidemic model with periodicity in transmission and environmental dynamics" by Belthasara Assan and Farai Nyabadza. Frontiers in Applied Mathematics and Statistics, 2023-10-24. DOI: 10.3389/fams.2023.1142625.
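The two ingredients highlighted above, periodic transmission and an environmental reservoir, can be illustrated with a small ODE system. The compartment structure and every parameter value below are hypothetical stand-ins, not the fitted South African model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative susceptible-infectious model with sinusoidal seasonality and an
# environmental reservoir W; all values are hypothetical.
beta0, eps = 0.30, 0.40      # mean transmission rate and seasonal amplitude
gamma = 0.10                 # recovery rate (per day)
xi, d = 0.01, 0.05           # shedding into / decay of the environment
N = 1.0e6                    # population size

def rhs(t, y):
    S, I, W = y
    beta_t = beta0 * (1.0 + eps * np.cos(2.0 * np.pi * t / 365.0))  # periodicity
    force = beta_t * S * (I + W) / N    # direct plus environmental transmission
    return [-force, force - gamma * I, xi * I - d * W]

sol = solve_ivp(rhs, (0.0, 730.0), [N - 100.0, 100.0, 0.0],
                rtol=1e-6, atol=1e-6)
S_end, I_end, W_end = sol.y[:, -1]
print(f"susceptible fraction after two years: {S_end / N:.3f}")
```

Setting xi = 0 switches off the environmental route, which is one way to probe the abstract's claim that the environment sustains transmission.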
Pub Date: 2023-10-19. DOI: 10.3389/fams.2023.1279638
Muhammad Aslam, Florentin Smarandache
In this paper, we propose a neutrosophic chi-square test for consistency, built on neutrosophic statistics. Our aim is to modify the existing chi-square test for consistency so that it can analyze imprecise data. We present a novel test statistic for the neutrosophic chi-square test for consistency that accounts for the uncertainties inherent in the data. To evaluate the proposed test, we compare it with the traditional chi-square test for consistency based on classical statistics, assessing its efficiency and effectiveness. Furthermore, we illustrate the application of the proposed test through a numerical example, demonstrating how it can be used in practical scenarios and providing empirical evidence of its improved performance over the classical test. We anticipate that the proposed neutrosophic chi-square test for consistency will outperform its classical counterpart, offering enhanced accuracy and reliability when dealing with imprecise data. This advancement has the potential to contribute significantly to the field of statistical analysis, particularly in situations where data uncertainty and imprecision are prevalent.
Article: "Chi-square test for imprecise data in consistency table" by Muhammad Aslam and Florentin Smarandache. Frontiers in Applied Mathematics and Statistics, 2023-10-19. DOI: 10.3389/fams.2023.1279638.
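In neutrosophic statistics, imprecise observations are carried as intervals, so one plausible reading of the proposal is to evaluate the classical statistic at the lower and upper data bounds and report an interval [chi2_L, chi2_U], with an "indeterminate" decision when the interval straddles the critical value. The sketch below follows that reading; the paper's exact test statistic and decision rule may differ.

```python
import numpy as np
from scipy.stats import chi2

def interval_chi_square(obs_lower, obs_upper, expected):
    """Chi-square statistic for interval-valued (imprecise) observed counts.

    Sketch only: the classical statistic is evaluated at the lower and upper
    bounds of the data, yielding an interval rather than a single number.
    """
    obs_lower = np.asarray(obs_lower, float)
    obs_upper = np.asarray(obs_upper, float)
    expected = np.asarray(expected, float)
    stat_l = np.sum((obs_lower - expected) ** 2 / expected)
    stat_u = np.sum((obs_upper - expected) ** 2 / expected)
    lo, hi = sorted((stat_l, stat_u))
    crit = chi2.ppf(0.95, df=len(expected) - 1)
    # Reject only if even the smallest plausible statistic exceeds the cutoff;
    # the decision is indeterminate when the interval straddles the cutoff.
    if lo > crit:
        decision = "reject"
    elif hi <= crit:
        decision = "fail to reject"
    else:
        decision = "indeterminate"
    return (lo, hi), decision

(lo, hi), decision = interval_chi_square([18, 22, 25], [22, 28, 31], [25, 25, 25])
print((round(lo, 2), round(hi, 2)), decision)
```

The "indeterminate" outcome is the qualitatively new element relative to the classical test, which always returns a binary decision.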
Pub Date: 2023-10-18. DOI: 10.3389/fams.2023.1256443
Pariksheet Nanda, Denise E. Kirschner
Mathematical and computational models of biological systems are increasingly complex, typically composed of hybrid multi-scale methods such as ordinary differential equations, partial differential equations, and agent-based and rule-based models. These mechanistic models concurrently simulate detail at the resolutions of whole-host, multi-organ, organ, tissue, cellular, molecular, and genomic dynamics. Because analytical and numerical solution methods are often unavailable, fitting complex biological models requires iterative parameter-sampling approaches to establish ranges of model parameters that capture the corresponding experimental datasets. However, these models typically comprise large numbers of parameters and therefore many degrees of freedom, so fitting them to multiple experimental datasets over time and space presents significant challenges. In this work we review, test, and advance calibration practices across models and dataset types to compare methodologies for model calibration, weighing the strengths and applicability of each approach and working toward standardized calibration methods. We compare the performance of our model-agnostic Calibration Protocol (CaliPro) with approximate Bayesian computation (ABC) to highlight strengths, weaknesses, synergies, and differences among these methods. We also present next-generation updates to CaliPro. We explore several model implementations and suggest a decision tree for selecting calibration approaches to match dataset types and modeling constraints.
Article: "Calibration methods to fit parameters within complex biological models" by Pariksheet Nanda and Denise E. Kirschner. Frontiers in Applied Mathematics and Statistics, 2023-10-18. DOI: 10.3389/fams.2023.1256443.
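For reference, the ABC family compared against CaliPro can be illustrated with the basic rejection sampler: draw parameters from the prior, simulate data, and keep draws whose summary statistic lands within a tolerance of the observed summary. The toy model and tolerance below are illustrative; this is not CaliPro itself.

```python
import numpy as np

def abc_rejection(observed, simulate, prior_sample, distance, eps, n_draws, rng):
    """Basic ABC rejection sampler: accept prior draws whose simulated
    summary lies within eps of the observed summary."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        sim = simulate(theta, rng)
        if distance(sim, observed) < eps:
            accepted.append(theta)
    return np.array(accepted)

rng = np.random.default_rng(2)
true_mu = 3.0
# "Observed" summary: sample mean of 50 draws from the true model.
observed = rng.normal(true_mu, 1.0, size=50).mean()

posterior = abc_rejection(
    observed=observed,
    simulate=lambda mu, r: r.normal(mu, 1.0, size=50).mean(),
    prior_sample=lambda r: r.uniform(0.0, 10.0),   # flat prior on the mean
    distance=lambda a, b: abs(a - b),
    eps=0.1, n_draws=20_000, rng=rng,
)
print(f"accepted {len(posterior)} draws, posterior mean {posterior.mean():.2f}")
```

The accepted draws approximate the posterior; shrinking eps trades acceptance rate for accuracy, which is one axis along which such methods are benchmarked.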
Pub Date: 2023-10-17. DOI: 10.3389/fams.2023.1150735
María del Pilar Montilla Velásquez, Martha Patricia Bohorquez Castañeda, Rafael Rentería Ramos
We propose a novel, efficient, and powerful methodology to deal with overdispersion, excess zeros, heterogeneity, and spatial correlation, based on the combination of hurdle models and Moran-eigenvector spatial filtering. Hurdle models are well suited to overdispersion and excess zeros because they separate the model into two parts: the first part models the probability of a zero value, and the second part models the distribution of the non-zero values. Gathering the spatial information into new covariates through Moran-eigenvector spatial filtering then accounts for spatial correlation and spatial heterogeneity, improving the model fit and capturing spatial effects of variables that could not be measured. Our proposal thus adapts the usual regression models for count data to phenomena where the standard assumptions, such as constant variance, independence, and a single distribution, do not hold. In addition, this research shows how a prolonged armed conflict can impact children's health. The data cover children exposed to armed conflict in Colombia, a country enduring a non-international armed conflict lasting over 60 years. The findings indicate that exposure to high levels of violence, as measured by an armed conflict index, is significantly associated with the incidence and mortality rates of pediatric acute leukemia (LAP), one of the most catastrophic conditions of childhood. The association between armed conflict and LAP has its conceptual basis in the epidemiological literature, given that the incidence and mortality rates of neoplastic diseases increase with exposure to toxic and chronic stress during gestation and childhood. Our methodology provides a valuable framework for complex data analysis and contributes to understanding the health implications in conflict-affected regions.
Article: "An implementation of Hurdle models for spatial count data. Study case: civil war as a risk factor for the development of childhood leukemia in Colombia" by María del Pilar Montilla Velásquez, Martha Patricia Bohorquez Castañeda, and Rafael Rentería Ramos. Frontiers in Applied Mathematics and Statistics, 2023-10-17. DOI: 10.3389/fams.2023.1150735.
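The two-part structure described above can be sketched directly: a logistic model for whether the hurdle is crossed (y > 0) and a zero-truncated Poisson for the positive counts, each fitted by maximum likelihood. The simulated data and coefficients below are hypothetical; Moran-eigenvector covariates would simply enter the design matrix X as additional columns.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Simulate toy hurdle data (all coefficients are made up for illustration).
rng = np.random.default_rng(3)
n = 2000
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])

# Part 1 (hurdle): y > 0 with probability expit(0.5 + 1.0*x).
crossed = rng.uniform(size=n) < expit(X @ np.array([0.5, 1.0]))
# Part 2: positives follow a zero-truncated Poisson with log-mean 0.2 + 0.5*x,
# simulated by redrawing any zeros.
lam_true = np.exp(X @ np.array([0.2, 0.5]))
pos = rng.poisson(lam_true)
while (zero := pos == 0).any():
    pos[zero] = rng.poisson(lam_true[zero])
y = np.where(crossed, pos, 0)

def nll_logit(b):
    # Bernoulli log-likelihood for the hurdle indicator y > 0.
    p = np.clip(expit(X @ b), 1e-12, 1 - 1e-12)
    z = y > 0
    return -np.sum(np.where(z, np.log(p), np.log1p(-p)))

def nll_ztpois(b):
    # Zero-truncated Poisson log-likelihood on the positive counts,
    # with the constant log(y!) term dropped:
    # y*log(lam) - lam - log(1 - exp(-lam))
    m = y > 0
    lam = np.exp(X[m] @ b)
    return -np.sum(y[m] * np.log(lam) - lam - np.log1p(-np.exp(-lam)))

res_logit = minimize(nll_logit, np.zeros(2), method="BFGS")
res_ztp = minimize(nll_ztpois, np.zeros(2), method="BFGS")
print("hurdle part:", np.round(res_logit.x, 2),
      "count part:", np.round(res_ztp.x, 2))
```

Because the two likelihoods share no parameters, the parts can be maximized separately, which is what makes hurdle models convenient for zero-inflated, overdispersed counts.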
Pub Date: 2023-10-13. DOI: 10.3389/fams.2023.1236586
Stephan Rosswog, Francesco Torsello, Peter Diener
We present version 1.0 of our Lagrangian numerical relativity code SPHINCS_BSSN. This code evolves the full set of Einstein equations, but contrary to other numerical relativity codes, it evolves the matter fluid via Lagrangian particles in the framework of a high-accuracy version of smooth particle hydrodynamics (SPH). The major new elements introduced here are: (i) a new method to map the stress–energy tensor (known at the particles) to the spacetime mesh, based on a local regression estimate; (ii) additional measures that ensure the robust evolution of a neutron star through its collapse to a black hole; and (iii) further refinements in how we place the SPH particles for our initial data. The latter are implemented in our code SPHINCS_ID, which now, in addition to LORENE, can also couple to initial data produced by the initial data library FUKA. We discuss several simulations of neutron star mergers performed with SPHINCS_BSSN_v1.0, including irrotational cases with and without prompt collapse and a system where only one of the stars has a large spin (χ = 0.5).
{"title":"The Lagrangian numerical relativity code SPHINCS_BSSN_v1.0","authors":"Stephan Rosswog, Francesco Torsello, Peter Diener","doi":"10.3389/fams.2023.1236586","DOIUrl":"https://doi.org/10.3389/fams.2023.1236586","url":null,"abstract":"We present version 1.0 of our Lagrangian numerical relativity code SPHINCS_BSSN. This code evolves the full set of Einstein equations, but contrary to other numerical relativity codes, it evolves the matter fluid via Lagrangian particles in the framework of a high-accuracy version of smoothed particle hydrodynamics (SPH). The major new elements introduced here are: (i) a new method to map the stress–energy tensor (known at the particles) to the spacetime mesh, based on a local regression estimate; (ii) additional measures that ensure the robust evolution of a neutron star through its collapse to a black hole; and (iii) further refinements in how we place the SPH particles for our initial data. The latter are implemented in our code SPHINCS_ID which now, in addition to LORENE, can also couple to initial data produced by the initial data library FUKA. We discuss several simulations of neutron star mergers performed with SPHINCS_BSSN_v1.0, including irrotational cases with and without prompt collapse and a system where only one of the stars has a large spin (χ = 0.5).","PeriodicalId":36662,"journal":{"name":"Frontiers in Applied Mathematics and Statistics","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135858727","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
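The abstract above mentions mapping the stress–energy tensor, known only at particle positions, onto the spacetime mesh via a local regression estimate. The paper's actual scheme is not reproduced here; as a rough, hedged sketch of the general idea (a weighted local linear regression of scattered particle values onto grid points, in 1D, with all names such as `local_regression_map` and the bandwidth `h` being illustrative assumptions), it might look like:

```python
import numpy as np

def local_regression_map(x_p, f_p, x_g, h):
    """Map particle values f_p (at positions x_p) onto grid points x_g
    using a Gaussian-weighted local linear regression (1D illustration).
    NOTE: an illustrative sketch, not the SPHINCS_BSSN implementation."""
    f_g = np.empty_like(x_g, dtype=float)
    for i, xg in enumerate(x_g):
        w = np.exp(-((x_p - xg) / h) ** 2)              # kernel weights
        A = np.stack([np.ones_like(x_p), x_p - xg], axis=1)  # local linear basis
        AtW = A.T * w                                   # weighted design matrix
        beta = np.linalg.solve(AtW @ A, AtW @ f_p)      # weighted least squares
        f_g[i] = beta[0]                                # intercept = estimate at xg
    return f_g

# particles sample a smooth field; recover it on a coarse grid
rng = np.random.default_rng(0)
x_p = rng.uniform(0.0, 1.0, 200)
f_p = np.sin(2 * np.pi * x_p)
x_g = np.linspace(0.1, 0.9, 9)
f_g = local_regression_map(x_p, f_p, x_g, h=0.05)
```

A regression estimate of this kind is robust to particle disorder because it fits a local model through the scattered data rather than summing kernel contributions directly.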
Pub Date: 2023-10-13 DOI: 10.3389/fams.2023.1241538
Paola Stolfi, Davide Vergni, Filippo Castiglione
Introduction Mathematical modeling has emerged as a crucial component in understanding the epidemiology of infectious diseases. In fact, contemporary surveillance efforts for epidemic or endemic infections heavily rely on mathematical and computational methods. This study presents a novel agent-based multi-level model that depicts the transmission dynamics of gonorrhea, a sexually transmitted infection (STI) caused by the bacterium Neisseria gonorrhoeae. This infection poses a significant public health challenge as it is endemic in numerous countries, and each year sees millions of new cases, including a concerning number of drug-resistant cases commonly referred to as gonorrhea superbugs or super gonorrhea. These drug-resistant strains exhibit a high level of resistance to recommended antibiotic treatments. Methods The proposed model incorporates a multi-layer network of agents' interactions representing the dynamics of sexual partnerships. It also encompasses a transmission model, which quantifies the probability of infection during sexual intercourse, and a within-host model, which captures the immune activation following gonorrhea infection in an individual. It is a combination of agent-based modeling, which effectively captures interactions among various risk groups, and probabilistic modeling, which enables a theoretical exploration of sexual network characteristics and contagion dynamics. Results Numerical simulations of the dynamics of gonorrhea infection using the complete agent-based model are carried out. In particular, some examples of possible epidemic evolution are presented together with an application to a real case study. The goal was to construct a virtual population that closely resembles the target population of interest. Discussion The uniqueness of this research lies in its objective to accurately depict the influence of distinct sexual risk groups and their interaction on the prevalence of gonorrhea.
The proposed model, having interpretable and measurable parameters from epidemiological data, facilitates a more comprehensive understanding of the disease evolution.
{"title":"An agent-based multi-level model to study the spread of gonorrhea in different and interacting risk groups","authors":"Paola Stolfi, Davide Vergni, Filippo Castiglione","doi":"10.3389/fams.2023.1241538","DOIUrl":"https://doi.org/10.3389/fams.2023.1241538","url":null,"abstract":"Introduction Mathematical modeling has emerged as a crucial component in understanding the epidemiology of infectious diseases. In fact, contemporary surveillance efforts for epidemic or endemic infections heavily rely on mathematical and computational methods. This study presents a novel agent-based multi-level model that depicts the transmission dynamics of gonorrhea, a sexually transmitted infection (STI) caused by the bacterium Neisseria gonorrhoeae. This infection poses a significant public health challenge as it is endemic in numerous countries, and each year sees millions of new cases, including a concerning number of drug-resistant cases commonly referred to as gonorrhea superbugs or super gonorrhea. These drug-resistant strains exhibit a high level of resistance to recommended antibiotic treatments. Methods The proposed model incorporates a multi-layer network of agents' interactions representing the dynamics of sexual partnerships. It also encompasses a transmission model, which quantifies the probability of infection during sexual intercourse, and a within-host model, which captures the immune activation following gonorrhea infection in an individual. It is a combination of agent-based modeling, which effectively captures interactions among various risk groups, and probabilistic modeling, which enables a theoretical exploration of sexual network characteristics and contagion dynamics. Results Numerical simulations of the dynamics of gonorrhea infection using the complete agent-based model are carried out. In particular, some examples of possible epidemic evolution are presented together with an application to a real case study. 
The goal was to construct a virtual population that closely resembles the target population of interest. Discussion The uniqueness of this research lies in its objective to accurately depict the influence of distinct sexual risk groups and their interaction on the prevalence of gonorrhea. The proposed model, having interpretable and measurable parameters from epidemiological data, facilitates a more comprehensive understanding of the disease evolution.","PeriodicalId":36662,"journal":{"name":"Frontiers in Applied Mathematics and Statistics","volume":"76 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135855212","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
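The abstract above describes an agent-based model on a multi-layer network of sexual partnerships. The authors' full model (with transmission and within-host components) is far richer than anything shown here; purely as a hedged, minimal sketch of the multi-layer agent-based idea (an SIS process on two contact layers, with all names — `simulate_sis`, the layer labels, the parameters — being illustrative assumptions), one could write:

```python
import random

def simulate_sis(edges_by_layer, beta, gamma, seeds, steps, rng):
    """Minimal agent-based SIS epidemic on a multi-layer contact network.
    edges_by_layer: {layer_name: [(i, j), ...]} contact edges per layer;
    beta: per-layer, per-contact transmission probability;
    gamma: per-step recovery probability.
    NOTE: an illustrative sketch, not the model from the paper."""
    infected = set(seeds)
    for _ in range(steps):
        new_inf, recovered = set(), set()
        for layer, edges in edges_by_layer.items():
            for i, j in edges:
                # transmission can occur only across a discordant edge
                if (i in infected) != (j in infected):
                    if rng.random() < beta[layer]:
                        new_inf.add(j if i in infected else i)
        for node in infected:
            if rng.random() < gamma:
                recovered.add(node)
        infected = (infected | new_inf) - recovered
    return infected

# deterministic sanity check: with certain transmission and no recovery,
# infection spreads one hop per step through the connected layers
edges = {"steady": [(0, 1), (1, 2), (2, 3)], "casual": [(3, 4)]}
beta = {"steady": 1.0, "casual": 1.0}
final = simulate_sis(edges, beta, gamma=0.0, seeds={0}, steps=5,
                     rng=random.Random(1))
```

Separating contact layers lets each partnership type (e.g., steady vs. casual) carry its own transmission probability, which is the structural point the multi-layer formulation buys over a single aggregated network.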