Christophe Andrieu, Nicolas Chopin, Ettore Fincato, Mathieu Gerber
In this paper we propose a novel, general-purpose algorithm to optimize functions $l\colon \mathbb{R}^d \rightarrow \mathbb{R}$ not assumed to be convex, differentiable, or even continuous. The main idea is to sequentially fit a sequence of parametric probability densities, possessing a concentration property, to $l$ using a Bayesian update followed by a reprojection back onto the chosen parametric sequence. Remarkably, with the sequence chosen to be from the exponential family, reprojection essentially boils down to the computation of expectations. Our algorithm therefore lends itself to Monte Carlo approximation, ranging from plain to Sequential Monte Carlo (SMC) methods. The algorithm is thus particularly simple to implement, and we illustrate performance on a challenging Machine Learning classification problem. Our methodology naturally extends to the scenario where only noisy measurements of $l$ are available and retains ease of implementation and performance. At a theoretical level we establish, in a fairly general scenario, that our framework can be viewed as implicitly implementing a time-inhomogeneous gradient descent algorithm on a sequence of smoothed approximations of $l$. This opens the door to establishing convergence of the algorithm and to providing theoretical guarantees. Along the way, we establish new results for inhomogeneous gradient descent algorithms of independent interest.
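The update-and-reproject loop described above can be sketched in one dimension with a Gaussian family. This is an illustrative toy, not the paper's implementation: the pseudo-likelihood exp(-l(x)/λ) and the geometric λ schedule are assumptions chosen to exhibit the concentration property.

```python
import math, random

def density_fit_minimize(l, mu0=0.0, sigma0=5.0, lam0=1.0, n=500, iters=60, seed=1):
    """Sequentially fit Gaussians to l: reweight samples by the pseudo-likelihood
    exp(-l(x)/lam) (Bayesian update), then reproject onto the Gaussian family by
    weighted moment matching (i.e. computing expectations); shrinking lam makes
    the fitted density concentrate near a minimiser of l."""
    rng = random.Random(seed)
    mu, sigma, lam = mu0, sigma0, lam0
    for _ in range(iters):
        xs = [rng.gauss(mu, sigma) for _ in range(n)]
        ls = [l(x) for x in xs]
        m = min(ls)                                   # stabilise the exponentials
        ws = [math.exp(-(v - m) / lam) for v in ls]   # Bayesian reweighting
        z = sum(ws)
        # reprojection = moment matching under the reweighted sample
        mu = sum(w * x for w, x in zip(ws, xs)) / z
        var = sum(w * (x - mu) ** 2 for w, x in zip(ws, xs)) / z
        sigma = max(math.sqrt(var), 1e-6)
        lam *= 0.95                                   # concentration schedule
    return mu

# non-convex, non-differentiable test function, minimised near x = 2.2
l = lambda x: abs(x - 2) + 0.5 * math.sin(5 * x)
xmin = density_fit_minimize(l)
```

The reprojection step is where the exponential-family structure pays off: for a Gaussian it reduces to two weighted expectations, so a plain Monte Carlo estimate suffices.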
Gradient-free optimization via integration. arXiv:2408.00888, arXiv - STAT - Computation, 2024-08-01.
Léa Loisel, Vincent Raquin, Maxime Ratinier, Pauline Ezanno, Gaël Beaunée
Arboviruses represent a significant threat to human, animal, and plant health worldwide. To elucidate transmission, anticipate their spread, and efficiently control them, mechanistic modelling has proven useful. However, most models rely on assumptions about how the extrinsic incubation period (EIP) is represented: the intra-vector viral dynamics (IVD), occurring during the EIP, is approximated by a single state. After an average duration, all exposed vectors become infectious. Behind this are hidden two strong hypotheses: (i) EIP is exponentially distributed in the vector population; (ii) viruses successfully cross the infection, dissemination, and transmission barriers in all exposed vectors. To assess these hypotheses, we developed a stochastic compartmental model that represents successive IVD stages, associated with the crossing or not of these three barriers. We calibrated the model using an ABC-SMC (Approximate Bayesian Computation - Sequential Monte Carlo) method with model selection. We systematically searched for literature data on experimental infections of Aedes mosquitoes infected by either dengue, chikungunya, or Zika viruses. We demonstrated the discrepancy between the exponential hypothesis and observed EIP distributions for dengue and Zika viruses and identified more relevant EIP distributions. We also quantified the fraction of infected mosquitoes eventually becoming infectious, highlighting that often only a small fraction crosses the three barriers. This work provides a generic modelling framework applicable to other arboviruses for which similar data are available. Our model can also be coupled to population-scale models to aid future arbovirus control.
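The contrast between a single-state EIP and successive stages can be illustrated with the classic linear chain trick: a single exposed compartment implies an exponentially distributed EIP, while chaining K exponential stages yields an Erlang EIP with the same mean but variance reduced by a factor of K. A toy sketch (not the paper's calibrated model; the mean of 10 days is an arbitrary illustration):

```python
import random, statistics

def sample_eip_exponential(mean_eip, rng):
    """Single-compartment EIP: exponentially distributed duration."""
    return rng.expovariate(1.0 / mean_eip)

def sample_eip_stages(mean_eip, k, rng):
    """K successive stages, each exponential with mean mean_eip/k, giving an
    Erlang(k) EIP: same mean, but variance mean_eip**2 / k instead of mean_eip**2."""
    return sum(rng.expovariate(k / mean_eip) for _ in range(k))

rng = random.Random(42)
mean_eip, n = 10.0, 20000
exp_samples = [sample_eip_exponential(mean_eip, rng) for _ in range(n)]
erl_samples = [sample_eip_stages(mean_eip, 5, rng) for _ in range(n)]
# Both have mean ~10 days, but the staged EIP is far less dispersed,
# so far fewer vectors become infectious implausibly early.
```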
Within-vector viral dynamics challenges how to model the extrinsic incubation period for major arboviruses: dengue, Zika, and chikungunya. arXiv:2408.00409, arXiv - STAT - Computation, 2024-08-01.
Recent advancements in understanding the brain's functional organization related to behavior have been pivotal, particularly in the development of predictive models based on brain connectivity. Traditional methods in this domain often involve a two-step process: first constructing a connectivity matrix from predefined brain regions, and then linking these connections to behaviors or clinical outcomes. However, because the node partition is unsupervised and the connectivity is established independently of the outcome, these approaches predict outcomes inefficiently. In this paper, we introduce the Supervised Brain Parcellation (SBP), a brain node parcellation scheme informed by the downstream predictive task. With voxel-level functional time courses generated under resting-state or cognitive tasks as input, our approach clusters voxels into nodes in a manner that maximizes the correlation between inter-node connections and the behavioral outcome, while also accommodating intra-node homogeneity. We rigorously evaluate the SBP approach using resting-state and task-based fMRI data from both the Adolescent Brain Cognitive Development (ABCD) study and the Human Connectome Project (HCP). Our analyses show that SBP significantly improves out-of-sample connectome-based predictive performance compared to conventional step-wise methods under various brain atlases. This advancement holds promise for enhancing our understanding of how brain functional architecture relates to behavior and establishing more informative network neuromarkers for clinical applications.
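The supervised criterion can be sketched on synthetic data: given a candidate voxel-to-node assignment, form node time courses by averaging member voxels, compute inter-node connectivity per subject, and score how strongly each connection correlates with the outcome. This is a hedged toy of the scoring idea only (the function name, data shapes, and the plain averaging are illustrative assumptions, not the SBP algorithm):

```python
import numpy as np

def parcellation_score(ts, labels, y):
    """Toy supervised criterion: ts is (subjects, voxels, time), labels assigns
    each voxel to a node, y is a behavioural outcome per subject. Returns the
    mean absolute correlation, across connections, between the per-subject
    inter-node connectivity and y."""
    n_sub = ts.shape[0]
    nodes = np.unique(labels)
    # node time courses: average the member voxels, per subject
    node_ts = np.stack([ts[:, labels == k, :].mean(axis=1) for k in nodes], axis=1)
    conns = []
    for i in range(len(nodes)):
        for j in range(i + 1, len(nodes)):
            # per-subject connectivity = correlation of the two node signals
            c = [np.corrcoef(node_ts[s, i], node_ts[s, j])[0, 1] for s in range(n_sub)]
            conns.append(np.corrcoef(c, y)[0, 1])  # relate this connection to y
    return np.mean(np.abs(conns))

rng = np.random.default_rng(0)
ts = rng.standard_normal((20, 30, 50))   # 20 subjects, 30 voxels, 50 time points
labels = np.repeat([0, 1, 2], 10)        # a candidate 3-node parcellation
y = rng.standard_normal(20)
score = parcellation_score(ts, labels, y)
```

SBP would then search over assignments to maximize such a score jointly with intra-node homogeneity, rather than fixing the parcellation in advance.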
Supervised brain node and network construction under voxel-level functional imaging. Wanwan Xu, Selena Wang, Chichun Tan, Xilin Shen, Wenjing Luo, Todd Constable, Tianxi Li, Yize Zhao. arXiv:2407.21242, arXiv - STAT - Computation, 2024-07-30.
In this paper, we examine the Sample Average Approximation (SAA) procedure within a framework where the Monte Carlo estimator of the expectation is biased. We also introduce Multilevel Monte Carlo (MLMC) in the SAA setup to enhance the computational efficiency of solving optimization problems. In this context, we conduct a thorough analysis, exploiting Cramér's large deviation theory, to establish uniform convergence, quantify the convergence rate, and determine the sample complexity for both standard Monte Carlo and MLMC paradigms. Additionally, we perform a root-mean-squared error analysis utilizing tools from empirical process theory to derive sample complexity without relying on the finite moment condition typically required for uniform convergence results. Finally, we validate our findings and demonstrate the advantages of the MLMC estimator through numerical examples, estimating Conditional Value-at-Risk (CVaR) in the Geometric Brownian Motion and nested expectation frameworks.
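The SAA side of the CVaR example can be sketched via the Rockafellar–Uryasev formulation, CVaR_α(X) = min_t { t + E[(X − t)_+]/(1 − α) }, whose sample-average minimiser in t is the empirical α-quantile. A toy sketch with geometric Brownian motion losses (the drift/volatility values are illustrative assumptions; the MLMC layering is not shown):

```python
import math, random

def cvar_saa(samples, alpha):
    """SAA estimate of CVaR_alpha via Rockafellar-Uryasev: plug the empirical
    alpha-quantile t into t + mean((X - t)_+) / (1 - alpha)."""
    xs = sorted(samples)
    t = xs[int(alpha * len(xs))]                      # empirical VaR_alpha
    tail = [x - t for x in xs if x > t]
    return t + sum(tail) / ((1 - alpha) * len(xs))

rng = random.Random(0)
# losses -S_T for a GBM with S_0 = 1, mu = 0, sigma = 0.2, T = 1 (toy stand-in)
losses = [-math.exp(-0.5 * 0.2**2 + 0.2 * rng.gauss(0, 1)) for _ in range(100000)]
est = cvar_saa(losses, 0.95)
```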
Multilevel Monte Carlo in Sample Average Approximation: Convergence, Complexity and Application. Devang Sinha, Siddhartha P. Chakrabarty. arXiv:2407.18504, arXiv - STAT - Computation, 2024-07-26.
Naichen Shi, Hao Yan, Shenghan Guo, Raed Al Kontar
In this paper, we present a generic physics-informed generative model called MPDM that integrates multi-fidelity physics simulations with diffusion models. MPDM categorizes multi-fidelity physics simulations into inexpensive and expensive simulations, depending on computational costs. The inexpensive simulations, which can be obtained with low latency, directly inject contextual information into denoising diffusion models (DDMs). Furthermore, when results from expensive simulations are available, MPDM refines the quality of generated samples via a guided diffusion process. This design separates the training of a denoising diffusion model from physics-informed conditional probability models, thus lending flexibility to practitioners. MPDM builds on Bayesian probabilistic models and is equipped with a theoretical guarantee that provides upper bounds on the Wasserstein distance between the sample distribution and the underlying true distribution. The probabilistic nature of MPDM also provides a convenient approach for uncertainty quantification in prediction. Our models excel in cases where physics simulations are imperfect and sometimes inaccessible. We use a numerical simulation in fluid dynamics and a case study in heat dynamics within laser-based metal powder deposition additive manufacturing to demonstrate how MPDM seamlessly integrates multi-fidelity physics simulations and observations to obtain surrogates with superior predictive performance.
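The guidance mechanism — combining a learned generative score with a gradient from an auxiliary observation model — can be illustrated in a fully analytic toy. Here a known N(0,1) prior stands in for the trained denoiser and a Gaussian observation model stands in for refinement by an expensive simulation; this is a Langevin caricature of guided sampling, not the MPDM method, and all parameter values are assumptions:

```python
import math, random

def guided_langevin(y, sigma_obs=0.5, step=0.01, n_steps=500, n_samples=1000, seed=0):
    """Langevin sampling with guidance: the drift adds the prior score
    d/dx log N(x; 0, 1) (stand-in for a learned denoiser) and the guidance
    gradient d/dx log N(y; x, sigma_obs^2) (stand-in for expensive-simulation
    feedback), so samples target the posterior p(x | y)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_samples):
        x = rng.gauss(0, 1)
        for _ in range(n_steps):
            prior_score = -x                          # score of N(0, 1)
            guidance = (y - x) / sigma_obs**2         # score of the observation model
            x += step * (prior_score + guidance) + math.sqrt(2 * step) * rng.gauss(0, 1)
        out.append(x)
    return out

# analytic check target: posterior N(y/(1+s^2), s^2/(1+s^2)) with s = 0.5
samples = guided_langevin(y=1.0)
```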
Multi-physics Simulation Guided Generative Diffusion Models with Applications in Fluid and Heat Dynamics. arXiv:2407.17720, arXiv - STAT - Computation, 2024-07-25.
Giovanni Brigati, Gabriel Stoltz, Andi Q. Wang, Lihan Wang
We study the long-time convergence behavior of underdamped Langevin dynamics, when the spatial equilibrium satisfies a weighted Poincaré inequality, with a general velocity distribution, which allows for fat-tail or subexponential potential energies, and provide constructive and fully explicit estimates in $\mathrm{L}^2$-norm with $\mathrm{L}^\infty$ initial conditions. A key ingredient is a space-time weighted Poincaré--Lions inequality, which in turn implies a weak Poincaré--Lions inequality.
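For readers less familiar with the dynamics being analyzed: underdamped Langevin dynamics evolves a position–velocity pair via dX = V dt, dV = −∇U(X) dt − γV dt + √(2γ) dW. A standard BAOAB discretization for the simplest case U(x) = x²/2 (so the spatial marginal is N(0,1)) illustrates the process; this is generic illustration, unrelated to the paper's proofs:

```python
import math, random

def underdamped_langevin(n_steps=200000, dt=0.01, gamma=1.0, seed=0):
    """BAOAB integrator for dX = V dt, dV = -grad U(X) dt - gamma V dt + sqrt(2 gamma) dW
    with U(x) = x^2 / 2 at unit temperature; the O-step solves the
    Ornstein-Uhlenbeck part of the velocity update exactly."""
    rng = random.Random(seed)
    x, v = 0.0, 0.0
    c1 = math.exp(-gamma * dt)
    c2 = math.sqrt(1 - c1 * c1)
    xs = []
    for _ in range(n_steps):
        v -= 0.5 * dt * x                    # B: half kick, grad U(x) = x
        x += 0.5 * dt * v                    # A: half drift
        v = c1 * v + c2 * rng.gauss(0, 1)    # O: exact OU velocity refresh
        x += 0.5 * dt * v                    # A: half drift
        v -= 0.5 * dt * x                    # B: half kick
        xs.append(x)
    return xs

xs = underdamped_langevin()
# the empirical spatial marginal should approach N(0, 1)
```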
Explicit convergence rates of underdamped Langevin dynamics under weighted and weak Poincaré--Lions inequalities. arXiv:2407.16033, arXiv - STAT - Computation, 2024-07-22.
H. Sherry Zhang, Dianne Cook, Nicolas Langrené, Jessica Wai Yin Leung
The projection pursuit (PP) guided tour interactively optimises a criterion function known as the PP index, to explore high-dimensional data by revealing interesting projections. The optimisation in PP can be non-trivial, involving non-smooth functions and optima with a small squint angle, detectable only from close proximity. To address these challenges, this study investigates the performance of a recently introduced swarm-based algorithm, the Jellyfish Search Optimiser (JSO), for optimising PP indexes. The performance of JSO for visualising data is evaluated across various hyper-parameter settings and compared with existing optimisers. Additionally, this work proposes novel methods to quantify two properties of the PP index, smoothness and squintability, that capture the complexities inherent in PP optimisation problems. These two metrics are evaluated along with JSO hyper-parameters to determine their effects on JSO success rate. Our numerical results confirm the positive impact of these metrics on the JSO success rate, with squintability being the most significant. The JSO algorithm has been implemented in the tourr package and functions to calculate smoothness and squintability are available in the ferrn package.
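The JSO update rules can be sketched compactly. The sketch below is a simplified variant of the algorithm of Chou and Truong (2021): a time-control function switches between following the ocean current and local passive/active motions. The greedy acceptance step and all constants are assumptions of this toy, and it is demonstrated on a sphere function rather than a PP index:

```python
import random

def jellyfish_search(f, dim=5, lb=-5.0, ub=5.0, pop=30, iters=300, seed=0):
    """Simplified Jellyfish Search Optimiser: minimise f over [lb, ub]^dim."""
    rng = random.Random(seed)
    X = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(pop)]
    best = min(X, key=f)
    for t in range(1, iters + 1):
        c = abs((1 - t / iters) * (2 * rng.random() - 1))     # time control
        mean = [sum(x[d] for x in X) / pop for d in range(dim)]
        for i in range(pop):
            if c >= 0.5:
                # ocean current: drift toward the best individual
                trend = [best[d] - 3 * rng.random() * mean[d] for d in range(dim)]
                cand = [X[i][d] + rng.random() * trend[d] for d in range(dim)]
            elif rng.random() > 1 - c:
                # passive motion: small move around the current position
                cand = [X[i][d] + 0.1 * rng.random() * (ub - lb) for d in range(dim)]
            else:
                # active motion: move toward (or away from) a random jellyfish
                j = rng.randrange(pop)
                direction = 1.0 if f(X[j]) < f(X[i]) else -1.0
                cand = [X[i][d] + direction * rng.random() * (X[j][d] - X[i][d])
                        for d in range(dim)]
            cand = [min(ub, max(lb, z)) for z in cand]         # stay in bounds
            if f(cand) < f(X[i]):                              # greedy acceptance
                X[i] = cand
        best = min(X + [best], key=f)
    return best

sphere = lambda x: sum(z * z for z in x)
sol = jellyfish_search(sphere)
```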
Studying the Performance of the Jellyfish Search Optimiser for the Application of Projection Pursuit. arXiv:2407.13663, arXiv - STAT - Computation, 2024-07-18.
Lingbin Bian, Nizhuan Wang, Yuanning Li, Adeel Razi, Qian Wang, Han Zhang, Dinggang Shen, the UNC/UMN Baby Connectome Project Consortium
The segregation and integration of infant brain networks undergo tremendous changes due to the rapid development of brain function and organization. Traditional methods for estimating brain modularity usually rely on group-averaged functional connectivity (FC), often overlooking individual variability. To address this, we introduce a novel approach utilizing Bayesian modeling to analyze the dynamic development of functional modules in infants over time. This method retains inter-individual variability and, in comparison to conventional group-averaging techniques, more effectively detects modules, taking into account the stationarity of module evolution. Furthermore, we explore gender differences in module development under awake and sleep conditions by assessing modular similarities. Our results show that female infants demonstrate more distinct modular structures between these two conditions, possibly implying quieter and more restful sleep than in male infants.
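Comparing modular structure between conditions amounts to comparing two partitions of the same nodes. The abstract does not specify which similarity measure is used, so as a generic illustration, the adjusted Rand index (a standard partition-similarity score: 1 for identical partitions, about 0 for chance agreement) can be computed as follows, with hypothetical module labels:

```python
from math import comb

def adjusted_rand_index(a, b):
    """Adjusted Rand index between two module assignments of the same nodes."""
    n = len(a)
    pair_counts = {}
    for la, lb in zip(a, b):
        pair_counts[(la, lb)] = pair_counts.get((la, lb), 0) + 1
    sum_ij = sum(comb(v, 2) for v in pair_counts.values())   # agreeing pairs term
    ca, cb = {}, {}
    for la in a: ca[la] = ca.get(la, 0) + 1
    for lb in b: cb[lb] = cb.get(lb, 0) + 1
    sum_a = sum(comb(v, 2) for v in ca.values())
    sum_b = sum(comb(v, 2) for v in cb.values())
    expected = sum_a * sum_b / comb(n, 2)                    # chance agreement
    max_index = (sum_a + sum_b) / 2
    return (sum_ij - expected) / (max_index - expected)

awake = [0, 0, 0, 1, 1, 1, 2, 2, 2]   # hypothetical module labels per node
sleep = [0, 0, 1, 1, 1, 2, 2, 2, 2]
sim = adjusted_rand_index(awake, sleep)
```

Under this reading, "more distinct modular structures between conditions" corresponds to a lower between-condition similarity score.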
Evaluating the evolution and inter-individual variability of infant functional module development from 0 to 5 years old. arXiv:2407.13118, arXiv - STAT - Computation, 2024-07-18.
We assess an emerging simulation research method -- Inverse Generative Social Science (IGSS) \citep{Epstein23a} -- that harnesses the power of evolution by natural selection to model and explain complex targets. Drawing on a review of recent papers that use IGSS, and by applying it in two different studies of conflict, we here assess its potential both as a modelling approach and as formal theory. We find that IGSS has potential for research in studies of organisations. IGSS offers two major advantages over most other approaches to modelling: 1) it can fit complex non-linear models to a target, and 2) the fitted models can be interpreted as social theory. The paper presents IGSS to a new audience, illustrates how it can contribute, and provides software that can be used as a basis for an IGSS study.
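The evolutionary core of IGSS — searching over candidate generative rules by selection and mutation until the rules reproduce an observed target — can be shown in miniature. The sketch below evolves parameter vectors against a toy target; the hidden rule, the fitness, and all hyper-parameters are hypothetical, and the authors' actual software evolves far richer rule representations:

```python
import random

def evolve(fitness, dim=3, pop_size=40, gens=150, sigma=0.3, seed=0):
    """Minimal evolutionary loop: candidate 'rules' are parameter vectors,
    selection keeps the best half, and mutated copies of survivors refill
    the population; mutation size anneals over generations."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-2, 2) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]
        pop = survivors + [[g + rng.gauss(0, sigma) for g in rng.choice(survivors)]
                           for _ in range(pop_size - len(survivors))]
        sigma *= 0.97
    return min(pop, key=fitness)

# Toy target: a hidden rule generated the observed outputs; fitness measures
# how badly a candidate rule reproduces them (lower is better).
hidden = (1.0, -0.5, 0.25)
inputs = [(x, x * x) for x in range(-5, 6)]
target = [hidden[0] + hidden[1] * u + hidden[2] * v for u, v in inputs]

def fitness(p):
    return sum((p[0] + p[1] * u + p[2] * v - t) ** 2
               for (u, v), t in zip(inputs, target))

best = evolve(fitness)
```

The "inverse" in IGSS is exactly this direction of inference: rather than hand-writing agent rules and checking their output, the rules are discovered from the target behaviour.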
Examining inverse generative social science to study targets of interest. Thomas Chesney, Asif Jaffer, Robert Pasley. arXiv:2407.13474, arXiv - STAT - Computation, 2024-07-18.
Vincent Guigues, Anton J. Kleywegt, Giovanni Amorim, Andre Krauss, Victor Hugo Nascimento
This is the User Manual of the LASPATED library. This library is available on GitHub (at https://github.com/vguigues/LASPATED) and provides a set of tools to analyze spatiotemporal data. A video tutorial for this library is available on YouTube. It consists of a Python package for time and space discretizations and of two packages (one in Matlab and one in C++) implementing the calibration of the probabilistic models for stochastic spatio-temporal data proposed in the companion paper arXiv:2203.16371v2.
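The space-time discretization step that such a library performs can be sketched generically: map each event (x, y, t) to a spatial cell and time period and count events per cell. This is a plain illustration of the concept, not the LASPATED API (the function name and arguments are assumptions):

```python
from collections import Counter

def discretize(events, x0, y0, t0, dx, dy, dt):
    """Bin events (x, y, t) into a regular space-time grid anchored at
    (x0, y0, t0) with cell sizes (dx, dy, dt); returns counts per cell."""
    counts = Counter()
    for x, y, t in events:
        cell = (int((x - x0) // dx), int((y - y0) // dy), int((t - t0) // dt))
        counts[cell] += 1
    return counts

events = [(1.2, 3.4, 0.5), (1.8, 3.9, 0.7), (4.0, 0.2, 2.5)]
grid = discretize(events, x0=0, y0=0, t0=0, dx=1, dy=1, dt=1)
# the first two events fall in the same space-time cell (1, 3, 0)
```

Such per-cell counts are the typical input for calibrating spatio-temporal intensity models of the kind the companion paper proposes.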
LASPATED: A Library for the Analysis of Spatio-Temporal Discrete Data (User Manual). arXiv:2407.13889, arXiv - STAT - Computation, 2024-07-18.