Modeling Latent Neural Dynamics with Gaussian Process Switching Linear Dynamical Systems.

arXiv Pub Date: 2025-01-13
Amber Hu, David Zoltowski, Aditya Nair, David Anderson, Lea Duncker, Scott Linderman
Journal: ArXiv
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11774443/pdf/
Citation count: 0

Abstract

Understanding how the collective activity of neural populations relates to computation and ultimately behavior is a key goal in neuroscience. To this end, statistical methods which describe high-dimensional neural time series in terms of low-dimensional latent dynamics have played a fundamental role in characterizing neural systems. Yet, what constitutes a successful method involves two opposing criteria: (1) methods should be expressive enough to capture complex nonlinear dynamics, and (2) they should maintain a notion of interpretability often only warranted by simpler linear models. In this paper, we develop an approach that balances these two objectives: the Gaussian Process Switching Linear Dynamical System (gpSLDS). Our method builds on previous work modeling the latent state evolution via a stochastic differential equation whose nonlinear dynamics are described by a Gaussian process (GP-SDEs). We propose a novel kernel function which enforces smoothly interpolated locally linear dynamics, and therefore expresses flexible - yet interpretable - dynamics akin to those of recurrent switching linear dynamical systems (rSLDS). Our approach resolves key limitations of the rSLDS such as artifactual oscillations in dynamics near discrete state boundaries, while also providing posterior uncertainty estimates of the dynamics. To fit our models, we leverage a modified learning objective which improves the estimation accuracy of kernel hyperparameters compared to previous GP-SDE fitting approaches. We apply our method to synthetic data and data recorded in two neuroscience experiments and demonstrate favorable performance in comparison to the rSLDS.
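The kernel construction itself is the paper's contribution; the sketch below only illustrates the *form* of dynamics it targets — a drift function that smoothly interpolates between locally linear regimes via softmax weights, akin to the rSLDS but without hard discrete switches — simulated with Euler–Maruyama. This is a toy illustration, not the gpSLDS model: the matrices, boundary parameters, and the interpolation temperature `beta` are all invented for the example.

```python
import numpy as np

def softmax(u):
    """Numerically stable softmax over a 1-D array of logits."""
    u = u - u.max()
    e = np.exp(u)
    return e / e.sum()

def locally_linear_drift(x, As, bs, Ws, cs, beta=1.0):
    """Smoothly interpolated locally linear drift:
    f(x) = sum_k pi_k(x) * (A_k @ x + b_k),
    where pi_k(x) = softmax(beta * (W_k @ x + c_k)) softly assigns x
    to regime k. Large beta approaches hard rSLDS-style switching.
    """
    logits = np.array([w @ x + c for w, c in zip(Ws, cs)])
    pi = softmax(beta * logits)
    return sum(p * (A @ x + b) for p, A, b in zip(pi, As, bs))

def euler_maruyama(x0, drift, sigma, dt, n_steps, rng):
    """Simulate dx = f(x) dt + sigma dW with the Euler-Maruyama scheme."""
    xs = [np.asarray(x0, dtype=float)]
    for _ in range(n_steps):
        x = xs[-1]
        noise = rng.standard_normal(x.shape) * np.sqrt(dt)
        xs.append(x + drift(x) * dt + sigma * noise)
    return np.stack(xs)

# Toy example: two linear regimes separated by a soft boundary at x_1 = 0.
A1 = np.array([[0.0, -1.0], [1.0, 0.0]])    # rotational dynamics
A2 = np.array([[-0.5, 0.0], [0.0, -0.5]])   # contraction toward the origin
As, bs = [A1, A2], [np.zeros(2), np.zeros(2)]
Ws, cs = [np.array([1.0, 0.0]), np.array([-1.0, 0.0])], [0.0, 0.0]

rng = np.random.default_rng(0)
drift = lambda x: locally_linear_drift(x, As, bs, Ws, cs, beta=5.0)
traj = euler_maruyama(np.array([1.0, 0.0]), drift,
                      sigma=0.05, dt=0.01, n_steps=500, rng=rng)
print(traj.shape)  # (501, 2)
```

Because the regime weights vary smoothly with the state, the drift is continuous everywhere — which is the property the proposed kernel builds into the GP prior, avoiding the artifactual oscillations that hard state boundaries can induce in the rSLDS.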

