Implicit Gaussian process representation of vector fields over arbitrary latent manifolds

Robert L. Peach, Matteo Vinao-Carl, Nir Grossman, Michael David, Emma Mallas, David Sharp, Paresh A. Malhotra, Pierre Vandergheynst, Adam Gosztolai

arXiv - CS - Mathematical Software · Published 2023-09-28 · arXiv:2309.16746
Abstract
Gaussian processes (GPs) are popular nonparametric statistical models for learning unknown functions and quantifying spatiotemporal uncertainty in data. Recent works have extended GPs to model scalar and vector quantities distributed over non-Euclidean domains, including smooth manifolds that appear in numerous fields such as computer vision, dynamical systems, and neuroscience. However, these approaches assume that the manifold underlying the data is known, limiting their practical utility. We introduce RVGP, a generalisation of GPs for learning vector signals over latent Riemannian manifolds. Our method uses positional encoding with eigenfunctions of the connection Laplacian associated with the tangent bundle, readily derived from a common graph-based approximation of the data. We demonstrate that RVGP possesses global regularity over the manifold, allowing it to super-resolve and inpaint vector fields while preserving singularities. Furthermore, we use RVGP to reconstruct high-density neural dynamics from low-density EEG recordings in healthy individuals and Alzheimer's patients. We show that vector field singularities are important disease markers and that their reconstruction yields a classification accuracy for disease states comparable to that of high-density recordings. Thus, our method overcomes a significant practical limitation in experimental and clinical applications.
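To illustrate the general pipeline the abstract describes — approximate a latent manifold with a graph, use low-frequency Laplacian eigenvectors as positional encodings, and regress a sparsely observed vector field in that encoding space — here is a minimal, self-contained sketch. Note the simplifications: it uses the ordinary graph Laplacian rather than the connection Laplacian of the tangent bundle, and kernel ridge regression (the GP posterior mean with fixed hyperparameters) rather than a full GP; all names and parameter choices below are illustrative, not the authors' RVGP implementation.

```python
# Hedged sketch: graph -> Laplacian eigenvector encodings -> vector-field inpainting.
# Stand-ins: ordinary graph Laplacian (not the connection Laplacian) and
# kernel ridge regression (a GP posterior mean with fixed hyperparameters).
import numpy as np

# Sample points on a circle: a simple 1-D latent manifold embedded in R^2.
n = 100
theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
X = np.stack([np.cos(theta), np.sin(theta)], axis=1)

# k-nearest-neighbour graph approximating the manifold from the point cloud.
k = 4
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
W = np.zeros((n, n))
for i in range(n):
    for j in np.argsort(D[i])[1 : k + 1]:
        W[i, j] = W[j, i] = 1.0

# Unnormalised graph Laplacian; its low-frequency eigenvectors serve as
# positional encodings of the latent manifold (Fourier modes on the circle).
L = np.diag(W.sum(axis=1)) - W
evals, evecs = np.linalg.eigh(L)
P = evecs[:, 1:9]  # skip the constant mode; keep 8 non-trivial eigenvectors

# Target signal: the tangential vector field on the circle, observed sparsely.
F = np.stack([-np.sin(theta), np.cos(theta)], axis=1)
obs = np.arange(0, n, 10)  # observe 10% of the points

def rbf(A, B, ls=0.3):
    """Squared-exponential kernel on the positional encodings."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2 * ls**2))

# Kernel ridge regression in encoding space inpaints the field everywhere.
K = rbf(P[obs], P[obs]) + 1e-6 * np.eye(len(obs))
F_pred = rbf(P, P[obs]) @ np.linalg.solve(K, F[obs])

print("max error at observed points:", np.abs(F_pred[obs] - F[obs]).max())
```

Because the eigenvector encodings vary smoothly along the manifold, a kernel on the encodings is effectively a kernel on intrinsic (geodesic-like) distance, which is what lets the regression interpolate the field between sparse observations while respecting the manifold's geometry.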