{"title":"Generative learning for nonlinear dynamics","authors":"William Gilpin","doi":"10.1038/s42254-024-00688-2","DOIUrl":null,"url":null,"abstract":"Modern generative machine learning models are able to create realistic outputs far beyond their training data, such as photorealistic artwork, accurate protein structures or conversational text. These successes suggest that generative models learn to effectively parametrize and sample arbitrarily complex distributions. Beginning half a century ago, foundational works in nonlinear dynamics used tools from information theory for a similar purpose, namely, to infer properties of chaotic attractors from real-world time series. This Perspective article aims to connect these classical works to emerging themes in large-scale generative statistical learning. It focuses specifically on two classical problems: reconstructing dynamical manifolds given partial measurements, which parallels modern latent variable methods, and inferring minimal dynamical motifs underlying complicated data sets, which mirrors interpretability probes for trained models. Generative machine learning models seek to approximate and then sample the probability distribution of the data sets on which they are trained. This Perspective article connects these methods to historical studies of information processing and attractor geometry in nonlinear systems.","PeriodicalId":19024,"journal":{"name":"Nature Reviews Physics","volume":"6 3","pages":"194-206"},"PeriodicalIF":44.8000,"publicationDate":"2024-02-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Nature Reviews Physics","FirstCategoryId":"101","ListUrlMain":"https://www.nature.com/articles/s42254-024-00688-2","RegionNum":1,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PHYSICS, APPLIED","Score":null,"Total":0}
Citations: 0
Abstract
Modern generative machine learning models are able to create realistic outputs far beyond their training data, such as photorealistic artwork, accurate protein structures or conversational text. These successes suggest that generative models learn to effectively parametrize and sample arbitrarily complex distributions. Beginning half a century ago, foundational works in nonlinear dynamics used tools from information theory for a similar purpose, namely, to infer properties of chaotic attractors from real-world time series. This Perspective article aims to connect these classical works to emerging themes in large-scale generative statistical learning. It focuses specifically on two classical problems: reconstructing dynamical manifolds given partial measurements, which parallels modern latent variable methods, and inferring minimal dynamical motifs underlying complicated data sets, which mirrors interpretability probes for trained models. Generative machine learning models seek to approximate and then sample the probability distribution of the data sets on which they are trained. This Perspective article connects these methods to historical studies of information processing and attractor geometry in nonlinear systems.
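The first of the two classical problems named above, reconstructing a dynamical manifold from partial measurements, is traditionally addressed with delay-coordinate (Takens) embedding: a single measured coordinate is lifted into a higher-dimensional space by stacking lagged copies of itself. The following minimal Python sketch is an illustration only, not code from the article; the Lorenz example, the delay_embed helper, and the lag and dimension choices are assumptions made for demonstration.

```python
# Minimal sketch of delay-coordinate embedding (Takens' construction).
# The Lorenz system, the delay_embed helper, and all parameters here are
# illustrative assumptions, not the article's method or code.
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Standard Lorenz system, used here only to generate a chaotic series."""
    x, y, z = state
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

# Integrate, then keep only one coordinate: the "partial measurement".
t_eval = np.linspace(0, 100, 10_000)
sol = solve_ivp(lorenz, (0, 100), [1.0, 1.0, 1.0], t_eval=t_eval)
x = sol.y[0]  # scalar time series x(t)

def delay_embed(series, dim=3, tau=50):
    """Stack lagged copies [s(t), s(t + tau), ..., s(t + (dim - 1) tau)]
    of a scalar series into delay vectors."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

embedded = delay_embed(x, dim=3, tau=50)
print(embedded.shape)  # (n_points, 3)
```

For a suitable lag and embedding dimension, Takens' theorem guarantees that the resulting point cloud is diffeomorphic to the original attractor, which is the sense in which partial measurements suffice to reconstruct the dynamical manifold, and the parallel the article draws to latent variable methods in generative models.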
About the journal
Nature Reviews Physics is an online-only reviews journal, part of the Nature Reviews portfolio of journals. It publishes high-quality technical reference, review, and commentary articles in all areas of fundamental and applied physics. The journal offers a range of content types, including Reviews, Perspectives, Roadmaps, Technical Reviews, Expert Recommendations, Comments, Editorials, Research Highlights, Features, and News & Views, which cover significant advances in the field and topical issues. Nature Reviews Physics has been published monthly since January 2019 and does not have external academic editors; instead, all editorial decisions are made by a dedicated team of full-time professional editors.