
SIAM Review: Latest Articles

Book Review: Essential Statistics for Data Science: A Concise Crash Course
IF 10.2 · CAS Tier 1 (Mathematics) · Q1 in MATHEMATICS, APPLIED · Pub Date: 2025-02-06 · DOI: 10.1137/24m167562x
David Banks
SIAM Review, Volume 67, Issue 1, Page 206-207, March 2025.
This is a bold book! Professor Zhu wants to provide the basic statistical knowledge needed by data scientists in a super-short volume. It reminds me a bit of Larry Wasserman’s All of Statistics (Springer, 2014), but is aimed at Master’s students (often from fields other than statistics) or advanced undergraduates (also often from other fields). As an attendee at far too many faculty meetings, I applaud brevity and focus. As an amateur stylist, I admire strong technical writing. And as an applied statistician who has taught basic statistics to Master’s and Ph.D. students from other disciplines, I appreciate the need for a book of this kind. For the right course I would happily use this book, although I would need to supplement it with other material.
Citations: 0
The Troublesome Kernel: On Hallucinations, No Free Lunches, and the Accuracy-Stability Tradeoff in Inverse Problems
IF 10.2 · CAS Tier 1 (Mathematics) · Q1 in MATHEMATICS, APPLIED · Pub Date: 2025-02-06 · DOI: 10.1137/23m1568739
Nina M. Gottschling, Vegard Antun, Anders C. Hansen, Ben Adcock
SIAM Review, Volume 67, Issue 1, Page 73-104, March 2025.
Abstract. Methods inspired by artificial intelligence (AI) are starting to fundamentally change computational science and engineering through breakthrough performance on challenging problems. However, the reliability and trustworthiness of such techniques are a major concern. In inverse problems in imaging, the focus of this paper, there is increasing empirical evidence that methods may suffer from hallucinations, i.e., false but realistic-looking artifacts; instability, i.e., sensitivity to perturbations in the data; and unpredictable generalization, i.e., excellent performance on some images but significant deterioration on others. This paper provides a theoretical foundation for these phenomena. We give mathematical explanations for how and when such effects arise in arbitrary reconstruction methods, with several of our results taking the form of “no free lunch” theorems. Specifically, we show that (i) methods that overperform on a single image can wrongly transfer details from one image to another, creating a hallucination; (ii) methods that overperform on two or more images can hallucinate or be unstable; (iii) optimizing the accuracy-stability tradeoff is generally difficult; (iv) hallucinations and instabilities, if they occur, are not rare events and may be encouraged by standard training; and (v) it may be impossible to construct optimal reconstruction maps for certain problems. Our results trace these effects to the kernel of the forward operator whenever it is nontrivial, but they also apply when the forward operator is ill-conditioned. Based on these insights, our work aims to spur research into new ways to develop robust and reliable AI-based methods for inverse problems in imaging.
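The abstract's central observation, that trouble traces back to a nontrivial kernel of the forward operator, can be illustrated with a toy example (ours, not the paper's): two very different signals become indistinguishable after measurement whenever their difference lies in the kernel.

```python
import numpy as np

# Toy illustration (ours, not the paper's): an underdetermined forward
# operator A has a nontrivial kernel, so signals differing by a kernel
# element produce identical measurements. No reconstruction map can
# tell them apart from the data alone.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))   # 3 measurements of a 5-dimensional signal

# A unit vector in the null space of A, taken from the SVD.
_, _, Vt = np.linalg.svd(A)
null_vec = Vt[-1]                 # A @ null_vec is (numerically) zero

x_true = rng.standard_normal(5)
x_fake = x_true + 3.0 * null_vec  # a very different signal...

# ...with exactly the same measurements: a reconstruction that returns
# x_fake for the data A @ x_true is a "hallucination" in the paper's sense.
assert np.allclose(A @ x_true, A @ x_fake)
assert not np.allclose(x_true, x_fake)
```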
Citations: 0
Book Review: Numerical Methods in Physics with Python. Second Edition
IF 10.2 · CAS Tier 1 (Mathematics) · Q1 in MATHEMATICS, APPLIED · Pub Date: 2025-02-06 · DOI: 10.1137/24m1650466
Gabriele Ciaramella
SIAM Review, Volume 67, Issue 1, Page 204-205, March 2025.
Numerical Methods in Physics with Python by Alex Gezerlis is an excellent example of a textbook built on long and established teaching experience. The goals are clearly defined in the preface: Gezerlis aims to gently introduce undergraduate physics students to numerical methods and their concrete implementation in Python. To this end, the author takes a physics-applications-first approach. Every chapter begins with a motivating section on real physics problems (simple but well suited to undergraduate students), ends with a concrete project on a physics application, and is rounded out by a rich list of exercises often designed with a physics appeal.
Citations: 0
Limits of Learning Dynamical Systems
IF 10.2 · CAS Tier 1 (Mathematics) · Q1 in MATHEMATICS, APPLIED · Pub Date: 2025-02-06 · DOI: 10.1137/24m1696974
Tyrus Berry, Suddhasattwa Das
SIAM Review, Volume 67, Issue 1, Page 107-137, March 2025.
Abstract. A dynamical system is a transformation of a phase space; the transformation law is the primary means of both defining and identifying the dynamical system, and it is the focus of many learning techniques. However, there are many secondary aspects of dynamical systems—invariant sets, the Koopman operator, and Markov approximations—that provide alternative objectives for learning techniques. Crucially, while many learning methods focus on the transformation law, we find that forecast performance can depend on how well these other aspects of the dynamics are approximated. These different facets of a dynamical system correspond to objects in completely different spaces—namely, interpolation spaces, compact Hausdorff sets, unitary operators, and Markov operators, respectively. Thus, learning techniques targeting any of these four facets perform different kinds of approximations. We examine whether an approximation of any one of these aspects of the dynamics could lead to an approximation of another facet. Many connections and obstructions are brought to light in this analysis. Special focus is placed on methods of learning the primary feature—the dynamics law itself. The main question considered is the connection between learning this law and reconstructing the Koopman operator and the invariant set. The answers are tied to the ergodic and topological properties of the dynamics, and they reveal how these properties determine the limits of forecasting techniques.
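One of the facets the abstract names, the Koopman operator, can be approximated from trajectory data by plain least squares. The sketch below is our illustration (not code from the paper), using the coordinates themselves as observables in the spirit of extended dynamic mode decomposition (EDMD); for a linear map the fit recovers the Koopman matrix exactly.

```python
import numpy as np

# Illustrative sketch (ours, not the paper's): for the linear map
# x -> M x, the Koopman operator restricted to linear observables is
# represented by M^T, and a least-squares fit to trajectory pairs
# (x, M x) recovers it -- the simplest instance of EDMD-style learning.
M = np.array([[0.9, -0.2],
              [0.1,  0.8]])

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 2))   # sampled states x_i (one per row)
Y = X @ M.T                         # their images M x_i under the dynamics

# Solve X K ~= Y in least squares; K approximates the Koopman matrix
# on the chosen observables.
K, *_ = np.linalg.lstsq(X, Y, rcond=None)
assert np.allclose(K, M.T)
```

For nonlinear dynamics one would enlarge the dictionary of observables beyond the coordinates, and the abstract's point is precisely that how well such surrogates capture the other facets (invariant sets, ergodic properties) limits what forecasting can achieve.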
Citations: 0
Featured Review: Numerical Integration of Differential Equations
IF 10.2 · CAS Tier 1 (Mathematics) · Q1 in MATHEMATICS, APPLIED · Pub Date: 2025-02-06 · DOI: 10.1137/24m1678684
John C. Butcher, Robert M. Corless
SIAM Review, Volume 67, Issue 1, Page 197-204, March 2025.
The book under review was originally published under the auspices of the National Research Council in 1933 (the year John was born), and it was republished as a Dover edition in 1956 (three years before Rob was born). At 108 pages—including title page, preface, table of contents, and index—it’s very short. Even so, it contains a significant amount of information that was of technical importance for its time and is of historical importance now.
Citations: 0
Research Spotlights
IF 10.2 · CAS Tier 1 (Mathematics) · Q1 in MATHEMATICS, APPLIED · Pub Date: 2025-02-06 · DOI: 10.1137/24m1691442
Stefan M. Wild
SIAM Review, Volume 67, Issue 1, Page 71-71, March 2025.
Citations: 0
SIGEST
IF 10.2 · CAS Tier 1 (Mathematics) · Q1 in MATHEMATICS, APPLIED · Pub Date: 2025-02-06 · DOI: 10.1137/24m1691454
The Editors
SIAM Review, Volume 67, Issue 1, Page 105-105, March 2025.
Citations: 0
Graph Neural Networks and Applied Linear Algebra
IF 10.2 · CAS Tier 1 (Mathematics) · Q1 in MATHEMATICS, APPLIED · Pub Date: 2025-02-06 · DOI: 10.1137/23m1609786
Nicholas S. Moore, Eric C. Cyr, Peter Ohm, Christopher M. Siefert, Raymond S. Tuminaro
SIAM Review, Volume 67, Issue 1, Page 141-175, March 2025.
Abstract. Sparse matrix computations are ubiquitous in scientific computing. Given the recent interest in scientific machine learning, it is natural to ask how sparse matrix computations can leverage neural networks (NNs). Unfortunately, multilayer perceptron (MLP) NNs are typically not natural for either graph or sparse matrix computations. The issue is that MLPs require fixed-sized inputs, while scientific applications generally generate sparse matrices with arbitrary dimensions and a wide range of different nonzero patterns (or matrix graph vertex interconnections). While convolutional NNs could possibly address matrix graphs where all vertices have the same number of nearest neighbors, a more general approach is needed for arbitrary sparse matrices, e.g., those arising from discretized partial differential equations on unstructured meshes. Graph neural networks (GNNs) are one such approach suited to sparse matrices. The key idea is to define aggregation functions (e.g., summations) that operate on variable-size input data to produce data of a fixed output size so that MLPs can be applied. The goal of this paper is to provide an introduction to GNNs for a numerical linear algebra audience. Concrete GNN examples are provided to illustrate how many common linear algebra tasks can be accomplished using GNNs. We focus on iterative and multigrid methods that employ computational kernels such as matrix-vector products, interpolation, relaxation methods, and strength-of-connection measures. Our GNN examples include cases where parameters are determined a priori as well as cases where parameters must be learned. The intent of this paper is to help computational scientists understand how GNNs can be used to adapt machine learning concepts to computational tasks associated with sparse matrices. It is hoped that this understanding will further stimulate data-driven extensions of classical sparse linear algebra tasks.
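The aggregation idea in the abstract can be made concrete with the simplest kernel it mentions, the matrix-vector product: y = A x is exactly one round of sum-aggregation over the edges of the matrix graph, where vertex i collects the message a_ij * x_j from each neighbor j. A minimal sketch (our names and edge-list encoding, not the paper's code):

```python
import numpy as np

# Sketch (ours, not the paper's): a sparse matrix-vector product viewed
# as one round of graph message passing -- each vertex sum-aggregates
# messages a_ij * x_j over its incoming edges. The sum handles any
# number of neighbors, which is why MLPs can then be applied on top.
def spmv_message_passing(edges, x):
    """edges: list of (i, j, a_ij) nonzeros; x: one feature per vertex."""
    y = np.zeros_like(x, dtype=float)
    for i, j, a_ij in edges:
        y[i] += a_ij * x[j]          # sum-aggregation over edges into i
    return y

A = np.array([[2., 0., 1.],
              [0., 3., 0.],
              [1., 0., 4.]])
edges = [(i, j, A[i, j]) for i in range(3) for j in range(3) if A[i, j] != 0]
x = np.array([1., 2., 3.])
assert np.allclose(spmv_message_passing(edges, x), A @ x)
```

A learned GNN replaces the fixed weights a_ij and the identity update with trainable functions, but the variable-size-to-fixed-size aggregation step is the same.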
Citations: 0
Book Review: Probability Adventures
IF 10.2 · CAS Tier 1 (Mathematics) · Q1 in MATHEMATICS, APPLIED · Pub Date: 2025-02-06 · DOI: 10.1137/24m1646108
Nevena Marić
SIAM Review, Volume 67, Issue 1, Page 205-206, March 2025.
The first look at Probability Adventures brought back memories of a conference in Ubatuba, Brazil, in 2001, where as a young Master’s student I worried that true science had to be deadly serious. Fortunately, several inspiring teachers came to the rescue. Andrei Toom’s words resonated deeply with me when he began his lecture by saying, “Every mathematician is a big child.” The esteemed audience beamed with approval. Today, I look at Probability Adventures and applaud Mark Huber for honoring the child in all of us and offering a reading that is both fun and mathematically rigorous.
Citations: 0
Book Review: Elegant Simulations. From Simple Oscillators to Many-Body Systems
IF 10.2 · CAS Tier 1 (Mathematics) · Q1 in MATHEMATICS, APPLIED · Pub Date: 2025-02-06 · DOI: 10.1137/24m1690953
Omar Morandi
SIAM Review, Volume 67, Issue 1, Page 207-208, March 2025.
Elegant Simulations covers various aspects of modeling and simulating mechanical systems described at the elementary level by many interacting particles. The book presents its topics from an original and fresh point of view: the complex many-body dynamics is reproduced at the elementary level in terms of simple models that are easy to understand and interpret. The principal benefit for the reader is that this approach helps develop an intuitive picture of the complex many-body dynamics.
Citations: 0