{"title":"Survey and Review","authors":"Marlis Hochbruck","doi":"10.1137/23n975673","DOIUrl":null,"url":null,"abstract":"A point process is called self-exciting if the arrival of an event increases the probability of similar events for some period of time. Typical examples include earthquakes, which frequently cause aftershocks due to increased geological tension in their region; raised intrusion rates in the vicinity of a burglary; retweets in social media incited by some provocative posting; or trading frenzies following a huge stock order. A Hawkes process is a point process that models self-excitement among time events. In contrast to a Markov chain (in which the probability of each event depends only on the state attained in the previous event), chances of arrival of events are increased for some time period after the initial arrival in a Hawkes process. The first Survey and Review paper in this issue, “Hawkes Processes Modeling, Inference, and Control: An Overview,” by Rafael Lima, discusses recent advances in Hawkes process modeling and inference. The parametric, nonparametric, deep learning, and reinforcement learning approaches are covered. Current research challenges for the topic and the real-world limitations of each approach are also addressed. The paper should be of interest to experts in the field, but it also aims to be suitable for newcomers. The second Survey and Review paper, “Proximal Splitting Algorithms for Convex Optimization: A Tour of Recent Advances, with New Twists,” by Laurent Condat, Daichi Kitahara, Andrés Contreras, and Akira Hirabayashi, is dedicated to the solution of convex nonsmooth optimization problems in high-dimensional spaces. The objective function $f$ is assumed to be a sum of simple convex functions $f_j$ with the property that the minimization problem for each $f_j$ is simple, but for $f$ it is hard. For nonsmooth functions, gradient-based optimization algorithms are infeasible. In proximal algorithms, the gradient is replaced by the so-called proximity operator. While closed forms of proximity operators are known for many functions of practical interest, there is no general closed form for the proximity operator of a sum of functions. Therefore, splitting algorithms handle the proximity operators of the functions $f_j$ individually. The paper provides a constructive and self-contained introduction to the class of proximal splitting algorithms. New variants of the algorithms under consideration are developed. Existing convergence results are revisited, unified, and, in some cases, improved. Reading the paper will be rewarding for anyone interested in high-dimensional nonsmooth convex optimization.","PeriodicalId":49525,"journal":{"name":"SIAM Review","volume":"21 1","pages":"0"},"PeriodicalIF":10.8000,"publicationDate":"2023-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"SIAM Review","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1137/23n975673","RegionNum":1,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Abstract
A point process is called self-exciting if the arrival of an event increases the probability of similar events for some period of time. Typical examples include earthquakes, which frequently cause aftershocks due to increased geological tension in their region; elevated break-in rates in the vicinity of a recent burglary; retweets in social media triggered by a provocative posting; or trading frenzies following a large stock order. A Hawkes process is a point process that models such self-excitement among events in time. In contrast to a Markov chain, in which the probability of each event depends only on the state attained in the previous event, in a Hawkes process the chance of further arrivals remains elevated for some period of time after each arrival. The first Survey and Review paper in this issue, “Hawkes Processes Modeling, Inference, and Control: An Overview,” by Rafael Lima, discusses recent advances in Hawkes process modeling and inference. Parametric, nonparametric, deep learning, and reinforcement learning approaches are covered. Current research challenges for the topic and the real-world limitations of each approach are also addressed. The paper should be of interest to experts in the field, but it also aims to be suitable for newcomers.

The second Survey and Review paper, “Proximal Splitting Algorithms for Convex Optimization: A Tour of Recent Advances, with New Twists,” by Laurent Condat, Daichi Kitahara, Andrés Contreras, and Akira Hirabayashi, is dedicated to the solution of convex nonsmooth optimization problems in high-dimensional spaces. The objective function $f$ is assumed to be a sum of convex functions $f_j$ with the property that minimizing each $f_j$ individually is easy, whereas minimizing $f$ directly is hard. For nonsmooth functions, gradient-based optimization algorithms are not applicable. In proximal algorithms, the gradient is replaced by the so-called proximity operator. While closed forms of proximity operators are known for many functions of practical interest, there is no general closed form for the proximity operator of a sum of functions. Therefore, splitting algorithms handle the proximity operators of the functions $f_j$ individually. The paper provides a constructive and self-contained introduction to the class of proximal splitting algorithms. New variants of the algorithms under consideration are developed. Existing convergence results are revisited, unified, and, in some cases, improved. Reading the paper will be rewarding for anyone interested in high-dimensional nonsmooth convex optimization.
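To make the self-excitation mechanism described above concrete, the following minimal sketch (not taken from Lima's paper) simulates a Hawkes process with the classical exponential kernel, whose conditional intensity is $\lambda(t) = \mu + \sum_{t_i < t} \alpha e^{-\beta (t - t_i)}$, using Ogata's thinning method. The function name and the parameter values mu, alpha, beta, and T are illustrative assumptions.

import numpy as np

def simulate_hawkes(mu, alpha, beta, T, seed=0):
    # Hawkes process with conditional intensity
    #   lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i)),
    # simulated on [0, T] by Ogata's thinning method (illustrative sketch).
    rng = np.random.default_rng(seed)
    events = []
    t = 0.0
    while True:
        # Between arrivals the intensity only decays, so its current value
        # is a valid upper bound for the thinning (rejection) step.
        lam_bar = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        t += rng.exponential(1.0 / lam_bar)
        if t >= T:
            return events
        lam_t = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        if rng.uniform() < lam_t / lam_bar:  # accept candidate with prob lambda(t)/lam_bar
            events.append(t)

# Illustrative parameters; alpha/beta < 1 keeps the process subcritical.
print(len(simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, T=100.0)))

The ratio $\alpha/\beta$ can be read as the expected number of events directly triggered by each arrival; keeping it below one prevents the cascade of "aftershocks" from exploding.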
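To illustrate the proximity operator mentioned above, here is a minimal sketch under illustrative assumptions (it is not code from the paper under review). The proximity operator of $\gamma \|\cdot\|_1$, defined as $\operatorname{prox}_{\gamma\|\cdot\|_1}(x) = \arg\min_y \gamma\|y\|_1 + \tfrac{1}{2}\|y - x\|^2$, is componentwise soft-thresholding; plugging it into the forward-backward (proximal gradient) iteration, one of the simplest proximal splitting schemes, solves a LASSO-type problem with a smooth data-fit term plus a nonsmooth $\ell_1$ term. All variable names and parameter values are hypothetical choices for illustration.

import numpy as np

def prox_l1(x, gamma):
    # Proximity operator of gamma * ||.||_1: componentwise soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - gamma, 0.0)

# Forward-backward (proximal gradient) iteration for
#   min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
b = rng.standard_normal(20)
lam = 0.1
step = 1.0 / np.linalg.norm(A, 2) ** 2   # step size 1/L, L = Lipschitz constant of the gradient

x = np.zeros(50)
for _ in range(300):
    grad = A.T @ (A @ x - b)                    # forward (gradient) step on the smooth part
    x = prox_l1(x - step * grad, step * lam)    # backward (proximal) step on the nonsmooth part

print(np.count_nonzero(x), "nonzero entries in the sparse solution")

Here the smooth term is handled through its gradient and the nonsmooth term through its proximity operator; the more elaborate splitting schemes surveyed in the paper extend this idea to sums of several nonsmooth terms handled individually.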
Journal Introduction:
The Survey and Review section features papers that provide an integrative and current viewpoint on important topics in applied or computational mathematics and scientific computing. These papers aim to offer a comprehensive perspective on the subject matter.
The Research Spotlights section publishes concise research papers in applied and computational mathematics that are of interest to a wide range of readers of SIAM Review. The papers in this section present innovative ideas that are clearly explained and motivated. They stand out from regular publications in specific SIAM journals due to their accessibility and potential for widespread and long-lasting influence.