Pub Date: 2024-06-04 | DOI: 10.1007/s10712-024-09844-w
T. F. Stocker, R. G. Jones, M. I. Hegglin, T. M. Lenton, G. C. Hegerl, S. I. Seneviratne, N. van der Wel, R. A. Wood
Perceptions of climate tipping points, abrupt changes and surprises diverge between the scientific community and the public. While such dynamics have been observed in the past, e.g., frequent reductions of the Atlantic meridional overturning circulation during the last ice age, or ice sheet collapses, tipping points might also be a possibility in an anthropogenically perturbed climate. In this context, high-impact, low-likelihood events, both in the physical realm and in ecosystems, are potentially dangerous. Here we argue that a formalized assessment of the state of science is needed to establish a consensus on this issue and to reconcile diverging views. This has been the approach taken by the Intergovernmental Panel on Climate Change (IPCC). Since 1990, the IPCC has consistently generated robust consensus on several complex issues, ranging from the detection and attribution of climate change, the global carbon budget and climate sensitivity, to the projection of extreme events and their impacts. Here, we suggest that a scientific assessment of tipping points, conducted collaboratively by the IPCC and the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services, would be an ambitious yet necessary goal to accomplish within the next decade.
{"title":"Reflecting on the Science of Climate Tipping Points to Inform and Assist Policy Making and Address the Risks they Pose to Society","authors":"T. F. Stocker, R. G. Jones, M. I. Hegglin, T. M. Lenton, G. C. Hegerl, S. I. Seneviratne, N. van der Wel, R. A. Wood","doi":"10.1007/s10712-024-09844-w","DOIUrl":"https://doi.org/10.1007/s10712-024-09844-w","url":null,"abstract":"<p>There is a diverging perception of climate tipping points, abrupt changes and surprises in the scientific community and the public. While such dynamics have been observed in the past, e.g., frequent reductions of the Atlantic meridional overturning circulation during the last ice age, or ice sheet collapses, tipping points might also be a possibility in an anthropogenically perturbed climate. In this context, high impact—low likelihood events, both in the physical realm as well as in ecosystems, will be potentially dangerous. Here we argue that a formalized assessment of the state of science is needed in order to establish a consensus on this issue and to reconcile diverging views. This has been the approach taken by the Intergovernmental Panel on Climate Change (IPCC). Since 1990, the IPCC has consistently generated robust consensus on several complex issues, ranging from the detection and attribution of climate change, the global carbon budget and climate sensitivity, to the projection of extreme events and their impact. Here, we suggest that a scientific assessment on tipping points, conducted collaboratively by the IPCC and the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services, would represent an ambitious yet necessary goal to be accomplished within the next decade.</p>","PeriodicalId":49458,"journal":{"name":"Surveys in Geophysics","volume":"72 1","pages":""},"PeriodicalIF":4.6,"publicationDate":"2024-06-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141246343","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-05-31 | DOI: 10.1007/s10712-024-09831-1
Graeme L. Stephens, Kathleen A. Shiro, Maria Z. Hakuba, Hanii Takahashi, Juliet A. Pilewskie, Timothy Andrews, Claudia J. Stubenrauch, Longtao Wu
This paper is concerned with how the diabatically forced overturning circulations of the atmosphere, established by deep convection within the tropical trough zone (TTZ) first introduced by Riehl and (Malkus) Simpson (Contr Atmos Phys 52:287–305, 1979), fundamentally shape the distributions of tropical and subtropical cloudiness and how that cloudiness changes as Earth warms. The study first draws on an analysis of a range of observations to understand the connections between the energetics of the TTZ, convection and clouds. These observations reveal a tight coupling of the two main components of diabatic heating: the cloud component of radiative heating, shaped mostly by high clouds formed by deep convection, and the latent heating associated with precipitation. Interannual variability of the TTZ reveals a marked co-variation connecting the depth of the tropical troposphere, the depth of convection, the thickness of high clouds and the TOA radiative imbalance. The study then examines connections between this convective zone and cloud changes further afield in the context of CMIP6 model experiments of climate warming. The warming realized in the CMIP6 SSP5-8.5 scenario multi-model experiments, for example, produces an enhanced Hadley circulation with increased heating in the zone of tropical deep convection and increased radiative cooling and subsidence in the subtropical regions. This affects low-cloud changes and, in turn, the model warming response through low-cloud feedbacks. The pattern of warming produced by models, which is also influenced by convection in the tropical region, has a profound influence on the projected global warming.
{"title":"Tropical Deep Convection, Cloud Feedbacks and Climate Sensitivity","authors":"Graeme L. Stephens, Kathleen A. Shiro, Maria Z. Hakuba, Hanii Takahashi, Juliet A. Pilewskie, Timothy Andrews, Claudia J. Stubenrauch, Longtao Wu","doi":"10.1007/s10712-024-09831-1","DOIUrl":"10.1007/s10712-024-09831-1","url":null,"abstract":"<div><p>This paper is concerned with how the diabatically-forced overturning circulations of the atmosphere, established by the deep convection within the tropical trough zone (TTZ), first introduced by Riehl and (Malkus) Simpson, in Contr Atmos Phys 52:287–305 (1979), fundamentally shape the distributions of tropical and subtropical cloudiness and the changes to cloudiness as Earth warms. The study first draws on an analysis of a range of observations to understand the connections between the energetics of the TTZ, convection and clouds. These observations reveal a tight coupling of the two main components of the diabatic heating, the cloud component of radiative heating, shaped mostly by high clouds formed by deep convection, and the latent heating associated with the precipitation. Interannual variability of the TTZ reveals a marked variation that connects the depth of the tropical troposphere, the depth of convection, the thickness of high clouds and the TOA radiative imbalance. The study examines connections between this convective zone and cloud changes further afield in the context of CMIP6 model experiments of climate warming. The warming realized in the CMIP6 SSP5-8.5 scenario multi-model experiments, for example, produces an enhanced Hadley circulation with increased heating in the zone of tropical deep convection and increased radiative cooling and subsidence in the subtropical regions. This impacts low cloud changes and in turn the model warming response through low cloud feedbacks. The pattern of warming produced by models, also influenced by convection in the tropical region, has a profound influence on the projected global warming.</p></div>","PeriodicalId":49458,"journal":{"name":"Surveys in Geophysics","volume":"45 6","pages":"1903 - 1931"},"PeriodicalIF":4.9,"publicationDate":"2024-05-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://link.springer.com/content/pdf/10.1007/s10712-024-09831-1.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141182373","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Rayleigh wave exploration is a powerful method for estimating near-surface shear-wave (S-wave) velocities, providing valuable insights into the stiffness properties of subsurface materials inside the Earth. Dispersion curve inversion of Rayleigh waves is the optimization process of searching for the optimal earth model parameters that explain the measured dispersion curves. A diverse range of inversion algorithms has been introduced into Rayleigh wave inversion, yet limited studies have been conducted to uncover the variations in inversion performance among commonly used algorithms. To obtain a comprehensive understanding of the optimization performance of these inversion algorithms, we systematically investigate and quantitatively assess the inversion performance of two bionic algorithms, two probabilistic algorithms, a gradient-based algorithm, and two neural network algorithms. The evaluation indices include the computational cost, accuracy, stability, generalization ability, resistance to noise, and field data processing capability. It is found that the bound-constrained limited-memory Broyden–Fletcher–Goldfarb–Shanno (L-BFGS-B) algorithm and the broad learning (BL) network have the lowest computational cost among the candidate algorithms. Furthermore, the transitional Markov Chain Monte Carlo algorithm, deep learning (DL) network, and BL network outperform the other four algorithms regarding accuracy, stability, resistance to noise, and capability to process field data. The DL and BL networks demonstrate the highest level of generalization compared to the other algorithms. The comparison results reveal the variations among candidate algorithms for the inversion task, providing a clear understanding of their inversion performance. This study can promote S-wave velocity estimation by Rayleigh wave inversion.
{"title":"Near-Surface Rayleigh Wave Dispersion Curve Inversion Algorithms: A Comprehensive Comparison","authors":"Xiao-Hui Yang, Yuanyuan Zhou, Peng Han, Xuping Feng, Xiaofei Chen","doi":"10.1007/s10712-024-09826-y","DOIUrl":"10.1007/s10712-024-09826-y","url":null,"abstract":"<div><p>Rayleigh wave exploration is a powerful method for estimating near-surface shear-wave (S-wave) velocities, providing valuable insights into the stiffness properties of subsurface materials inside the Earth. The dispersion curve inversion of Rayleigh wave corresponds to the optimization process of searching for the optimal solutions of earth model parameters based on the measured dispersion curves. At present, diversified inversion algorithms have been introduced into the process of Rayleigh wave inversion. However, limited studies have been conducted to uncover the variations in inversion performance among commonly used inversion algorithms. To obtain a comprehensive understanding of the optimization performance of these inversion algorithms, we systematically investigate and quantitatively assess the inversion performance of two bionic algorithms, two probabilistic algorithms, a gradient-based algorithm, and two neural network algorithms. The evaluation indices include the computational cost, accuracy, stability, generalization ability, noise effects, and field data processing capability. It is found that the Bound-constrained limited-memory Broyden–Fletcher–Goldfarb–Shanno (L-BFGS-B) algorithm and the broad learning (BL) network have the lowest computational cost among candidate algorithms. Furthermore, the transitional Markov Chain Monte Carlo algorithm, deep learning (DL) network, and BL network outperform the other four algorithms regarding accuracy, stability, resistance to noise effects, and capability to process field data. The DL and BL networks demonstrate the highest level of generalization compared to the other algorithms. The comparison results reveal the variations in candidate algorithms for the inversion task, causing a clear understanding of the inversion performance of candidate algorithms. This study can promote the S-wave velocity estimation by Rayleigh wave inversion.</p></div>","PeriodicalId":49458,"journal":{"name":"Surveys in Geophysics","volume":"45 3","pages":"773 - 818"},"PeriodicalIF":4.9,"publicationDate":"2024-05-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://link.springer.com/content/pdf/10.1007/s10712-024-09826-y.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141074295","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-05-21 | DOI: 10.1007/s10712-024-09834-y
Jan Horák, Richard Hewitt, Julien Thiesson, Roman Křivánek, Alžběta Danielisová, Martin Janovský
Integration of different kinds of data is an important issue in archaeological prospection. However, the current methodological approaches are underdeveloped and rarely use the data to their maximum potential. Common approaches to integration in the geophysical sciences are mostly just various forms of comparison. We argue that true integration should involve the mathematical manipulation of input data such that the original values of the input data are changed, or that new variables are produced. To address this important research gap, we present an innovative approach to the analysis of geochemical and geophysical datasets in prospection-focused disciplines. Our approach, which we refer to as "multiscalar integration" to differentiate it from simpler methods, involves the application of mathematical methods and tools to process the data in a unified way. To demonstrate our approach, we focus on integrating geophysical data (magnetometry) with geochemical data (elemental content). Our approach comprises three main stages: quantification of the deviation of the data from random distributions, linear modelling of geophysical and geochemical data, and integration based on weighting of the different elements derived in the previous steps. All the steps of the workflow can also be applied separately and independently as needed or preferred. Our approach is implemented in the R environment for statistical computing. All data, functions and scripts used in the work are available from open access repositories (Zenodo.org and Github.com) so that others can test, modify and apply our proposed methods to new cases and problems. Our approach has the following advantages: (1) It allows the rapid exploration of multiple data sources in a unified way; (2) it can increase the utility of geochemical data across diverse prospection disciplines; (3) it facilitates the identification of links between geochemical and geophysical data (or, generally, between point-based and raster data); (4) it innovatively integrates various datasets by weighting the information provided by each; (5) it is simple to apply following a step-by-step framework; (6) the code and workflow are fully open to allow for customization, improvements and additions.
{"title":"Multiscalar Integration of Dense and Sparse Spatial Data: an Archaeological Case Study with Magnetometry and Geochemistry","authors":"Jan Horák, Richard Hewitt, Julien Thiesson, Roman Křivánek, Alžběta Danielisová, Martin Janovský","doi":"10.1007/s10712-024-09834-y","DOIUrl":"10.1007/s10712-024-09834-y","url":null,"abstract":"<div><p>Integration of different kinds of data is an important issue in archaeological prospection. However, the current methodological approaches are underdeveloped and rarely use the data to their maximum potential. Common approaches to integration in the geophysical sciences are mostly just various forms of comparison. We argue that true integration should involve the mathematical manipulation of input data such that the original values of the input data are changed, or that new variables are produced. To address this important research gap, we present an innovative approach to the analysis of geochemical and geophysical datasets in prospection-focused disciplines. Our approach, which we refer to as “multiscalar integration” to differentiate it from simpler methods, involves the application of mathematical methods and tools to process the data in a unified way. To demonstrate our approach, we focus on integrating geophysical data (magnetometry) with geochemical data (elemental content). Our approach comprises three main stages: Quantification of the data deviation from random distributions, linear modelling of geophysical and geochemical data and integration based on weighting of the different elements derived in previous steps. All the steps of the workflow can be also applied separately and independently as needed or preferred. Our approach is implemented in the <i>R</i> environment for statistical computing. All data, functions and scripts used in the work are available from open access repositories (Zenodo.org and Github.com) so that others can test, modify and apply our proposed methods to new cases and problems. Our approach has the following advantages: (1) It allows the rapid exploration of multiple data sources in an unified way; (2) it can increase the utility of geochemical data across diverse prospection disciplines; (3) it facilitates the identification of links between geochemical and geophysical data (or generally, between point-based and raster data); (4) it innovatively integrates various datasets by weighting the information provided by each; (5) it is simple to apply following a step-by-step framework; (6) the code and workflow is fully open to allow for customization, improvements and additions.</p></div>","PeriodicalId":49458,"journal":{"name":"Surveys in Geophysics","volume":"45 4","pages":"1011 - 1045"},"PeriodicalIF":4.9,"publicationDate":"2024-05-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141074112","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-05-20 | DOI: 10.1007/s10712-024-09837-9
Shijun Cheng, Randy Harsuko, Tariq Alkhalifah
Machine learning-based seismic processing models are typically trained separately to perform seismic processing tasks (SPTs) and, as a result, require large amounts of high-quality training data. However, preparing training data sets is not trivial, especially for supervised learning (SL). Despite the variability in seismic data across different types and regions, some general characteristics are shared, such as their sinusoidal nature and geometric texture. To learn the shared features and thus quickly adapt to various SPTs, we develop a unified paradigm for neural network-based seismic processing, called Meta-Processing, that uses limited training data to meta-learn a common network initialization, which offers universal adaptability features. The proposed Meta-Processing framework consists of two stages: meta-training and meta-testing. In the former, each SPT is treated as a separate task and the training dataset is divided into support and query sets. Unlike conventional SL methods, the neural network (NN) parameters are here updated by a bilevel gradient descent from the support set to the query set, iterating through all tasks. In the meta-testing stage, we also utilize limited data to fine-tune the optimized NN parameters in an SL fashion to conduct various SPTs, such as denoising, interpolation, ground-roll attenuation, image enhancement, and velocity estimation, aiming to converge quickly to ideal performance. Extensive numerical experiments are conducted to assess the effectiveness of Meta-Processing on both synthetic and real-world data. The findings reveal that our approach leads to a substantial improvement in the convergence speed and predictive performance of the NN.
{"title":"Meta-Processing: A robust framework for multi-tasks seismic processing","authors":"Shijun Cheng, Randy Harsuko, Tariq Alkhalifah","doi":"10.1007/s10712-024-09837-9","DOIUrl":"10.1007/s10712-024-09837-9","url":null,"abstract":"<div><p>Machine learning-based seismic processing models are typically trained separately to perform seismic processing tasks (SPTs) and, as a result, require plenty of high-quality training data. However, preparing training data sets is not trivial, especially for supervised learning (SL). Despite the variability in seismic data across different types and regions, some general characteristics are shared, such as their sinusoidal nature and geometric texture. To learn the shared features and thus, quickly adapt to various SPTs, we develop a unified paradigm for neural network-based seismic processing, called Meta-Processing, that uses limited training data for meta learning a common network initialization, which offers universal adaptability features. The proposed Meta-Processing framework consists of two stages: meta-training and meta-testing. In the former, each SPT is treated as a separate task and the training dataset is divided into support and query sets. Unlike conventional SL methods, here, the neural network (NN) parameters are updated by a bilevel gradient descent from the support set to the query set, iterating through all tasks. In the meta-testing stage, we also utilize limited data to fine-tune the optimized NN parameters in an SL fashion to conduct various SPTs, such as denoising, interpolation, ground-roll attenuation, image enhancement, and velocity estimation, aiming to converge quickly to ideal performance. Extensive numerical experiments are conducted to assess the effectiveness of Meta-Processing on both synthetic and real-world data. The findings reveal that our approach leads to a substantial improvement in the convergence speed and predictive performance of the NN.</p></div>","PeriodicalId":49458,"journal":{"name":"Surveys in Geophysics","volume":"45 4","pages":"1081 - 1116"},"PeriodicalIF":4.9,"publicationDate":"2024-05-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141074110","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-05-10 | DOI: 10.1007/s10712-024-09839-7
Carla Braitenberg, Alberto Pastorutti
Seamount eruptions alter the bathymetry and can go undetected due to their lack of explosive character. We review documented eruptions to determine whether they could be detected by a future satellite gravity mission. We adopt the noise level in acquisitions of multi-satellite constellations as in the MOCAST+ study, with a proposed payload of a quantum technology gradiometer and clock. The review of underwater volcanoes includes the Hunga Tonga Hunga Ha’apai (HTHH) islands, for which the exposed surface changed during the volcanic unrest of 2014/2015 and 2021/2022. The Fani Maoré submarine volcanic eruption of 2018–2021 produced a new seamount 800 m high, rising from a depth of 3500 m and therefore not visible above the sea surface. We review further documented submarine eruptions and estimate the upper limit of the expected gravity changes. We find that a MOCAST+ type mission should allow us to detect the subsurface mass changes generated by deep ocean submarine volcanic activity for volume changes of 6.5 km³ or more, with a latency of 1 year. This threshold is met by the HTHH and Fani Maoré volcanoes.
{"title":"Detectability of Seamount Eruptions Through a Quantum Technology Gravity Mission MOCAST+: Hunga Tonga, Fani Maoré and Other Smaller Eruptions","authors":"Carla Braitenberg, Alberto Pastorutti","doi":"10.1007/s10712-024-09839-7","DOIUrl":"10.1007/s10712-024-09839-7","url":null,"abstract":"<div><p>Seamount eruptions alter the bathymetry and can occur undetected due to lack of explosive character. We review documented eruptions to define whether they could be detected by a future satellite gravity mission. We adopt the noise level in acquisitions of multi-satellite constellations as in the MOCAST+ study, with a proposed payload of a quantum technology gradiometer and clock. The review of underwater volcanoes includes the Hunga Tonga Hunga Ha’apai (HTHH) islands for which the exposed surface changed during volcanic unrests of 2014/2015 and 2021/2022. The Fani Maoré submarine volcanic eruption of 2018–2021 produced a new seamount 800 m high, emerging from a depth of 3500 m, and therefore not seen above sea surface. We review further documented submarine eruptions and estimate the upper limit of the expected gravity changes. We find that a MOCAST+ type mission should allow us to detect the subsurface mass changes generated by deep ocean submarine volcanic activity for volume changes of 6.5 km<sup>3</sup> upwards, with latency of 1 year. This change is met by the HTHH and Fani Maoré volcanoes.</p></div>","PeriodicalId":49458,"journal":{"name":"Surveys in Geophysics","volume":"45 4","pages":"1331 - 1361"},"PeriodicalIF":4.9,"publicationDate":"2024-05-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://link.springer.com/content/pdf/10.1007/s10712-024-09839-7.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140903017","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-05-07 | DOI: 10.1007/s10712-024-09838-8
Norman G. Loeb, Seung-Hee Ham, Richard P. Allan, Tyler J. Thorsen, Benoit Meyssignac, Seiji Kato, Gregory C. Johnson, John M. Lyman
Satellite observations from the Clouds and the Earth’s Radiant Energy System show that Earth’s energy imbalance has doubled from 0.5 ± 0.2 W m⁻² during the first 10 years of this century to 1.0 ± 0.2 W m⁻² during the past decade. The increase is the result of a 0.9 ± 0.3 W m⁻² increase in absorbed solar radiation (ASR) that is partially offset by a 0.4 ± 0.25 W m⁻² increase in outgoing longwave radiation (OLR). Despite marked differences in ASR and OLR trends during the hiatus (2000–2010), transition-to-El Niño (2010–2016) and post-El Niño (2016–2022) periods, trends in net top-of-atmosphere flux (NET) remain within 0.1 W m⁻² per decade of one another, implying a steady acceleration of climate warming. Northern and southern hemisphere trends in NET are consistent to 0.06 ± 0.31 W m⁻² per decade owing to a compensation between weak ASR and OLR hemispheric trend differences of opposite sign. We find that large decreases in stratocumulus and middle clouds over the sub-tropics and decreases in low and middle clouds at mid-latitudes are the primary reasons for increasing ASR trends in the northern hemisphere (NH). These changes are especially large over the eastern and northern Pacific Ocean and coincide with large increases in sea-surface temperature (SST). The decrease in cloud fraction and higher SSTs over the NH sub-tropics lead to a significant increase in OLR from cloud-free regions, which partially compensates for the NH ASR increase. Decreases in middle-cloud reflection and a weaker reduction in low-cloud reflection account for the increase in ASR in the southern hemisphere, while OLR changes are weak. Changes in cloud cover in response to SST increases imply a feedback to climate change, yet a contribution from radiative forcing or internal variability cannot be ruled out.
{"title":"Observational Assessment of Changes in Earth’s Energy Imbalance Since 2000","authors":"Norman G. Loeb, Seung-Hee Ham, Richard P. Allan, Tyler J. Thorsen, Benoit Meyssignac, Seiji Kato, Gregory C. Johnson, John M. Lyman","doi":"10.1007/s10712-024-09838-8","DOIUrl":"10.1007/s10712-024-09838-8","url":null,"abstract":"<div><p>Satellite observations from the Clouds and the Earth’s Radiant Energy System show that Earth’s energy imbalance has doubled from 0.5 ± 0.2 Wm<sup>−2</sup> during the first 10 years of this century to 1.0 ± 0.2 Wm<sup>−</sup><sup>2</sup> during the past decade. The increase is the result of a 0.9 ± 0.3 Wm<sup>−2</sup> increase absorbed solar radiation (ASR) that is partially offset by a 0.4 ± 0.25 Wm<sup>−2</sup> increase in outgoing longwave radiation (OLR). Despite marked differences in ASR and OLR trends during the hiatus (2000–2010), transition-to-El Niño (2010–2016) and post-El Niño (2016–2022) periods, trends in net top-of-atmosphere flux (NET) remain within 0.1 Wm<sup>−2</sup> per decade of one another, implying a steady acceleration of climate warming. Northern and southern hemisphere trends in NET are consistent to 0.06 ± 0.31 Wm<sup>−2</sup> per decade due to a compensation between weak ASR and OLR hemispheric trend differences of opposite sign. We find that large decreases in stratocumulus and middle clouds over the sub-tropics and decreases in low and middle clouds at mid-latitudes are the primary reasons for increasing ASR trends in the northern hemisphere (NH). These changes are especially large over the eastern and northern Pacific Ocean, and coincide with large increases in sea-surface temperature (SST). The decrease in cloud fraction and higher SSTs over the NH sub-tropics lead to a significant increase in OLR from cloud-free regions, which partially compensate for the NH ASR increase. Decreases in middle cloud reflection and a weaker reduction in low-cloud reflection account for the increase in ASR in the southern hemisphere, while OLR changes are weak. Changes in cloud cover in response to SST increases imply a feedback to climate change yet a contribution from radiative forcing or internal variability cannot be ruled out.</p></div>","PeriodicalId":49458,"journal":{"name":"Surveys in Geophysics","volume":"45 6","pages":"1757 - 1783"},"PeriodicalIF":4.9,"publicationDate":"2024-05-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://link.springer.com/content/pdf/10.1007/s10712-024-09838-8.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140845199","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-05-03 | DOI: 10.1007/s10712-023-09818-4
Michal Šprlák, Martin Pitoňák
Integral transformations represent an important mathematical tool for gravitational field modelling. A basic assumption of integral transformations is global data coverage, but the availability of high-resolution and accurate gravitational data may be restricted. Therefore, we decompose the global integration into two parts: (1) the effect of the near zone, calculated by numerical integration of data within a spherical cap, and (2) the effect of the far zone, due to data beyond the spherical cap, synthesised by harmonic expansions. Theoretical and numerical aspects of this decomposition have frequently been studied for isotropic integral transformations on the sphere, such as Hotine’s, Poisson’s, and Stokes’s integral formulas. In this article, we systematically review the mathematical theory of the far-zone effects for the spherical integral formulas, which transform the disturbing gravitational potential or its purely radial derivatives into observable quantities of the gravitational field, i.e. the disturbing gravitational potential and its radial, horizontal, or mixed derivatives of the first, second, or third order. These formulas are implemented in a MATLAB software and validated in a closed-loop simulation. Selected properties of the harmonic expansions are investigated by examining the behaviour of the truncation error coefficients. The mathematical formulations presented here are indispensable for practical solutions of direct or inverse problems in accurate gravitational field modelling or when studying statistical properties of integral transformations.
{"title":"Far-Zone Effects for Spherical Integral Transformations I: Formulas for the Radial Boundary Value Problem and its Derivatives","authors":"Michal Šprlák, Martin Pitoňák","doi":"10.1007/s10712-023-09818-4","DOIUrl":"10.1007/s10712-023-09818-4","url":null,"abstract":"<div><p>Integral transformations represent an important mathematical tool for gravitational field modelling. A basic assumption of integral transformations is the global data coverage, but availability of high-resolution and accurate gravitational data may be restricted. Therefore, we decompose the global integration into two parts: (1) the effect of the near zone calculated by the numerical integration of data within a spherical cap and (2) the effect of the far zone due to data beyond the spherical cap synthesised by harmonic expansions. Theoretical and numerical aspects of this decomposition have frequently been studied for isotropic integral transformations on the sphere, such as Hotine’s, Poisson’s, and Stokes’s integral formulas. In this article, we systematically review the mathematical theory of the far-zone effects for the spherical integral formulas, which transform the disturbing gravitational potential or its purely radial derivatives into observable quantities of the gravitational field, i.e. the disturbing gravitational potential and its radial, horizontal, or mixed derivatives of the first, second, or third order. These formulas are implemented in a MATLAB software and validated in a closed-loop simulation. Selected properties of the harmonic expansions are investigated by examining the behaviour of the truncation error coefficients. The mathematical formulations presented here are indispensable for practical solutions of direct or inverse problems in an accurate gravitational field modelling or when studying statistical properties of integral transformations.</p></div>","PeriodicalId":49458,"journal":{"name":"Surveys in Geophysics","volume":"45 3","pages":"977 - 1009"},"PeriodicalIF":4.9,"publicationDate":"2024-05-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://link.springer.com/content/pdf/10.1007/s10712-023-09818-4.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140845016","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-05-03 | DOI: 10.1007/s10712-024-09833-z
Sebastian Bathiany, Robbin Bastiaansen, Ana Bastos, Lana Blaschke, Jelle Lever, Sina Loriani, Wanda De Keersmaecker, Wouter Dorigo, Milutin Milenković, Cornelius Senf, Taylor Smith, Jan Verbesselt, Niklas Boers
As the Earth system is exposed to large anthropogenic interferences, it becomes ever more important to assess the resilience of natural systems, i.e., their ability to recover from natural and human-induced perturbations. Several, often related, measures of resilience have been proposed and applied to modeled and observed data, often by different scientific communities. Focusing on terrestrial ecosystems as a key component of the Earth system, we review methods that can detect large perturbations (temporary excursions from a reference state as well as abrupt shifts to a new reference state) in spatio-temporal datasets, estimate the recovery rate after such perturbations, or assess resilience changes indirectly from stationary time series via indicators of critical slowing down. We present a sequence of ideal methodological steps in the field of resilience science, and argue how to obtain a consistent and multi-faceted view of ecosystem or climate resilience from Earth observation (EO) data. While EO data offer unique potential to study ecosystem resilience globally at high spatial and temporal scales, we emphasize some important limitations, which are associated with the theoretical assumptions behind diagnostic methods and with the measurement process and pre-processing steps of EO data. The latter class of limitations includes gaps in time series, the disparity of scales, and issues arising from aggregating time series from multiple sensors. Based on this assessment, we formulate specific recommendations to the EO community in order to improve the observational basis for ecosystem resilience research.
{"title":"Ecosystem Resilience Monitoring and Early Warning Using Earth Observation Data: Challenges and Outlook","authors":"Sebastian Bathiany, Robbin Bastiaansen, Ana Bastos, Lana Blaschke, Jelle Lever, Sina Loriani, Wanda De Keersmaecker, Wouter Dorigo, Milutin Milenković, Cornelius Senf, Taylor Smith, Jan Verbesselt, Niklas Boers","doi":"10.1007/s10712-024-09833-z","DOIUrl":"https://doi.org/10.1007/s10712-024-09833-z","url":null,"abstract":"<p>As the Earth system is exposed to large anthropogenic interferences, it becomes ever more important to assess the resilience of natural systems, i.e., their ability to recover from natural and human-induced perturbations. Several, often related, measures of resilience have been proposed and applied to modeled and observed data, often by different scientific communities. Focusing on terrestrial ecosystems as a key component of the Earth system, we review methods that can detect large perturbations (temporary excursions from a reference state as well as abrupt shifts to a new reference state) in spatio-temporal datasets, estimate the recovery rate after such perturbations, or assess resilience changes indirectly from stationary time series via indicators of critical slowing down. We present here a sequence of ideal methodological steps in the field of resilience science, and argue how to obtain a consistent and multi-faceted view on ecosystem or climate resilience from Earth observation (EO) data. While EO data offers unique potential to study ecosystem resilience globally at high spatial and temporal scale, we emphasize some important limitations, which are associated with the theoretical assumptions behind diagnostic methods and with the measurement process and pre-processing steps of EO data. The latter class of limitations include gaps in time series, the disparity of scales, and issues arising from aggregating time series from multiple sensors. Based on this assessment, we formulate specific recommendations to the EO community in order to improve the observational basis for ecosystem resilience research.</p>","PeriodicalId":49458,"journal":{"name":"Surveys in Geophysics","volume":"107 1","pages":""},"PeriodicalIF":4.6,"publicationDate":"2024-05-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140845207","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-04-29 | DOI: 10.1007/s10712-024-09840-0
Feng Cheng
This paper delivers an in-depth bibliometric analysis of distributed acoustic sensing (DAS) research within the realm of geophysics, covering the period from 2012 to 2023 and drawing on data from the Web of Science. By employing bibliographic and structured network analysis methods, including the use of Bibliometrix and VOSviewer®, the study highlights the most influential scholars, leading institutions, and pivotal research contributions that have significantly shaped the field of DAS in geophysics. The research unravels key collaborative dynamics through co-authorship network analysis and traces thematic developments and trajectories via comprehensive co-citation and keyword co-occurrence network analyses. These analyses elucidate the most robust and prominent areas within DAS research. A critical insight gained from this study is the rise of ‘photonic seismology’ as an emerging interdisciplinary domain, exemplifying the fusion of photonic sensing techniques with seismic science. This paper also discusses certain limitations inherent in the study and concludes with implications for future research.
{"title":"Photonic Seismology: A New Decade of Distributed Acoustic Sensing in Geophysics from 2012 to 2023","authors":"Feng Cheng","doi":"10.1007/s10712-024-09840-0","DOIUrl":"10.1007/s10712-024-09840-0","url":null,"abstract":"<div><p>This paper delivers an in-depth bibliometric analysis of distributed acoustic sensing (DAS) research within the realm of geophysics, covering the period from 2012 to 2023 and drawing on data from the Web of Science. By employing bibliographic and structured network analysis methods, including the use of Bibliometrix and VOSviewer<sup>®</sup>, the study highlights the most influential scholars, leading institutions, and pivotal research contributions that have significantly shaped the field of DAS in geophysics. The research delves into key collaborative dynamics, unraveling them through co-authorship network analysis, and delves into thematic developments and trajectories via comprehensive co-citation and keyword co-occurrence network analyses. These analyses elucidate the most robust and prominent areas within DAS research. A critical insight gained from this study is the rise of ‘photonic seismology’ as an emerging interdisciplinary domain, exemplifying the fusion of photonic sensing techniques with seismic science. This paper also discusses certain limitations inherent in the study and concludes with implications for future research.</p></div>","PeriodicalId":49458,"journal":{"name":"Surveys in Geophysics","volume":"45 4","pages":"1205 - 1243"},"PeriodicalIF":4.9,"publicationDate":"2024-04-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140814771","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}