Transformer Versus LSTM: A Comparison of Deep Learning Models for Karst Spring Discharge Forecasting

Anna Pölz, A. Blaschke, J. Komma, A. Farnleitner, J. Derx
{"title":"Transformer Versus LSTM: A Comparison of Deep Learning Models for Karst Spring Discharge Forecasting","authors":"Anna Pölz, A. Blaschke, J. Komma, A. Farnleitner, J. Derx","doi":"10.1029/2022wr032602","DOIUrl":null,"url":null,"abstract":"Karst springs are essential drinking water resources, however, modeling them poses challenges due to complex subsurface flow processes. Deep learning models can capture complex relationships due to their ability to learn non‐linear patterns. This study evaluates the performance of the Transformer in forecasting spring discharges for up to 4 days. We compare it to the Long Short‐Term Memory (LSTM) Neural Network and a common baseline model on a well‐studied Austrian karst spring (LKAS2) with an extensive hourly database. We evaluated the models for two further karst springs with diverse discharge characteristics for comparing the performances based on four metrics. In the discharge‐based scenario, the Transformer performed significantly better than the LSTM for the spring with the longest response times (9% mean difference across metrics), while it performed poorer for the spring with the shortest response time (4% difference). Moreover, the Transformer better predicted the shape of the discharge during snowmelt. Both models performed well across all lead times and springs with 0.64–0.92 for the Nash–Sutcliffe efficiency and 10.8%–28.7% for the symmetric mean absolute percentage error for the LKAS2 spring. The temporal information, rainfall and electrical conductivity were the controlling input variables for the non‐discharge based scenario. The uncertainty analysis revealed that the prediction intervals are smallest in winter and autumn and highest during snowmelt. Our results thus suggest that the Transformer is a promising model to support the drinking water abstraction management, and can have advantages due to its attention mechanism particularly for longer response times.","PeriodicalId":507642,"journal":{"name":"Water Resources Research","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Water Resources Research","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1029/2022wr032602","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Karst springs are essential drinking water resources; however, modeling them poses challenges due to complex subsurface flow processes. Deep learning models can capture complex relationships owing to their ability to learn non-linear patterns. This study evaluates the performance of the Transformer in forecasting spring discharge for lead times of up to 4 days. We compare it to the Long Short-Term Memory (LSTM) neural network and a common baseline model on a well-studied Austrian karst spring (LKAS2) with an extensive hourly database. We also evaluated the models on two further karst springs with diverse discharge characteristics, comparing performance based on four metrics. In the discharge-based scenario, the Transformer performed significantly better than the LSTM for the spring with the longest response times (9% mean difference across metrics), while it performed worse for the spring with the shortest response time (4% difference). Moreover, the Transformer better predicted the shape of the discharge during snowmelt. Both models performed well across all lead times and springs, with Nash–Sutcliffe efficiencies of 0.64–0.92 and symmetric mean absolute percentage errors of 10.8%–28.7% for the LKAS2 spring. Temporal information, rainfall, and electrical conductivity were the controlling input variables in the non-discharge-based scenario. The uncertainty analysis revealed that the prediction intervals are smallest in winter and autumn and largest during snowmelt. Our results thus suggest that the Transformer is a promising model to support drinking water abstraction management, and that its attention mechanism can be advantageous particularly for springs with longer response times.
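The abstract reports model skill using the Nash–Sutcliffe efficiency (NSE) and the symmetric mean absolute percentage error (sMAPE). For reference, below is a minimal NumPy sketch of both metrics; it assumes one common sMAPE variant (the paper may use a different normalization), and the hourly discharge values in the example are made up for illustration.

```python
import numpy as np

def nash_sutcliffe_efficiency(observed, predicted):
    """NSE = 1 - sum((obs - pred)^2) / sum((obs - mean(obs))^2).
    1.0 is a perfect fit; 0.0 means no better than the observed mean."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 1.0 - np.sum((observed - predicted) ** 2) / np.sum(
        (observed - observed.mean()) ** 2
    )

def smape(observed, predicted):
    """Symmetric mean absolute percentage error in percent (0% is perfect).
    One common variant: mean of 2|pred - obs| / (|obs| + |pred|)."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 100.0 * np.mean(
        2.0 * np.abs(predicted - observed) / (np.abs(observed) + np.abs(predicted))
    )

# Synthetic hourly spring discharge (m^3/s), for illustration only:
obs = np.array([12.1, 12.4, 13.0, 14.2, 13.8, 13.1])
pred = np.array([12.0, 12.6, 13.3, 13.9, 13.7, 13.2])
print(f"NSE:   {nash_sutcliffe_efficiency(obs, pred):.3f}")
print(f"sMAPE: {smape(obs, pred):.1f}%")
```

Read against these definitions, the reported NSE range of 0.64–0.92 spans moderately good to very good fits, while the sMAPE range of 10.8%–28.7% measures the relative size of the forecast errors, with 0% indicating a perfect fit.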
Latest articles in Water Resources Research

Anomalous Pressure Diffusion and Deformation in Two- and Three-Dimensional Heterogeneous Fractured Media
Transformer Versus LSTM: A Comparison of Deep Learning Models for Karst Spring Discharge Forecasting
How to Choose Suitable Physics-Based Models Without Tuning and System Identification for Model-Predictive Control of Open Water Channels?
Topography-Based Particle Image Velocimetry of Braided Channel Initiation
Automated Input Variable Selection for Analog Methods Using Genetic Algorithms