Efficient fine-tuning of 37-level GraphCast with the Canadian global deterministic analysis

Christopher Subich
DOI: arxiv-2408.14587 (https://doi.org/arxiv-2408.14587)
Journal: arXiv - PHYS - Atmospheric and Oceanic Physics
Publication date: 2024-08-26
Publication type: Journal Article
Citations: 0

Abstract

This work describes a process for efficiently fine-tuning the GraphCast data-driven forecast model to simulate another analysis system, here the Global Deterministic Prediction System (GDPS) of Environment and Climate Change Canada (ECCC). Using two years of training data (July 2019 -- December 2021) and 37 GPU-days of computation to tune the 37-level, quarter-degree version of GraphCast, the resulting model significantly outperforms both the unmodified GraphCast and operational forecast, showing significant forecast skill in the troposphere over lead times from 1 to 10 days. This fine-tuning is accomplished through abbreviating DeepMind's original training curriculum for GraphCast, relying on a shorter single-step forecast stage to accomplish the bulk of the adaptation work and consolidating the autoregressive stages into separate 12hr, 1d, 2d, and 3d stages with larger learning rates. Additionally, training over 3d forecasts is split into two sub-steps to conserve host memory while maintaining a strong correlation with training over the full period.
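The abstract's two main devices, an abbreviated stage curriculum and a 3 d rollout split into two sub-steps to conserve host memory, can be sketched in plain Python/NumPy. Everything below (the stage table, learning rates, step function, and function names) is an illustrative assumption rather than code or values from the paper; the only model detail assumed is GraphCast's native 6-hour timestep.

```python
import numpy as np

# Illustrative curriculum in the spirit of the abstract: a long single-step
# stage does the bulk of the adaptation, then short autoregressive stages
# (12 hr, 1 d, 2 d, 3 d). Rollout lengths assume GraphCast's 6-hour timestep;
# the learning rates are placeholders, not the paper's values.
CURRICULUM = [
    # (stage name, rollout length in 6-hr steps, learning rate)
    ("single-step", 1, 1e-5),
    ("12hr", 2, 3e-6),
    ("1d", 4, 3e-6),
    ("2d", 8, 3e-6),
    ("3d", 12, 3e-6),
]

def rollout(step_fn, x0, n_steps):
    """Apply the one-step model autoregressively, keeping every state."""
    states, x = [], x0
    for _ in range(n_steps):
        x = step_fn(x)
        states.append(x)
    return states

def split_rollout(step_fn, x0, n_steps, n_sub=2):
    """Same forward trajectory as rollout(), but restarted from a copied
    midpoint state. In a real autodiff framework the copy would be a
    stop-gradient/detach, so each sub-step's backward pass spans only
    n_steps / n_sub model applications, reducing peak memory while
    leaving the forward forecast unchanged."""
    states, x = [], x0
    per_sub = n_steps // n_sub
    for _ in range(n_sub):
        x = np.copy(x)  # stands in for detaching from the autodiff graph
        for _ in range(per_sub):
            x = step_fn(x)
            states.append(x)
    return states
```

Because the copy changes no values, the split rollout produces the same forecast trajectory as the full rollout; only gradient flow across the midpoint is severed, consistent with the strong correlation with full-period training that the abstract reports.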