{"title":"Efficient fine-tuning of 37-level GraphCast with the Canadian global deterministic analysis","authors":"Christopher Subich","doi":"arxiv-2408.14587","DOIUrl":null,"url":null,"abstract":"This work describes a process for efficiently fine-tuning the GraphCast\ndata-driven forecast model to simulate another analysis system, here the Global\nDeterministic Prediction System (GDPS) of Environment and Climate Change Canada\n(ECCC). Using two years of training data (July 2019 -- December 2021) and 37\nGPU-days of computation to tune the 37-level, quarter-degree version of\nGraphCast, the resulting model significantly outperforms both the unmodified\nGraphCast and operational forecast, showing significant forecast skill in the\ntroposphere over lead times from 1 to 10 days. This fine-tuning is accomplished\nthrough abbreviating DeepMind's original training curriculum for GraphCast,\nrelying on a shorter single-step forecast stage to accomplish the bulk of the\nadaptation work and consolidating the autoregressive stages into separate 12hr,\n1d, 2d, and 3d stages with larger learning rates. Additionally, training over\n3d forecasts is split into two sub-steps to conserve host memory while\nmaintaining a strong correlation with training over the full period.","PeriodicalId":501166,"journal":{"name":"arXiv - PHYS - Atmospheric and Oceanic Physics","volume":"18 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-08-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - PHYS - Atmospheric and Oceanic Physics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2408.14587","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
This work describes a process for efficiently fine-tuning the GraphCast data-driven forecast model to simulate another analysis system, here the Global Deterministic Prediction System (GDPS) of Environment and Climate Change Canada (ECCC). With two years of training data (July 2019 to December 2021) and 37 GPU-days of computation to tune the 37-level, quarter-degree version of GraphCast, the resulting model significantly outperforms both the unmodified GraphCast and the operational forecast, with significant forecast skill in the troposphere at lead times from 1 to 10 days. This fine-tuning is accomplished by abbreviating DeepMind's original training curriculum for GraphCast: a shortened single-step forecast stage accomplishes the bulk of the adaptation work, and the autoregressive stages are consolidated into separate 12hr, 1d, 2d, and 3d stages with larger learning rates. Additionally, training over 3d forecasts is split into two sub-steps to conserve host memory while maintaining a strong correlation with training over the full period.
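The sketch below is a rough illustration of the staged curriculum described in the abstract, not the paper's code. A toy linear model stands in for GraphCast, and the stage learning rates, the 6-hour step convention, the make_batches data source, and the interpretation of the two-sub-step 3d split are all assumptions introduced here for illustration.

import jax
import jax.numpy as jnp
import optax


def model_step(params, state):
    # Toy one-step (6 h) forecast; the real model maps full gridded fields.
    return state @ params["w"] + params["b"]


def rollout_loss(params, init_state, targets):
    # Autoregressive rollout: each forecast is fed back in as the next input.
    state, loss = init_state, 0.0
    for target in targets:
        state = model_step(params, state)
        loss = loss + jnp.mean((state - target) ** 2)
    return loss / len(targets), state


# Curriculum stages as (number of 6 h model steps, learning rate).
# 2 steps = 12hr, 4 = 1d, 8 = 2d, 12 = 3d; the learning rates are placeholders.
STAGES = [(1, 1e-4), (2, 3e-5), (4, 3e-5), (8, 3e-5), (12, 3e-5)]


def fine_tune(params, make_batches):
    # make_batches(n_steps) is a hypothetical generator yielding
    # (initial state, list of n_steps target states) training samples.
    for n_steps, lr in STAGES:
        opt = optax.adam(lr)
        opt_state = opt.init(params)
        for init_state, targets in make_batches(n_steps):
            # One plausible reading of the 3d memory-saving split: run the
            # 12-step rollout as two 6-step sub-rollouts, taking a gradient
            # step on each half and restarting the second half from the
            # (detached) end state of the first.
            chunks = [targets[:6], targets[6:]] if n_steps == 12 else [targets]
            state = init_state
            for chunk in chunks:
                (loss, state), grads = jax.value_and_grad(
                    rollout_loss, has_aux=True)(params, state, chunk)
                state = jax.lax.stop_gradient(state)
                updates, opt_state = opt.update(grads, opt_state, params)
                params = optax.apply_updates(params, updates)
    return params

Under this reading, detaching the carried state between sub-rollouts keeps only half of the 3d trajectory in memory at once, at the cost of truncating gradients across the split; the abstract reports that this shortcut remains strongly correlated with training over the full period.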