Climate change and population growth intensify the demand for precise agriculture mapping to enhance food security. Such mapping tasks require robust modeling of multi-scale spatiotemporal patterns, from fine field textures to landscape context, and from short-term phenology to full growing-season dynamics. Existing methods often process spatial and temporal features separately, limiting their ability to capture essential agricultural dynamics. While transformer-based remote sensing foundation models (RSFMs) offer unified spatiotemporal modeling capability, most remain suboptimal: they either use fixed windows that ignore multi-scale crop characteristics or neglect temporal information entirely. To address these gaps, we propose AgriFM, a multi-source, multi-temporal foundation model for agriculture mapping. AgriFM introduces a synchronized spatiotemporal downsampling strategy within a Video Swin Transformer backbone, enabling efficient handling of long and variable-length satellite time series while preserving multi-scale spatial and phenological information. It is pre-trained on a globally representative dataset comprising over 25 million samples from MODIS, Landsat-8/9, and Sentinel-2, with land cover fractions as pre-training supervision. AgriFM further integrates a versatile decoder specifically designed to dynamically fuse multi-source features from different stages of the backbone and to accommodate varying temporal lengths, thereby supporting consistent and scalable agriculture mapping across diverse satellite sources and task requirements. It supports diverse tasks, including agricultural land mapping, field boundary delineation, agricultural land use/land cover mapping, and specific crop mapping (e.g., winter wheat and paddy rice), with different data sources. Comprehensive evaluations show that AgriFM consistently outperforms existing deep learning models and general-purpose RSFMs across multiple agriculture mapping tasks.
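To illustrate the idea of synchronized spatiotemporal downsampling, the sketch below shrinks the temporal and spatial axes of a satellite time series together at every stage, merging small spatiotemporal patches into single tokens. This is a minimal NumPy illustration of the general technique, not the paper's implementation: the merge factors (2×2×2), the number of stages, and the absence of a learned projection after patch merging are all assumptions made for brevity.

```python
import numpy as np

def sync_downsample(x, ft=2, fs=2):
    """Merge ft x fs x fs spatiotemporal patches into single tokens.

    x: array of shape (T, H, W, C); T and H/W must be divisible by ft and fs.
    Patch features are concatenated along the channel axis. In a real
    Video Swin-style backbone a learned linear projection would follow
    the concatenation; it is omitted here (hypothetical simplification).
    """
    T, H, W, C = x.shape
    # split each axis into (blocks, within-block) pairs
    x = x.reshape(T // ft, ft, H // fs, fs, W // fs, fs, C)
    # move the within-block axes next to the channels and flatten them
    x = x.transpose(0, 2, 4, 1, 3, 5, 6)
    return x.reshape(T // ft, H // fs, W // fs, ft * fs * fs * C)

# toy series: 8 time steps of 64x64 imagery with 4 channels
x = np.random.rand(8, 64, 64, 4)
for stage in range(3):
    x = sync_downsample(x)
    print(f"stage {stage}: {x.shape}")
```

Because time and space are reduced in lockstep, deeper stages see coarser spatial context and longer phenological windows at once, which is the property the abstract attributes to the synchronized strategy.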
Code and models are available at https://github.com/flyakon/AgriFM and https://glass.hku.hk
