Big Data-driven MLOps workflow for annual high-resolution land cover classification models
Antonio M. Burgueño-Romero, Cristóbal Barba-González, José F. Aldana-Montes
Future Generation Computer Systems: The International Journal of eScience, Volume 163, Article 107499 (published 2024-08-28)
DOI: 10.1016/j.future.2024.107499
URL: https://www.sciencedirect.com/science/article/pii/S0167739X24004631
Citations: 0
Abstract
Developing an annual, global, high-resolution land cover map is one of the most ambitious tasks in remote sensing, and its importance keeps growing as validated data and satellite imagery accumulate. The success of land cover classification models largely hinges on data quality, coupled with the application of Big Data techniques and distributed computing, which are essential for efficiently processing the extensive volume of available satellite data. However, maintaining the lifecycle of several annual Machine Learning models is a complex challenge. The rise of Machine Learning Operations (MLOps) offers an opportunity to automate the maintenance of these models, which is particularly beneficial in systems that must generate new models each year while continuously integrating newly validated data. This article details the development of an end-to-end MLOps workflow that integrates land cover classification models employing Big Data strategies for processing large-scale, high-resolution spatial data. The workflow is designed within a Kubernetes environment, achieving on-demand auto-scaling, distributed computing, and load balancing. This integration demonstrates the practicality and efficiency of managing and deploying models that process satellite imagery in an automated, scalable framework, marking a significant advancement in remote sensing and MLOps.
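The abstract does not include the pipeline's code, but a minimal sketch can illustrate the kind of distributed classification step a Big Data-driven land cover workflow relies on. The example below assumes PySpark, Parquet feature tables keyed by pixel coordinates (x, y), Sentinel-2-style band columns (B2, B3, B4, B8), and a pre-trained Spark ML random forest stored at an object-store path; all of these names, paths, and model choices are hypothetical illustrations, not the authors' actual implementation.

```python
# Hypothetical sketch of a distributed land cover inference step with PySpark.
# Paths, column names, and the model type are assumptions, not the paper's pipeline.
from pyspark.sql import SparkSession
from pyspark.ml.classification import RandomForestClassificationModel
from pyspark.ml.feature import VectorAssembler

spark = (SparkSession.builder
         .appName("landcover-inference")
         .getOrCreate())

# Assumed yearly feature table: one row per pixel with spectral band values.
pixels = spark.read.parquet("s3a://landcover/features/year=2023/")

# Assemble the (assumed) Sentinel-2 band columns into the feature vector
# expected by the classifier.
assembler = VectorAssembler(
    inputCols=["B2", "B3", "B4", "B8"],
    outputCol="features",
)

# Load a previously trained model; yearly retraining is handled elsewhere
# in the MLOps workflow and is out of scope for this sketch.
model = RandomForestClassificationModel.load("s3a://landcover/models/rf-2023")

# Classify every pixel in parallel across the cluster and write the map.
predictions = model.transform(assembler.transform(pixels))
predictions.select("x", "y", "prediction").write.parquet(
    "s3a://landcover/maps/year=2023/"
)
```

In the Kubernetes setting the paper describes, a job of this kind would typically be submitted to the cluster (for example via spark-submit with the Kubernetes master URL), letting Kubernetes handle executor scheduling, load balancing, and on-demand scaling.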
Journal description
Computing infrastructures and systems are constantly evolving, resulting in increasingly complex and collaborative scientific applications. To cope with these advancements, there is a growing need for collaborative tools that can effectively map, control, and execute these applications.
Furthermore, with the explosion of Big Data, there is a requirement for innovative methods and infrastructures to collect, analyze, and derive meaningful insights from the vast amount of data generated. This necessitates the integration of computational and storage capabilities, databases, sensors, and human collaboration.
Future Generation Computer Systems aims to pioneer advancements in distributed systems, collaborative environments, high-performance computing, and Big Data analytics. It strives to stay at the forefront of developments in grids, clouds, and the Internet of Things (IoT) to effectively address the challenges posed by these wide-area, fully distributed sensing and computing systems.