Snow cover is among the most important features of the Earth's surface and a crucial element of the cryosphere, affecting the global energy balance and the water and carbon cycles. Accurate monitoring of this land surface component is of particular significance, as snowmelt provides between 50% and 80% of the annual runoff in temperate (boreal) regions and significantly impacts the hydrological balance during the warm season. Limited reserves of soil moisture accumulated during the winter period can lead to all types of droughts, including green-water drought, which is expressed by reduced water storage in soil and vegetation. Green-water drought has a variable effect across landscape components and on the functions and ecosystem services (ES) they provide. The present study aims to track snow cover dynamics in the transitional seasons of the year, when the snow cover is most unstable, and to differentiate its territorial distribution depending on elevation and slope exposure. The study area covers the mountainous territories of Bulgaria, and the seasons from 2016 to 2022 are examined. To achieve the aim of the study, we used Sentinel-2 images and calculated the Snow Water Index (SWI). The SWI uses spectral characteristics of the visible, shortwave infrared (SWIR), and near-infrared (NIR) bands to distinguish snow and ice pixels from other pixels, including water bodies, which is crucial for accurate monitoring of snow cover dynamics. The obtained results were validated using very high resolution (VHR) images for pre-selected test areas.
"Assessment of elevation and slope exposure impact on snow cover distribution in the mountainous region in Bulgaria using Sentinel-2 satellite data" (Daniela Avetisyan, Andrey Stoyanov). DOI: 10.1117/12.2679770. Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV, 2023-10-20.
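The SWI computation described in the abstract above can be sketched as a band-ratio operation. The formulation below follows the published Snow Water Index (green × (NIR − SWIR) over (green + NIR)(NIR + SWIR)); the Sentinel-2 band mapping (B3 green, B8 NIR, B11 SWIR) and the 0.21 snow threshold are assumptions drawn from the SWI literature, not details given in this abstract:

```python
import numpy as np

def snow_water_index(green, nir, swir):
    """Snow Water Index from Sentinel-2 surface reflectance.

    SWI = G * (NIR - SWIR) / ((G + NIR) * (NIR + SWIR)).
    Band choice (B3/B8/B11) and the snow threshold used below are
    illustrative assumptions, not taken from the paper itself.
    """
    green = np.asarray(green, dtype=float)
    nir = np.asarray(nir, dtype=float)
    swir = np.asarray(swir, dtype=float)
    denom = (green + nir) * (nir + swir)
    return np.where(denom != 0, green * (nir - swir) / denom, 0.0)

# Toy pixels: bright snow (high green/NIR, low SWIR) vs. dark water.
swi = snow_water_index([0.80, 0.06], [0.70, 0.02], [0.10, 0.015])
snow_mask = swi > 0.21  # threshold reported in the SWI literature
```

The index suppresses water pixels because their NIR reflectance is low, which drives the numerator toward zero.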
E. Psomiadis, S. Alexandris, N. Proutsos, Ioannis Charalampopoulos
Precision agriculture has been at the cutting edge of research during the past decade, aiming to reduce water consumption and ensure sustainability in agriculture. The present study aims to estimate the actual water requirements of crop fields based on the Crop Water Stress Index (CWSI), combining multiple and multiscale data, such as infrared canopy temperature, air temperature, air relative humidity, and near-infrared and thermal infrared image data taken above the crop field using an innovative aerial micrometeorological station (AMMS) and two additional cameras, a multispectral and a thermal one, mounted on an Unmanned Aerial Vehicle (UAV), along with satellite-derived thermal data. Moreover, ground micrometeorological stations (GMMS) were installed in each crop. The study area was situated in Trifilia (Peloponnese, Greece), and the experimentation was conducted on two different crops, potato and watermelon, which are representative cultivations of the area. The analysis of the results showed, in the case of the potato field, that the amount of irrigation water supplied to the rhizosphere far exceeded the maximum crop needs, reaching about 394% more water than the maximum amount required by the crop. Finally, the correlation between the different remote and proximal sensors proved to be sufficiently high, while the correlation with the satellite data was moderate. The overall conclusion of this research is that proper irrigation water management is extremely necessary and the only solution for agricultural sustainability in the future. The increasing demand for freshwater, mainly for irrigation purposes, will inevitably lead to groundwater overexploitation and deterioration of the area's already affected and semi-brackish coastal aquifers.
"Coupling multiscale remote and proximal sensors for the estimation of crop water requirements" (E. Psomiadis, S. Alexandris, N. Proutsos, Ioannis Charalampopoulos). DOI: 10.1117/12.2680125. Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV, 2023-10-20.
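The Crop Water Stress Index used in the study above is commonly computed in its empirical (Idso-type) form from the canopy-air temperature difference and its non-water-stressed baseline. The sketch below shows only the mechanics; the baseline coefficients `a`, `b` and the upper limit `dt_upper` are illustrative placeholders, not the crop-specific values fitted in the paper:

```python
def cwsi(canopy_temp, air_temp, vpd, a=2.0, b=-1.9, dt_upper=4.0):
    """Empirical (Idso-type) Crop Water Stress Index.

    dT = Tc - Ta; the non-water-stressed baseline dT_ll = a + b * VPD
    and the non-transpiring upper limit dt_upper are crop-specific.
    The coefficients here are illustrative placeholders only.
    """
    dt = canopy_temp - air_temp
    dt_lower = a + b * vpd          # lower (well-watered) baseline
    index = (dt - dt_lower) / (dt_upper - dt_lower)
    return min(max(index, 0.0), 1.0)  # clamp to [0, 1]

# A well-watered canopy runs cooler than the air under high VPD,
# so the index sits at the unstressed end of the scale.
print(cwsi(canopy_temp=26.0, air_temp=28.0, vpd=2.0))  # 0.0
```

CWSI near 0 indicates a fully transpiring crop, near 1 a non-transpiring (severely stressed) one, which is what makes it usable as an irrigation trigger.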
Nowadays, modern technologies are increasingly accessible, cheaper, and easier to use. Among them are multisensor remotely controlled aerial platforms, or so-called "drones". This paper presents the results of a study on the possibilities of fusing and processing multi-spectral and thermal images from aerial drone sensors for intelligent agriculture support. Time series of multi-spectral images from the DJI Phantom 4 Multispectral and thermal images from the DJI Mavic 2 Enterprise Advanced over the same area are processed. A new methodological approach to the fusion of drone multispectral and thermal (LWIR) image data for intelligent agriculture support is proposed.
"Drone multispectral and thermal images data processing for intelligent agriculture support" (Miroslav Y. Tsvetkov). DOI: 10.1117/12.2679928. Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV, 2023-10-20.
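A core step in any such multispectral/thermal fusion is bringing the coarser thermal raster onto the finer multispectral grid. The abstract does not specify the resampling method, so the sketch below assumes already co-registered footprints and uses simple nearest-neighbour upsampling:

```python
import numpy as np

def upsample_to_grid(coarse, target_shape):
    """Nearest-neighbour resampling of a coarse raster (e.g. an LWIR
    thermogram) onto a finer grid (e.g. a multispectral tile).

    Assumes the two images are co-registered and cover the same
    footprint; this is a stand-in for the paper's fusion step, which
    the abstract does not detail.
    """
    rows = (np.arange(target_shape[0]) * coarse.shape[0]) // target_shape[0]
    cols = (np.arange(target_shape[1]) * coarse.shape[1]) // target_shape[1]
    return coarse[np.ix_(rows, cols)]

thermal = np.array([[300.0, 310.0],
                    [305.0, 320.0]])        # toy 2x2 LWIR grid (kelvin)
fused = upsample_to_grid(thermal, (4, 4))   # match a 4x4 multispectral tile
```

After resampling, the thermal band can be stacked with the multispectral bands into a single per-pixel feature array.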
Miljana Marković, Branislav Živaljević, G. Mimić, Sean Woznicki, Oskar Marko, P. Lugonja
Information on crop harvest events has become a valuable input for models related to food security and agricultural management and optimization. Precise large-scale harvest detection depends on temporal resolution and satellite image availability. Synthetic Aperture Radar (SAR) data are more suitable than optical data, since the images are not affected by clouds. This study compares two methods for harvest detection of soybean in Vojvodina province (Serbia), using the C-band of Sentinel-1. The first method uses the maximum difference of ascending VH polarization backscatter (σVH) between consecutive dates of observation. The second method uses a Radar Vegetation Index (RVI) threshold value of 0.39, optimized to minimize the Mean Absolute Error (MAE). The training data consisted of the mean values of 50 m point buffers with ground-truth harvest dates (n=100) from the 2018 and 2019 growing seasons. The first method showed better performance, with a Pearson correlation coefficient r=0.85 and MAE=5 days, whereas the calculated metrics for the RVI threshold method were r=0.69 and MAE=8 days. Therefore, validation was performed only for the method of maximum VH backscatter difference, for which the mean values of parcels with ground-truth harvest dates for 2020 generated the validation dataset (n=67). Performance metrics (r=0.83 and MAE=3 days) confirmed the suitability of the method for accurate harvest detection. Ultimately, a soybean harvest map was generated at the parcel level for Vojvodina province.
"Using Sentinel-1 data for soybean harvest detection in Vojvodina province, Serbia" (Miljana Marković, Branislav Živaljević, G. Mimić, Sean Woznicki, Oskar Marko, P. Lugonja). DOI: 10.1117/12.2679417. Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV, 2023-10-20.
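The first method above (maximum change in σVH between consecutive acquisitions) can be sketched as follows. Whether harvest appears as a rise or a drop in VH backscatter depends on crop and soil conditions, so this illustration uses the maximum absolute change; the buffer averaging and date handling are simplified:

```python
import numpy as np

def detect_harvest(dates, sigma_vh):
    """Return the acquisition date with the largest change in VH
    backscatter relative to the previous date.

    A sketch of the paper's first method; the maximum *absolute*
    consecutive difference is used here as a neutral choice, and the
    change is attributed to the later of the two acquisitions.
    """
    diffs = np.abs(np.diff(np.asarray(sigma_vh, dtype=float)))
    i = int(np.argmax(diffs))
    return dates[i + 1]

dates = ["09-01", "09-07", "09-13", "09-19", "09-25"]
vh_db = [-15.2, -15.0, -14.8, -11.1, -11.3]   # sharp jump after 09-13
print(detect_harvest(dates, vh_db))           # 09-19
```

With a 6-day Sentinel-1 revisit, this pins the harvest event to one revisit interval, consistent with the MAE of a few days reported above.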
With recent advancements in water-quality analytical technology and the increasing popularity of the Internet of Things (IoT), the market demand for compact and durable automated water-quality monitoring devices has grown substantially. However, most existing online monitoring devices feature a single light source to monitor turbidity and chemical oxygen demand (COD), two critical indicators of natural water bodies; these measurements tend to be influenced by interfering substances, which limits the devices' ability to measure more complex water-quality parameters. To address these issues, a new modularized water-quality monitoring device equipped with multiple light sources (UV/VIS/NIR) has been designed and implemented. This device can measure the photo-intensity of scattered, transmitted, and reference light simultaneously and can be coupled with different water-quality prediction models to provide accurate estimates for tap water (turbidity<2 NTU, relative error <17.8%), environmental samples (turbidity<400 NTU, relative error <2.3%), and industrial wastewater (COD<300 mg/L, relative error <17.6%). The study results suggest that the newly designed optical module can effectively monitor turbidity and COD in different water samples and provide alerts for water treatment at high concentrations, thereby enabling automated water-quality monitoring in the future.
"Design and development of an innovative online modular device for both water and wastewater monitoring" (Chen-Hua Chu, Yu-Xuan Lin, Chun-Kuo Liu). DOI: 10.1117/12.2683680. Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV, 2023-10-20.
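The ratiometric principle behind such a multi-channel optical module can be illustrated with a toy calibration: turbidity is estimated from the scattered-to-transmitted intensity ratio, with the reference channel used to normalize out source drift before the ratio is formed. The calibration points and the linear model below are invented for illustration and are not the device's actual calibration:

```python
import numpy as np

# Hypothetical calibration: scattering/transmission intensity ratio
# (reference-normalized) against reference turbidity values.
cal_ratio = np.array([0.02, 0.10, 0.25, 0.50])   # I_scatter / I_transmit
cal_ntu = np.array([1.0, 5.0, 12.5, 25.0])       # turbidity standards (NTU)

slope, intercept = np.polyfit(cal_ratio, cal_ntu, 1)

def predict_ntu(i_scatter, i_transmit):
    """Linear turbidity estimate from reference-normalized intensities."""
    ratio = i_scatter / i_transmit
    return slope * ratio + intercept

est = predict_ntu(0.25, 1.0)
rel_err = abs(est - 12.5) / 12.5   # relative error against the standard
```

In practice the device pairs such per-range models with its UV/VIS/NIR channels to stay within the relative-error bounds quoted in the abstract.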
Massimo Rippa, Andrea Pasqualini, Pasquale Mormile, Catello Pane
Infrared imaging is a well-known non-invasive technology that has gained great interest in the precision agriculture field in recent years. Plants are subjected to a wide range of biotic stresses caused by pathogenic bacteria, fungi, nematodes, and viruses that reduce productivity. In this work, wild rocket (Diplotaxis tenuifolia) plants inoculated with the soil-borne pathogens Rhizoctonia solani Kühn, Sclerotinia sclerotiorum (Lib.), and Fusarium oxysporum f. sp. raphani were monitored daily in the laboratory by means of infrared imaging. Plant monitoring was performed with both active and passive approaches. The results obtained showed that the infrared imaging methods tested are promising for early diagnosis of soil-borne diseases, allowing their detection a few days before they become detectable through visual analysis. These findings open up the possibility of developing new imaging systems for both proximal and remote sensing.
"Infrared imaging for proximal and remote detection of soil-borne diseases on wild rocket" (Massimo Rippa, Andrea Pasqualini, Pasquale Mormile, Catello Pane). DOI: 10.1117/12.2679125. Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV, 2023-10-20.
The aim of the study is to present results derived from monitoring the flood event on the Stryama River, located in Karlovo municipality, Bulgaria, which occurred on 02.09.2022 due to extremely intensive rainfall. During the flood event, rainfall reached up to 250 l/m2, and the water level of the Stryama River in some regions reached up to 3 meters within 8 hours. The Stryama River is situated in central Bulgaria, in the Plovdiv district; it rises in the Stara Planina mountains, and its length is 110 km. The methodology applied in this survey includes the use of Sentinel-2 MSI optical data and the Tasseled Cap Transformation (TCT) of selected satellite imagery for change detection and for estimating the territorial extent of areas affected by the flood waters. Satellite imagery from different temporal points, before and after the flood event, was chosen in order to track the water dynamics around the riverbed. The calculation of the spatial and temporal characteristics of the river waters was accomplished by segmenting Sentinel-2 multispectral imagery. The application of the Tasseled Cap Transformation matrix segments the optical images into three components: TCT-brightness, TCT-wetness, and TCT-greenness. On the basis of the TCT-wetness component and its values, the dynamics and territorial distribution of the river waters were monitored for the chosen temporal period. On the basis of the TCT-greenness component and the Normalized Differential Greenness Index (NDGI), an assessment of the impact of flood waters on the vegetated areas was made.
"Monitoring and spatial-temporal analysis of Stryama river flood event, Karlovo municipality, Bulgaria, occurred on 02.09.2022 by the methods of remote sensing" (Andrey Stoyanov). DOI: 10.1117/12.2684415. Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV, 2023-10-20.
Intelligent agriculture increasingly relies on modern technologies for reliable monitoring of crops and for timely detection of areas in which special intervention is needed to ensure the planned yields. This paper presents the results of a study of the possibilities of fusing and processing multi-spectral and thermal data from satellite systems and from remotely controlled platforms (drones) to support decision making in intelligent agriculture. The study examined images with different resolutions, such as those from the Sentinel-2, Landsat, and Planet Labs satellite systems, as well as multi-spectral images from the commercial drone DJI Phantom 4 Multispectral and thermal images (thermograms) from the DJI Mavic 2 Enterprise Advanced.
"Satellite and drone multispectral and thermal images data fusion for intelligent agriculture monitoring and decision making support" (Miroslav Y. Tsvetkov). DOI: 10.1117/12.2679922. Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV, 2023-10-20.
The present study aims to monitor the Snow Cover Extent (SCE) of the mountainous region of Bulgaria (13905 km2), located above 1000 m above sea level, over eight years. This information is important for the calculation of the Snow Water Equivalent (SWE), hydrological runoff modeling, forecasting, and assessing flood events. Global warming and climate change and their impacts, such as a constant increase in recorded high-temperature levels, frequent droughts, water scarcity in the summers, and less snowy winters, have a significant effect on agriculture, hydrology, forests, and ecology in Bulgaria. The present research uses the available cloudless optical data of Sentinel-2 MSI for snow cover monitoring, in view of the decrease in snow distribution during the last decade. Sentinel-2 satellite imagery from October to May, for the period between 2016 and 2023, was generated and exported from Google Earth Engine (GEE). The Normalized Differential Snow Index (NDSI) and the Snow Water Index (SWI) were calculated, and the resulting index rasters were post-processed and further inspected to obtain threshold-based classifications, masking out the areas covered by topographic shadows, water bodies, forests, etc., and to derive the snow cover area distribution. The results obtained in the study can be used and integrated for climate change observations and research at the local and regional levels.
"Application of optical data from Sentinel-2-MSI for snow cover monitoring on the territory of the mountainous region of Bulgaria" (Andrey Stoyanov, Daniela Avetisyan). DOI: 10.1117/12.2679774. Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV, 2023-10-20.
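The NDSI used in the study above has the standard normalized-difference form over the green and SWIR bands. The sketch below assumes Sentinel-2 B3 (green) and B11 (SWIR) and the commonly cited 0.4 snow threshold; the study itself derives its own threshold-based classifications per scene:

```python
import numpy as np

def ndsi(green, swir):
    """Normalized Difference Snow Index:
    (green - swir) / (green + swir), assuming Sentinel-2 B3 and B11."""
    green = np.asarray(green, dtype=float)
    swir = np.asarray(swir, dtype=float)
    total = green + swir
    return np.where(total != 0, (green - swir) / total, 0.0)

# Snow reflects strongly in the green band but absorbs in the SWIR,
# while soil/vegetation pixels have similar green and SWIR reflectance.
values = ndsi([0.75, 0.20], [0.08, 0.18])
snow = values > 0.4   # commonly used literature threshold (assumption)
```

Shadow, water, and forest masks are then applied on top of such a classification, as described in the abstract.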
Remote sensing drone technology is rapidly developing and is being used in a variety of ways, from seed sowing to disease management and maintenance. Crop growth environment analysis and crop prediction vary depending on the climate, soil environment, topography, and applied technology of the target area. Crop growth is a complex trait determined by various factors such as genotype, growing environment, and their interactions. To accurately predict growing conditions and growth, it is necessary to fundamentally understand the functional relationships between these interacting factors through data analysis. Interpretation of growth-related relationships requires both a comprehensive dataset and powerful algorithms in the model. This study aimed to use drone imaging and AI technology to develop a model for the cultivation status and growth prediction of various crops grown in the field. The developed model covered the entire process of drone image acquisition, image processing, AI algorithm application, and production of farmland information, crop status, and growth information. This paper presents the overall configuration of the growth prediction model and the results of the AI-based cultivation area extraction model developed in the first stage. Classifying cultivated crops by field is important for identifying the cultivated area and predicting yield. The development of drone remote sensing (RS) and AI technology has made it possible to precisely analyze the characteristics of field crops from images. The purpose of this study was to create and evaluate an AI-based cultivated crop classification model using the reflectance and texture characteristics of drone RGB images. The major crops cultivated during the crop classification survey period were kimchi cabbage, soybean, and rice. The texture features applied in this model are Haralick texture characteristics computed from the GLCM (Gray Level Co-occurrence Matrix).
A total of 8 factors were used to create the model: mean, variance, contrast, homogeneity, correlation, ASM (Angular Second Moment), homogeneity, and dissimilarity. Two AI models, SVC and RFC, were built in this study. For the SVC-based classification model, the hyperparameters C and gamma were set to 1.5 and 0.01, respectively, and a radial basis function (RBF) kernel was used. The cross-validation accuracy was 0.88 and the test set accuracy was 0.91. The maximum depth of the RFC-based classification model was set to 8 and the number of trees was set to 500. The cross-validation accuracy of the RFC-based model was 0.95, and the test set accuracy was 0.89. The learning time of the two models was 90 seconds for the SVC model and 7,200 seconds for the RFC model. The SVC-based classification model was evaluated as advantageous when considering classification accuracy and learning time. The findings of this study are expected to improve the precision of crop cultivation area identification using AI techn
{"title":"Development of drone acquisition imagery and AI-based field crop status and growth prediction model","authors":"Seung-Hwan Go, Jong-Hwa Park","doi":"10.1117/12.2684522","DOIUrl":"https://doi.org/10.1117/12.2684522","url":null,"abstract":"Remote sensing drone utilization technology is rapidly developing and is being used in a variety of ways, from seed sowing to disease management and maintenance. Crop growth environment analysis and crop prediction vary depending on the climate, soil environment, topography, and applied technology of the target area. Crop growth is a complex trait determined by various factors such as genotype, growing environment, and interactions. To accurately predict growth conditions and growth, it is necessary to fundamentally understand the functional relationship between these interaction factors through data analysis. Interpretation of growth-related relationships requires both a comprehensive dataset and powerful algorithms in the model. This study aimed to build a model using drone imaging and AI technology to develop a model for the cultivation status and growth prediction of various crops grown in the field. The development model included the entire process of drone image acquisition, image processing, AI algorithm application, farmland information, crop status, and growth information production. This paper presents the overall configuration for the construction of the growth prediction model and the results of the AI-based cultivation area extraction model conducted in the first stage. Classifying cultivated crops by field is important for identifying the cultivated area and predicting yield. The development of drone remote sensing (RS) and AI technology has made it possible to precisely analyze the characteristics of field crops with images. The purpose of this study was to create and evaluate an AI-based cultivated crop classification model using the reflectance and texture characteristics of drone RGB images. 
The major crops cultivated during the crop classification survey period were kimchi cabbage, soybean, and rice. The texture applied in this development model is the texture characteristic of Haralick using GLCM (Gray Level Co-occurrence Matrix). A total of 8 factors were used to create the model: mean, variance, contrast, homogeneity, correlation, ASM (Angular Second Moment), homogeneity, and dissimilarity. Two AI models, SVC and RFC, were built in this study. For the SVC-based classification model, the hyperparameters C and gamma were set to 1.5 and 0.01, respectively, and a radial basis function (RBF) kernel was used. The cross-validation accuracy was 0.88 and the test set accuracy was 0.91. The maximum depth of the RFC-based classification model was set to 8 and the number of trees was set to 500. The cross-validation accuracy of the RFC-based model was 0.95, and the test set accuracy was 0.89. The learning time of the two models was 90 seconds for the SVC model and 7,200 seconds for the RFC model. The SVC-based classification model was evaluated as advantageous when considering classification accuracy and learning time. The findings of this study are expected to improve the precision of crop cultivation area identification using AI techn","PeriodicalId":222517,"journal":{"name":"Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132939778","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}