
Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV: Latest Publications

Assessment of elevation and slope exposure impact on snow cover distribution in the mountainous region in Bulgaria using Sentinel-2 satellite data
Pub Date : 2023-10-20 DOI: 10.1117/12.2679770
Daniela Avetisyan, Andrey Stoyanov
Snow cover is among the most important features of the Earth's surface and a crucial element of the cryosphere that affects the global energy balance and the water and carbon cycles. Accurate monitoring of this land surface component is of particular significance, as snowmelt provides 50%–80% of the annual runoff in temperate (boreal) regions and significantly impacts the hydrological balance during the warm season. Limited reserves of soil moisture during the winter period can lead to all types of droughts, including green-water drought, which is expressed by reduced water storage in soil and vegetation. Green-water drought has variable effects across landscape components and on the functions and ecosystem services (ES) they provide. The present study aims to track snow cover dynamics in the transitional seasons of the year, when the snow cover is most unstable, and to differentiate its territorial distribution depending on elevation and slope exposure. The study covers the mountainous territories of Bulgaria over the seasons from 2016 to 2022. To achieve this aim, we used Sentinel-2 images and calculated the Snow Water Index (SWI). The SWI uses spectral characteristics of the visible, shortwave infrared (SWIR), and near-infrared (NIR) bands to distinguish snow and ice pixels from other pixels, including water bodies, which is crucial for accurate monitoring of snow cover dynamics. The obtained results were validated using VHR images for pre-selected test areas.
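A minimal sketch of how such an index can be computed from Sentinel-2 surface-reflectance bands, assuming one published SWI formulation that combines the green, NIR, and SWIR1 bands (B3, B8, B11); the file names, band choice, and the 0.21 snow threshold are illustrative assumptions, not values confirmed by the paper.

```python
import numpy as np
import rasterio

# Illustrative band file names (Sentinel-2 L2A, resampled to a common grid); adjust to the actual product.
PATHS = {"green": "B03.tif", "nir": "B08.tif", "swir": "B11.tif"}

def read_band(path):
    """Read a single-band reflectance raster as float32."""
    with rasterio.open(path) as src:
        return src.read(1).astype("float32")

green, nir, swir = (read_band(PATHS[k]) for k in ("green", "nir", "swir"))

# Snow Water Index (one published dual-ratio formulation combining green, NIR and SWIR):
# SWI = G * (NIR - SWIR) / ((G + NIR) * (NIR + SWIR))
eps = 1e-6  # avoid division by zero
swi = green * (nir - swir) / ((green + nir) * (nir + swir) + eps)

# Illustrative threshold: pixels above it are labelled snow/ice; water bodies fall below it.
snow_mask = swi > 0.21
print(f"Snow-covered fraction: {snow_mask.mean():.2%}")
```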
{"title":"Assessment of elevation and slope exposure impact on snow cover distribution in the mountainous region in Bulgaria using Sentinel-2 satellite data","authors":"Daniela Avetisyan, Andrey Stoyanov","doi":"10.1117/12.2679770","DOIUrl":"https://doi.org/10.1117/12.2679770","url":null,"abstract":"Snow cover is among the most important features of the Earth's surface and a crucial element of the cryosphere that affects the global energy balance, water, and carbon cycles. Accurate monitoring of this land surface component is of particular significance as snowmelt provides between 50%–80% of the annual runoff in the temperate (boreal) regions and significantly impacts the hydrological balance during the warm season. Limited reserves of soil moisture during the winter period can lead to all types of droughts, including green-water drought, which is expressed by reduced water storage in soil and vegetation. Green-water drought causes a variable effect across landscape components, on the functions and ecosystem services (ES) they provide. The present study aims to track the snow cover dynamics in the transitional seasons of the year when the snow cover is most unstable and to differentiate its territorial distribution depending on elevation and slope exposure. The study area covers the mountainous territories of Bulgaria and the seasons from 2016 to 2022. To achieve the aim of the study, we used Sentinel-2 images and calculated the Snow Water Index (SWI). SWI uses spectral characteristics of the visible, shortwave infrared (SWIR), and near-infrared (NIR) bands to distinguish snow and ice pixels from other pixels, including water bodies which is crucial for the accurate monitoring of snow cover dynamics. The obtained results were validated using VHR images for pre-selected test areas.","PeriodicalId":222517,"journal":{"name":"Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130894589","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Coupling multiscale remote and proximal sensors for the estimation of crop water requirements
Pub Date : 2023-10-20 DOI: 10.1117/12.2680125
E. Psomiadis, S. Alexandris, N. Proutsos, Ioannis Charalampopoulos
Precision agriculture has been at the cutting edge of research during the recent decade, aiming to reduce water consumption and ensure sustainability in agriculture. The present study aims to estimate the actual water requirements of crop fields based on the Crop Water Stress Index, combining multiple and multiscale data: infrared canopy temperature, air temperature, air relative humidity, and near-infrared and thermal infrared imagery acquired above the crop field with an innovative aerial micrometeorological station (AMMS), a multispectral and a thermal camera mounted on an Unmanned Aerial Vehicle (UAV), and satellite-derived thermal data. Moreover, ground micrometeorological stations (GMMS) were installed in each crop. The study area was situated in Trifilia (Peloponnese, Greece), and the experimentation was conducted on two different crops, potato and watermelon, which are representative cultivations of the area. For the potato field, the analysis showed that the amount of irrigation water supplied to the rhizosphere far exceeded the maximum crop needs, by about 394% more water than the maximum amount required by the crop. Finally, the correlation among the different remote and proximal sensors proved to be sufficiently high, while the correlation with the satellite data was moderate. The overall conclusion of this research is that proper irrigation water management is extremely necessary and the only solution for agricultural sustainability in the future. The increasing demand for freshwater, mainly for irrigation purposes, will inevitably lead to groundwater overexploitation and deterioration of the area's already affected and semi-brackish coastal aquifers.
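A minimal sketch of an empirical (Idso-type) Crop Water Stress Index computation from canopy temperature, air temperature, and vapour pressure deficit; the baseline coefficients, the fixed upper limit, and the sample readings are illustrative assumptions and are not values taken from the paper.

```python
import math

def sat_vapour_pressure_kpa(t_air_c: float) -> float:
    """Saturation vapour pressure (kPa), Tetens formula."""
    return 0.6108 * math.exp(17.27 * t_air_c / (t_air_c + 237.3))

def cwsi_empirical(t_canopy_c, t_air_c, rh_pct,
                   baseline_a=-2.0, baseline_b=-1.5, dt_upper=5.0):
    """
    Empirical (Idso-type) CWSI.
    Lower (non-water-stressed) baseline: dT_ll = a + b * VPD  [degC]
    Upper (non-transpiring) limit:       dT_ul = dt_upper      [degC]
    baseline_a, baseline_b and dt_upper are crop- and site-specific;
    the defaults here are placeholders, not values from the paper.
    """
    vpd = sat_vapour_pressure_kpa(t_air_c) * (1.0 - rh_pct / 100.0)
    dt = t_canopy_c - t_air_c
    dt_ll = baseline_a + baseline_b * vpd
    cwsi = (dt - dt_ll) / (dt_upper - dt_ll)
    return max(0.0, min(1.0, cwsi))  # clamp to [0, 1]

# Hypothetical midday readings from an aerial micrometeorological station
print(cwsi_empirical(t_canopy_c=31.5, t_air_c=29.0, rh_pct=45.0))
```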
{"title":"Coupling multiscale remote and proximal sensors for the estimation of crop water requirements","authors":"E. Psomiadis, S. Alexandris, N. Proutsos, Ioannis Charalampopoulos","doi":"10.1117/12.2680125","DOIUrl":"https://doi.org/10.1117/12.2680125","url":null,"abstract":"Precision agriculture has been at the cutting edge of research during the recent decade, aiming to reduce water consumption and ensure sustainability in agriculture. The present study aims to estimate the actual water requirements of crop fields based on the Crop Water Stress Index, combining multiple and multiscale data, such as infrared canopy temperature, air temperature, air relative humidity, near-infrared and thermal infrared image data, taken above the crop field using an innovative aerial micrometeorological station (AMMS), and two more compatible and advanced cameras, a multispectral and a thermal mounted in an Unmanned Aerial Vehicle (UAV), along with satellite-derived thermal data. Moreover, ground micrometeorological stations (GMMS) were installed in each crop. The study area was situated in Trifilia (Peloponnese, Greece) and the experimentation was conducted on two different crops, potato, and watermelon, which are representative cultivations of the area. The analysis of the results showed, in the case of the potato field, that the amount of irrigation water supplied in the rhizosphere far exceeds the maximum crop needs reaching values of about 394% more water than the maximum required amount needed by the crop. Finally, the correlation of the different remote and proximal sensors proved to be sufficiently high while the correlation with the satellite data was moderate. The overall conclusion of this research is that proper irrigation water management is extremely necessary and the only solution for agricultural sustainability in the future. The increasing demand for freshwater, mainly for irrigation purposes, will inevitably lead to groundwater overexploitation and deterioration of the area's already affected and semi-brackish coastal aquifers.","PeriodicalId":222517,"journal":{"name":"Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114217697","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Drone multispectral and thermal images data processing for intelligent agriculture support
Pub Date : 2023-10-20 DOI: 10.1117/12.2679928
Miroslav Y. Tsvetkov
Nowadays, modern technologies are increasingly accessible, cheaper, and easier to use. Among them are multisensor remotely controlled aerial platforms, the so-called "drones". This paper presents the results of a study on the possibilities of fusing and processing multispectral and thermal images from aerial drone sensors for intelligent agriculture support. Time series of multispectral images from the DJI Phantom 4 Multispectral and thermal images from the DJI Mavic 2 Enterprise Advanced over the same area are processed. A new methodology for the fusion and processing of drone multispectral and thermal (LWIR) image data for intelligent agriculture support is proposed.
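A minimal sketch of one common preparatory fusion step, resampling a lower-resolution thermal orthomosaic onto the multispectral grid so the bands can be stacked and analysed per pixel; the file names are placeholders, and the paper's actual fusion methodology may differ.

```python
import numpy as np
import rasterio
from rasterio.warp import reproject, Resampling

# Placeholder inputs: a multispectral orthomosaic and a thermal orthomosaic of the same field.
with rasterio.open("multispectral_ortho.tif") as ms, rasterio.open("thermal_ortho.tif") as tir:
    ms_data = ms.read().astype("float32")            # (bands, rows, cols)
    tir_on_ms_grid = np.empty(ms_data.shape[1:], dtype="float32")
    # Warp the thermal band onto the multispectral grid (CRS, transform, shape).
    reproject(
        source=tir.read(1).astype("float32"),
        destination=tir_on_ms_grid,
        src_transform=tir.transform, src_crs=tir.crs,
        dst_transform=ms.transform, dst_crs=ms.crs,
        resampling=Resampling.bilinear,
    )

# The co-registered stack (multispectral bands + thermal) can then feed per-pixel indices or classifiers.
fused = np.concatenate([ms_data, tir_on_ms_grid[None, ...]], axis=0)
print(fused.shape)
```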
{"title":"Drone multispectral and thermal images data processing for intelligent agriculture support","authors":"Miroslav Y. Tsvetkov","doi":"10.1117/12.2679928","DOIUrl":"https://doi.org/10.1117/12.2679928","url":null,"abstract":"Nowadays, modern technologies are more and more accessible, cheaper and easier to use. Part of them are the multisensor remotely controlled aerial platforms or the so-called “drones”. This paper presents the results of a research for the possibilities of mixing and processing multi-spectral and thermal images from aerial drone sensors for the intelligent agriculture support. Time series of multi-spectral images from the DJI Phantom 4 Multispectral and thermal images from the DJI Mavic 2 Enterprise Advanced over same area are processed. An approach of a new methodology for data fusion of drone multispectral and thermal (LWIR) images data processing for intelligent agriculture support is proposed.","PeriodicalId":222517,"journal":{"name":"Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133649880","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Using Sentinel-1 data for soybean harvest detection in Vojvodina province, Serbia
Pub Date : 2023-10-20 DOI: 10.1117/12.2679417
Miljana Marković, Branislav Živaljević, G. Mimić, Sean Woznicki, Oskar Marko, P. Lugonja
Information on crop harvest events has become valuable input for models related to food security and to agricultural management and optimization. Precise large-scale harvest detection depends on temporal resolution and satellite image availability. Synthetic Aperture Radar (SAR) data are more suitable than optical data, since the images are not affected by clouds. This study compares two methods for soybean harvest detection in Vojvodina province (Serbia) using the C-band of Sentinel-1. The first method identifies the maximum difference in ascending VH polarization backscatter (σVH) between consecutive observation dates. The second method uses a Radar Vegetation Index (RVI) threshold value of 0.39, optimized to minimize the Mean Absolute Error (MAE). The training data consisted of mean values within 50 m point buffers with ground-truth harvest dates (n=100) from the 2018 and 2019 growing seasons. The first method showed better performance, with Pearson correlation coefficient r=0.85 and MAE=5 days, whereas the calculated metrics for the RVI threshold method were r=0.69 and MAE=8 days. Therefore, validation was performed only for the maximum VH backscatter difference method, using parcel mean values with ground-truth harvest dates for 2020 as the validation dataset (n=67). Performance metrics (r=0.83 and MAE=3 days) confirmed the method's suitability for accurate harvest detection. Ultimately, a soybean harvest map was generated at the parcel level for Vojvodina province.
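A minimal sketch of the two detection ideas on a per-parcel backscatter time series: harvest assigned to the largest drop in σVH between consecutive dates, and a dual-polarization RVI threshold. The sample values are fabricated for illustration, and the RVI shown is one common dual-pol formulation (4·VH/(VV+VH)), not necessarily the exact variant used in the paper.

```python
import numpy as np

# Hypothetical per-parcel means of Sentinel-1 ascending backscatter (linear units) per acquisition date.
dates = np.array(["2020-09-05", "2020-09-11", "2020-09-17", "2020-09-23", "2020-09-29"],
                 dtype="datetime64[D]")
sigma_vh = np.array([0.030, 0.028, 0.027, 0.006, 0.005])   # sharp drop after harvest
sigma_vv = np.array([0.110, 0.105, 0.100, 0.075, 0.072])

# Method 1: harvest assigned to the interval with the largest decrease in VH between consecutive dates.
diffs = np.diff(sigma_vh)                 # negative values are decreases
i = int(np.argmin(diffs))                 # index of the steepest drop
print("Estimated harvest between", dates[i], "and", dates[i + 1])

# Method 2: dual-pol Radar Vegetation Index; harvest flagged when RVI falls below a tuned threshold.
rvi = 4 * sigma_vh / (sigma_vv + sigma_vh)
threshold = 0.39                          # threshold value reported in the abstract
below = np.nonzero(rvi < threshold)[0]
if below.size:
    print("RVI first below threshold on", dates[below[0]])
```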
{"title":"Using Sentinel-1 data for soybean harvest detection in Vojvodina province, Serbia","authors":"Miljana Marković, Branislav Živaljević, G. Mimić, Sean Woznicki, Oskar Marko, P. Lugonja","doi":"10.1117/12.2679417","DOIUrl":"https://doi.org/10.1117/12.2679417","url":null,"abstract":"Information on crop harvest events has become valuable input for models related to food security and agricultural management and optimization. Precise large scale harvest detection depends on temporal resolution and satellite images availability. Synthetic Aperture Radar (SAR) data are more suitable than optical, since the images are not affected by clouds. This study compares two methods for harvest detection of soybean in Vojvodina province (Serbia), using the C-band of Sentinel-1. The first method represents a maximum difference of ascending VH polarization backscatter (σVH) between consecutive dates of observation. The second method uses a Radar Vegetation Index (RVI) threshold value of 0.39, optimized to minimize Mean Absolute Error (MAE). The training data consisted of 50 m point buffers’ mean value with ground-truth harvest dates (n=100) from the 2018 and 2019 growing seasons. The first method showed better performance with Pearson correlation coefficient r=0.85 and MAE=5 days, whereas the calculated metrics for the RVI threshold method were r=0.69 and MAE=8 days. Therefore, validation was performed only for the method of maximum VH backscatter difference where mean values of parcels with ground-truth harvest dates for 2020 had generated the validation dataset (n=67). Performance metrics (r=0.83 and MAE=3 days) confirmed the suitability for accurate harvest detection. Ultimately, a soybean harvest map was generated on a parcel level for Vojvodina province.","PeriodicalId":222517,"journal":{"name":"Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133972755","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Design and development of an innovative online modular device for both water and wastewater monitoring
Pub Date : 2023-10-20 DOI: 10.1117/12.2683680
Chen-Hua Chu, Yu-Xuan Lin, Chun-Kuo Liu
With recent advancements in water-quality analytical technology and the increasing popularity of the Internet of Things (IoT), the market demand for compact and durable automated water-quality monitoring devices has grown substantially. However, most existing online monitoring devices feature a single light source for monitoring turbidity and chemical oxygen demand (COD), two critical indicators of natural water bodies; such devices tend to be influenced by interfering substances, which limits their ability to measure more complex water-quality parameters. To address these issues, a new modularized water-quality monitoring device equipped with multiple light sources (UV/VIS/NIR) has been designed and implemented. The device measures the intensity of scattered, transmitted, and reference light simultaneously and is coupled with different water-quality prediction models to provide accurate estimates for tap water (turbidity<2 NTU, relative error <17.8%), environmental samples (turbidity<400 NTU, relative error <2.3%), and industrial wastewater (COD<300 mg/L, relative error <17.6%). The results suggest that the newly designed optical module can effectively monitor turbidity and COD in different water samples and provide alerts for water treatment at high concentrations, thereby enabling automated water-quality monitoring in the future.
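A minimal sketch of the kind of prediction model such a device could be coupled with: a regression that maps reference-normalized scattered and transmitted intensities at several wavelengths to turbidity. The feature design, the synthetic calibration data, and the model choice are assumptions for illustration, not the device's actual calibration procedure.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic calibration set: each sample has scattered (S), transmitted (T) and reference (R)
# intensities for three light sources (UV / VIS / NIR), plus a lab-measured turbidity (NTU).
n = 200
turbidity = rng.uniform(0, 400, n)                       # NTU
features = []
for ntu in turbidity:
    s = 0.002 * ntu + rng.normal(0, 0.01, 3)             # scattering grows with turbidity
    t = np.exp(-0.004 * ntu) + rng.normal(0, 0.01, 3)    # transmission decays with turbidity
    r = np.ones(3)                                       # reference channel
    features.append(np.concatenate([s / r, t / r]))      # normalise by reference light
X = np.array(features)

# In practice separate models would likely be fitted per matrix (tap water, environmental, wastewater).
model = LinearRegression().fit(X, turbidity)
print("Calibration R^2:", round(model.score(X, turbidity), 3))
```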
{"title":"Design and development of an innovative online modular device for both water and wastewater monitoring","authors":"Chen-Hua Chu, Yu-Xuan Lin, Chun-Kuo Liu","doi":"10.1117/12.2683680","DOIUrl":"https://doi.org/10.1117/12.2683680","url":null,"abstract":"With recent advancements in water-quality analytical technology and the increasing popularity of the Internet of Things (IoT), the market demand for compact and durable automated water-quality monitoring devices has grown substantially. However, the current existing online monitoring devices mostly featuring a single-light source to monitor turbidity and chemical oxygen demand (COD), two critical indicators of natural water bodies, which tend to be influenced by interfering substances that limits their ability to measure more complex water-quality parameters. To address these issues, a new modularized water-quality monitoring device equipped with multi-light sources (UV/VIS/NIR) has been designed and implemented. This device can measure the photo-intensity of scattering, transmission, as well as reference light simultaneously, and coupled with different water-quality prediction models to provide accurate estimates for tap water (Turbidity<2 NTU, relative error < 17.8%), environmental sample (Turbidity<400 NTU, relative error < 2.3%) and industrial wastewater (COD<300 mg/L, relative error < 17.6%). The study results suggest the new designed optical module can effectively monitor turbidity and COD in different water samples and provide alerts for water treatment in high concentration, thereby enabling automated water quality monitoring in future.","PeriodicalId":222517,"journal":{"name":"Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV","volume":"259 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122712231","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Infrared imaging for proximal and remote detection of soil-borne diseases on wild rocket
Pub Date : 2023-10-20 DOI: 10.1117/12.2679125
Massimo Rippa, Andrea Pasqualini, Pasquale Mormile, Catello Pane
Infrared imaging is a well-known non-invasive technology that in recent years has gained great interest in precision agriculture field. Plants are subjected to a wide range of biotic stresses caused by pathogenic bacteria, fungi, nematodes, and viruses that reduces productivity. In this work wild rocket (Diplotaxis tenuifolia) plants inoculated with the soil-borne pathogens Rhizoctonia solani Kühn, Sclerotinia sclerotiorum (Lib.) and Fusarium oxysporum f. sp. raphani were monitored daily in laboratory by means of the infrared imaging. Plant monitoring was performed with both active and passive approaches. The results obtained showed that the infrared imaging methods tested are promising for early diagnosis of soil-borne diseases by allowing their detection a few days before they are detectable through a visual analysis. These findings open up the possibility of developing new imaging systems for both proximal and remote sensing.
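A minimal sketch of one simple passive-thermography statistic often used in plant disease studies, the maximum temperature difference (MTD) within the plant region of a thermogram; the synthetic thermogram, the trivial plant mask, and the idea of tracking MTD day by day are illustrative assumptions, not the paper's specific protocol.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 2-D thermogram (degC) of a single plant: healthy tissue around 24 degC,
# with a small patch of stressed tissue a couple of degrees warmer.
thermogram = rng.normal(24.0, 0.2, size=(120, 160))
thermogram[40:55, 60:80] += 2.5                     # incipient infection site

# Simple plant mask (here: the whole frame; in practice derived from a co-registered RGB/NIR image).
plant_mask = np.ones_like(thermogram, dtype=bool)

leaf_temps = thermogram[plant_mask]
mtd = leaf_temps.max() - leaf_temps.min()           # maximum temperature difference over the plant
print(f"MTD = {mtd:.2f} degC")
# Tracking MTD (or similar heterogeneity metrics) daily can reveal infection before visual symptoms.
```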
{"title":"Infrared imaging for proximal and remote detection of soil-borne diseases on wild rocket","authors":"Massimo Rippa, Andrea Pasqualini, Pasquale Mormile, Catello Pane","doi":"10.1117/12.2679125","DOIUrl":"https://doi.org/10.1117/12.2679125","url":null,"abstract":"Infrared imaging is a well-known non-invasive technology that in recent years has gained great interest in precision agriculture field. Plants are subjected to a wide range of biotic stresses caused by pathogenic bacteria, fungi, nematodes, and viruses that reduces productivity. In this work wild rocket (Diplotaxis tenuifolia) plants inoculated with the soil-borne pathogens Rhizoctonia solani Kühn, Sclerotinia sclerotiorum (Lib.) and Fusarium oxysporum f. sp. raphani were monitored daily in laboratory by means of the infrared imaging. Plant monitoring was performed with both active and passive approaches. The results obtained showed that the infrared imaging methods tested are promising for early diagnosis of soil-borne diseases by allowing their detection a few days before they are detectable through a visual analysis. These findings open up the possibility of developing new imaging systems for both proximal and remote sensing.","PeriodicalId":222517,"journal":{"name":"Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV","volume":"71 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117303414","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Monitoring and spatial-temporal analysis of Stryama river flood event, Karlovo municipality, Bulgaria, occurred on 02.09.2022 by the methods of remote sensing
Pub Date : 2023-10-20 DOI: 10.1117/12.2684415
Andrey Stoyanov
The aim of the study is to present results from monitoring the flood event on the Stryama River, located in Karlovo municipality, Bulgaria, which occurred on 02.09.2022 due to extremely intensive rainfall. During the flood event the rainfall reached up to 250 l/m², and in some areas the water level of the Stryama River reached up to 3 meters in 8 hours. The Stryama River is situated in central Bulgaria, Plovdiv district; it rises in the Stara Planina mountains and is 110 km long. The applied methodology includes the use of Sentinel-2 MSI optical data and the Tasseled Cap Transformation (TCT) of selected satellite imagery for change detection and for estimating the territorial extent of the areas affected by the flood waters. Satellite images from different dates before and after the flood event were chosen in order to track the water dynamics around the riverbed. The calculation of the spatial and temporal characteristics of the river waters was accomplished by segmenting Sentinel-2 multispectral imagery. The application of the Tasseled Cap Transformation matrix segments the optical images into three components: TCT-brightness, TCT-wetness, and TCT-greenness. On the basis of the TCT-wetness component and its values, the dynamics and territorial distribution of the river waters were monitored for the chosen temporal period. On the basis of the TCT-greenness component and the Normalized Differential Greenness Index (NDGI), an assessment of the impact of the flood waters on the vegetated areas was made.
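A minimal sketch of how a Tasseled Cap Transformation is applied as a fixed linear combination of the optical bands. The coefficient matrix below is a placeholder (published Sentinel-2 TCT coefficients should be substituted) and the band stack is a hypothetical input, so this only illustrates the mechanics, not the study's exact parameters.

```python
import numpy as np

# Hypothetical Sentinel-2 reflectance stack, shape = (bands, rows, cols),
# with bands ordered e.g. B2, B3, B4, B8, B11, B12.
bands = np.random.default_rng(2).uniform(0.0, 0.4, size=(6, 100, 100)).astype("float32")

# Placeholder TCT coefficient matrix (rows: brightness, greenness, wetness; columns: bands).
# Replace with published Sentinel-2 Tasseled Cap coefficients before real use.
tct_coeffs = np.array([
    [0.3, 0.3, 0.3, 0.6, 0.4, 0.2],       # brightness
    [-0.3, -0.2, -0.5, 0.7, 0.0, -0.2],   # greenness
    [0.1, 0.2, 0.3, 0.3, -0.7, -0.5],     # wetness
], dtype="float32")

# Apply the transformation per pixel: (3, bands) @ (bands, rows*cols) -> (3, rows*cols).
flat = bands.reshape(bands.shape[0], -1)
tct = (tct_coeffs @ flat).reshape(3, *bands.shape[1:])
brightness, greenness, wetness = tct

# Flooded pixels can then be delineated by thresholding the wetness component
# and comparing pre- and post-event images.
flood_candidate = wetness > np.percentile(wetness, 95)
print(flood_candidate.sum(), "candidate pixels")
```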
{"title":"Monitoring and spatial-temporal analysis of Stryama river flood event, Karlovo municipality, Bulgaria, occurred on 02.09.2022 by the methods of remote sensing","authors":"Andrey Stoyanov","doi":"10.1117/12.2684415","DOIUrl":"https://doi.org/10.1117/12.2684415","url":null,"abstract":"The aim of the study is to present results derived from monitoring the flood event on river Stryama, located in Karlovo municipality, Bulgaria occurred on 02.09.2022 due to extreme intensive rainfalls. During the flood event the rainfall had increased up to 250 l/m2, and the water level of the Stryama River in some regions had reached up to 3 meters in 8 hours. Stryama river is situated in central Bulgaria, Plovdiv district, it springs up from Stara planina mountain and its length is 110 km. The applied methodology in the following survey includes use of Sentinel-2 MSI optical data and Tasseled Cap Transformation (TCT) of selected satellite imagery for change detection and estimating the territorial extent of areas affected by the flood waters. Satellite imagery of different temporal points were chosen before and after the flood event in order to track the water dynamics around the riverbed. The calculation of the spatial and temporal characteristics of the river waters were accomplished by segmenting of Sentinel-2 multispectral imagery. The application of the matrix for Tasseled Cap Transformation segments the optical images in 3 components: TCT-brightness, TCT-wetness, TCT-greenness. On the basis of TCT-wetness component and its values the dynamics and territorial distribution of river waters were monitored for the chosen temporal period. On the basis of TCT-greenness component and Normalized Differential Greenness Index (NDGI) an assessment of the impact of flood waters on the vegetated areas was made.","PeriodicalId":222517,"journal":{"name":"Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130080749","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Satellite and drone multispectral and thermal images data fusion for intelligent agriculture monitoring and decision making support
Pub Date : 2023-10-20 DOI: 10.1117/12.2679922
Miroslav Y. Tsvetkov
Intelligent agriculture increasingly relies on modern technologies for reliable monitoring of crops and timely detection of areas in which special intervention is needed to ensure the planned yields. This paper presents the results of a study of the possibilities for fusing and processing multispectral and thermal data from satellite systems and from remotely controlled platforms (drones) to support decision making in intelligent agriculture. The study examined images with different resolutions, such as those from the Sentinel-2, Landsat, and Planet Labs satellite systems, as well as multispectral images from the commercial DJI Phantom 4 Multispectral drone and thermal images (thermograms) from the DJI Mavic 2 Enterprise Advanced.
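A minimal sketch of one common step when combining observations at very different resolutions: aggregating a fine drone-derived NDVI map onto a coarser satellite-like pixel grid so the two sources can be compared or fused per satellite pixel. The 10 cm and 10 m grid sizes and the arrays are illustrative assumptions, not the study's actual data.

```python
import numpy as np

# Hypothetical drone NDVI mosaic at 0.10 m ground sampling distance covering one field.
drone_ndvi = np.random.default_rng(3).uniform(0.2, 0.9, size=(1000, 1000)).astype("float32")

# Aggregate to a 10 m grid (100 x 100 drone pixels per satellite pixel) by block averaging.
block = 100
rows, cols = (s // block for s in drone_ndvi.shape)
coarse = drone_ndvi[: rows * block, : cols * block]
drone_ndvi_10m = coarse.reshape(rows, block, cols, block).mean(axis=(1, 3))

# The aggregated map can now be compared pixel-by-pixel with a Sentinel-2 NDVI of the same field,
# e.g. to cross-calibrate the two sources before fusing them with thermal data.
print(drone_ndvi_10m.shape)
```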
{"title":"Satellite and drone multispectral and thermal images data fusion for intelligent agriculture monitoring and decision making support","authors":"Miroslav Y. Tsvetkov","doi":"10.1117/12.2679922","DOIUrl":"https://doi.org/10.1117/12.2679922","url":null,"abstract":"Intelligent agriculture increasingly relies on modern technologies for reliable monitoring of crops and timely detection of areas in which special intervention is needed to ensure the planned yields. This paper represents the results of a study of the possibilities of fusion and processing of multi-spectral and thermal data from satellite systems and such from remotely controlled platforms (drones) for the decision making support in intelligent agriculture. The study examined images with different resolutions, such as from the Sentinel-2, Landsat and Planet Labs satellite systems, as well as multi-spectral images from the commercial drones DJI Phantom 4 Multispectral and thermal images (thermograms) from the DJI Mavic 2 Enterprise Advanced.","PeriodicalId":222517,"journal":{"name":"Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116744367","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Application of optical data from Sentinel-2-MSI for snow cover monitoring on the territory of the mountainous region of Bulgaria
Pub Date : 2023-10-20 DOI: 10.1117/12.2679774
Andrey Stoyanov, Daniela Avetisyan
The present study aims to monitor the Snow Cover Extent (SCE) of the mountainous region of Bulgaria (13905 km², the areas lying above 1000 m above sea level) over eight years. This information is important for the calculation of Snow Water Equivalent (SWE), hydrological runoff modeling, forecasting, and the assessment of flood events. Global warming and climate change impacts, such as a constant increase in recorded high temperatures, frequent droughts, water scarcity in summer, and winters with less snow, have a significant effect on agriculture, hydrology, forests, and ecology in Bulgaria. The present research uses the available cloud-free Sentinel-2 MSI optical data for snow cover monitoring in view of the decrease in snow distribution during the last decade. Sentinel-2 satellite imagery from October to May, for the period between 2016 and 2023, was generated and exported from Google Earth Engine (GEE). The Normalized Differential Snow Index (NDSI) and Snow Water Index (SWI) were calculated, and the resulting index rasters were post-processed and additionally inspected to obtain threshold classifications, masking out areas covered by topographic shadows, water bodies, forests, etc., and to derive the snow cover area distribution. The results obtained in the study can be used and integrated for climate change observations and research at the local and regional levels.
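A minimal sketch of NDSI-based snow classification from the Sentinel-2 green and SWIR1 bands with a simple threshold and an external exclusion mask; the 0.4 threshold is the value commonly used in the snow-mapping literature and the arrays and mask are placeholders, so the study's tuned thresholds and masking layers may differ.

```python
import numpy as np

rng = np.random.default_rng(4)

# Placeholder Sentinel-2 reflectance arrays (B3 = green, B11 = SWIR1) for a mountain scene.
green = rng.uniform(0.05, 0.9, size=(500, 500)).astype("float32")
swir1 = rng.uniform(0.02, 0.4, size=(500, 500)).astype("float32")

# Normalized Difference Snow Index.
ndsi = (green - swir1) / (green + swir1 + 1e-6)

# Placeholder mask of pixels to exclude (water bodies, dense forest, topographic shadow).
exclude_mask = np.zeros_like(ndsi, dtype=bool)

# Commonly used NDSI threshold for snow; studies often refine it per region and season.
snow = (ndsi > 0.4) & ~exclude_mask
snow_area_km2 = snow.sum() * (10 * 10) / 1e6        # 10 m pixels -> km²
print(f"Snow cover extent: {snow_area_km2:.2f} km²")
```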
{"title":"Application of optical data from Sentinel-2-MSI for snow cover monitoring on the territory of the mountainous region of Bulgaria","authors":"Andrey Stoyanov, Daniela Avetisyan","doi":"10.1117/12.2679774","DOIUrl":"https://doi.org/10.1117/12.2679774","url":null,"abstract":"The present study aims to monitor the Snow Cover Extent (SCE) of the mountainous region of Bulgaria (13905 km2), located 1000m above sea level, for eight years. Information is important for calculation of Snow Water Equivalent (SWE), hydrological runoff modeling, forecasting, and assessing flood events. Global Warming and Climate Change and their impacts, such as a constant increase in recorded high-temperature levels, frequent droughts, water scarcity in the summers, and less-snow winters, have a significant effect on agriculture, hydrology, forests, and ecology in Bulgaria. The present research uses the available cloudless optical data of Sentinel-2-MSI for snow cover monitoring concerning the decrease in snow distribution during the last decade. Sentinel-2 satellite imagery, from October to May, for the period between 2016 and 2023, was generated and exported from Google Earth Engine (GEE). Normalized Differential Snow Index (NDSI) and Snow Water Index (SWI) were calculated, and the resulting output indices rasters were post-processed and inspected additionally to obtain thresholding classifications, masking out the areas covered by shadows (topographic), water bodies, forests, etc., and snow cover area distribution. The results obtained in the study can be used and integrated for climate change observations and research at the local and regional levels.","PeriodicalId":222517,"journal":{"name":"Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV","volume":"163 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125911902","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Development of drone acquisition imagery and AI-based field crop status and growth prediction model
Pub Date : 2023-10-20 DOI: 10.1117/12.2684522
Seung-Hwan Go, Jong-Hwa Park
Remote sensing drone technology is rapidly developing and is being used in a variety of ways, from seed sowing to disease management and maintenance. Crop growth environment analysis and crop prediction vary depending on the climate, soil environment, topography, and applied technology of the target area. Crop growth is a complex trait determined by various factors such as genotype, growing environment, and their interactions. To accurately predict growth conditions and growth, it is necessary to fundamentally understand the functional relationship between these interacting factors through data analysis. Interpretation of growth-related relationships requires both a comprehensive dataset and powerful algorithms in the model. This study aimed to use drone imaging and AI technology to develop a model for the cultivation status and growth prediction of various crops grown in the field. The developed model covered the entire process of drone image acquisition, image processing, AI algorithm application, and the production of farmland information, crop status, and growth information. This paper presents the overall configuration for the construction of the growth prediction model and the results of the AI-based cultivation area extraction model developed in the first stage. Classifying cultivated crops by field is important for identifying the cultivated area and predicting yield. The development of drone remote sensing (RS) and AI technology has made it possible to precisely analyze the characteristics of field crops with images. The purpose of this study was to create and evaluate an AI-based cultivated crop classification model using the reflectance and texture characteristics of drone RGB images. The major crops cultivated during the crop classification survey period were kimchi cabbage, soybean, and rice. The texture features applied in this model are Haralick texture characteristics derived from the GLCM (Gray Level Co-occurrence Matrix). A total of 8 factors were used to create the model: mean, variance, contrast, homogeneity, correlation, ASM (Angular Second Moment), homogeneity, and dissimilarity. Two AI models, SVC and RFC, were built in this study. For the SVC-based classification model, the hyperparameters C and gamma were set to 1.5 and 0.01, respectively, and a radial basis function (RBF) kernel was used. The cross-validation accuracy was 0.88 and the test set accuracy was 0.91. The maximum depth of the RFC-based classification model was set to 8 and the number of trees was set to 500. The cross-validation accuracy of the RFC-based model was 0.95, and the test set accuracy was 0.89. The learning time of the two models was 90 seconds for the SVC model and 7,200 seconds for the RFC model. The SVC-based classification model was evaluated as advantageous when considering classification accuracy and learning time. The findings of this study are expected to improve the precision of crop cultivation area identification using AI technology.
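A minimal sketch of GLCM texture feature extraction and the two classifiers with the hyperparameters reported in the abstract (SVC: RBF kernel, C=1.5, gamma=0.01; RFC: 500 trees, maximum depth 8). The patch data, class labels, and exact feature list are placeholders (simple first-order mean and variance stand in for the paper's GLCM-based versions), so results on this synthetic input are not meaningful.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)

def glcm_features(patch_8bit: np.ndarray) -> np.ndarray:
    """A few Haralick/GLCM texture features for one grayscale patch (values 0-255)."""
    glcm = graycomatrix(patch_8bit, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "correlation", "ASM", "dissimilarity"]
    tex = [graycoprops(glcm, p)[0, 0] for p in props]
    # Simple first-order statistics standing in for the mean/variance factors used in the paper.
    tex += [patch_8bit.mean(), patch_8bit.var()]
    return np.array(tex)

# Hypothetical training set: 300 grayscale patches from drone RGB mosaics, three crop classes.
patches = rng.integers(0, 256, size=(300, 32, 32), dtype=np.uint8)
labels = rng.integers(0, 3, size=300)          # 0: kimchi cabbage, 1: soybean, 2: rice
X = np.array([glcm_features(p) for p in patches])

# Classifiers with the hyperparameters reported in the abstract.
svc = SVC(kernel="rbf", C=1.5, gamma=0.01)
rfc = RandomForestClassifier(n_estimators=500, max_depth=8, random_state=0)

for name, model in [("SVC", svc), ("RFC", rfc)]:
    scores = cross_val_score(model, X, labels, cv=5)
    print(name, "CV accuracy:", scores.mean().round(2))
```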
{"title":"Development of drone acquisition imagery and AI-based field crop status and growth prediction model","authors":"Seung-Hwan Go, Jong-Hwa Park","doi":"10.1117/12.2684522","DOIUrl":"https://doi.org/10.1117/12.2684522","url":null,"abstract":"Remote sensing drone utilization technology is rapidly developing and is being used in a variety of ways, from seed sowing to disease management and maintenance. Crop growth environment analysis and crop prediction vary depending on the climate, soil environment, topography, and applied technology of the target area. Crop growth is a complex trait determined by various factors such as genotype, growing environment, and interactions. To accurately predict growth conditions and growth, it is necessary to fundamentally understand the functional relationship between these interaction factors through data analysis. Interpretation of growth-related relationships requires both a comprehensive dataset and powerful algorithms in the model. This study aimed to build a model using drone imaging and AI technology to develop a model for the cultivation status and growth prediction of various crops grown in the field. The development model included the entire process of drone image acquisition, image processing, AI algorithm application, farmland information, crop status, and growth information production. This paper presents the overall configuration for the construction of the growth prediction model and the results of the AI-based cultivation area extraction model conducted in the first stage. Classifying cultivated crops by field is important for identifying the cultivated area and predicting yield. The development of drone remote sensing (RS) and AI technology has made it possible to precisely analyze the characteristics of field crops with images. The purpose of this study was to create and evaluate an AI-based cultivated crop classification model using the reflectance and texture characteristics of drone RGB images. The major crops cultivated during the crop classification survey period were kimchi cabbage, soybean, and rice. The texture applied in this development model is the texture characteristic of Haralick using GLCM (Gray Level Co-occurrence Matrix). A total of 8 factors were used to create the model: mean, variance, contrast, homogeneity, correlation, ASM (Angular Second Moment), homogeneity, and dissimilarity. Two AI models, SVC and RFC, were built in this study. For the SVC-based classification model, the hyperparameters C and gamma were set to 1.5 and 0.01, respectively, and a radial basis function (RBF) kernel was used. The cross-validation accuracy was 0.88 and the test set accuracy was 0.91. The maximum depth of the RFC-based classification model was set to 8 and the number of trees was set to 500. The cross-validation accuracy of the RFC-based model was 0.95, and the test set accuracy was 0.89. The learning time of the two models was 90 seconds for the SVC model and 7,200 seconds for the RFC model. The SVC-based classification model was evaluated as advantageous when considering classification accuracy and learning time. 
The findings of this study are expected to improve the precision of crop cultivation area identification using AI techn","PeriodicalId":222517,"journal":{"name":"Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132939778","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0