Yuanhui Zhu, Shakthi B. Murugesan, Ivone K. Masara, Soe W. Myint, Joshua B. Fisher
Wildfires are increasing in risk and prevalence. The most destructive wildfires in decades in Australia occurred in 2019–2020. However, developing effective models to characterize the likelihood of wildfire spread (susceptibility) and pre‐fire vegetation conditions remains a challenge. The recent launch of NASA's ECOSTRESS presents an opportunity to monitor fire dynamics at a high resolution of 70 m by measuring ecosystem stress and drought conditions preceding wildfires. We incorporated ECOSTRESS data, vegetation indices, rainfall, and topographic data as independent variables and fire events as dependent variables into machine learning algorithms applied to the historic Australian wildfires of 2019–2020. With these data, we predicted over 90% of all wildfire occurrences 1 week ahead of the events. Our models characterized vegetation conditions with a 3‐week time lag before wildfire events in the fourth week and predicted the probability of wildfire occurrence in the subsequent (fifth) week. ECOSTRESS water use efficiency (WUE) consistently emerged as the leading factor in all models predicting wildfires. Results suggest that pre‐fire vegetation was affected by wildfires in areas with WUE above 2 g C kg−1 H₂O at the 95% probability level. Additionally, the ECOSTRESS evaporative stress index and topographic slope were identified as significant contributors in predicting wildfire susceptibility. These results indicate significant potential for ECOSTRESS data to predict and analyze wildfires and emphasize the crucial role of drought conditions in wildfire events. The approaches and outcomes developed in this study can help policymakers, fire managers, and city planners assess, manage, prepare for, and mitigate wildfires in the future.
{"title":"Examining wildfire dynamics using ECOSTRESS data with machine learning approaches: the case of South‐Eastern Australia's black summer","authors":"Yuanhui Zhu, Shakthi B. Murugesan, Ivone K. Masara, Soe W. Myint, Joshua B. Fisher","doi":"10.1002/rse2.422","DOIUrl":"https://doi.org/10.1002/rse2.422","url":null,"abstract":"Wildfires are increasing in risk and prevalence. The most destructive wildfires in decades in Australia occurred in 2019–2020. However, there is still a challenge in developing effective models to understand the likelihood of wildfire spread (susceptibility) and pre‐fire vegetation conditions. The recent launch of NASA's ECOSTRESS presents an opportunity to monitor fire dynamics with a high resolution of 70 m by measuring ecosystem stress and drought conditions preceding wildfires. We incorporated ECOSTRESS data, vegetation indices, rainfall, and topographic data as independent variables and fire events as dependent variables into machine learning algorithms applied to the historic Australian wildfires of 2019–2020. With these data, we predicted over 90% of all wildfire occurrences 1 week ahead of these wildfire events. Our models identified vegetation conditions with a 3‐week time lag before wildfire events in the fourth week and predicted the probability of wildfire occurrences in the subsequent week (fifth week). ECOSTRESS water use efficiency (WUE) consistently emerged as the leading factor in all models predicting wildfires. Results suggest that the pre‐fire vegetation was affected by wildfires in areas with WUE above 2 g C kg<jats:sup>−1</jats:sup> H₂O at 95% probability level. Additionally, the ECOSTRESS evaporative stress index and topographic slope were identified as significant contributors in predicting wildfire susceptibility. These results indicate a significant potential for ECOSTRESS data to predict and analyze wildfires and emphasize the crucial role of drought conditions in wildfire events, as evident from ECOSTRESS data. Our approaches developed in this study and outcome can help policymakers, fire managers, and city planners assess, manage, prepare, and mitigate wildfires in the future.","PeriodicalId":21132,"journal":{"name":"Remote Sensing in Ecology and Conservation","volume":null,"pages":null},"PeriodicalIF":5.5,"publicationDate":"2024-11-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142588850","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Florence Erbs, Mike van der Schaar, Miriam Marmontel, Marina Gaona, Emiliano Ramalho, Michel André
For many species at risk, monitoring challenges related to low visual detectability and elusive behavior limit the use of traditional visual surveys to collect critical information, hindering the development of sound conservation strategies. Passive acoustics can cost‐effectively acquire long‐term terrestrial and underwater data. However, to extract valuable information from large datasets, automatic methods need to be developed, tested and applied. Combining passive acoustics with deep learning models, we developed a method to monitor the secretive Amazonian manatee over two consecutive flooded seasons in the Brazilian Amazon floodplains. Subsequently, we investigated vocal behavior parameters based on vocalization frequencies and temporal characteristics in the context of habitat use. A convolutional neural network model successfully detected Amazonian manatee vocalizations with an average precision of 0.98 on training data. Similar classification performance in terms of precision (range: 0.83–1.00) and recall (range: 0.97–1.00) was achieved for each year. Using this model, we evaluated manatee acoustic presence over a total of 226 days comprising recording periods in 2021 and 2022. Manatee vocalizations were consistently detected during both years, reaching 94% daily temporal occurrence in 2021 and up to 11 h a day with detections during peak presence. Manatee calls were characterized by a high emphasized frequency and a high repetition rate, and were mostly produced in rapid sequences. This vocal behavior strongly indicates an exchange between females and their calves. By combining passive acoustic monitoring with deep learning models, extending temporal monitoring and increasing species detectability, we demonstrated that the approach can be used to identify manatee core habitats according to seasonality. The combined method represents a reliable, cost‐effective, scalable ecological monitoring technique that can be integrated into long‐term, standardized survey protocols of aquatic species. It can considerably benefit the monitoring of inaccessible regions, such as the Amazonian freshwater systems, which are facing immediate threats from increased hydropower construction.
{"title":"Amazonian manatee critical habitat revealed by artificial intelligence‐based passive acoustic techniques","authors":"Florence Erbs, Mike van der Schaar, Miriam Marmontel, Marina Gaona, Emiliano Ramalho, Michel André","doi":"10.1002/rse2.418","DOIUrl":"https://doi.org/10.1002/rse2.418","url":null,"abstract":"For many species at risk, monitoring challenges related to low visual detectability and elusive behavior limit the use of traditional visual surveys to collect critical information, hindering the development of sound conservation strategies. Passive acoustics can cost‐effectively acquire terrestrial and underwater long‐term data. However, to extract valuable information from large datasets, automatic methods need to be developed, tested and applied. Combining passive acoustics with deep learning models, we developed a method to monitor the secretive Amazonian manatee over two consecutive flooded seasons in the Brazilian Amazon floodplains. Subsequently, we investigated the vocal behavior parameters based on vocalization frequencies and temporal characteristics in the context of habitat use. A Convolutional Neural Network model successfully detected Amazonian manatee vocalizations with a 0.98 average precision on training data. Similar classification performance in terms of precision (range: 0.83–1.00) and recall (range: 0.97–1.00) was achieved for each year. Using this model, we evaluated manatee acoustic presence over a total of 226 days comprising recording periods in 2021 and 2022. Manatee vocalizations were consistently detected during both years, reaching 94% daily temporal occurrence in 2021, and up to 11 h a day with detections during peak presence. Manatee calls were characterized by a high emphasized frequency and high repetition rate, being mostly produced in rapid sequences. This vocal behavior strongly indicates an exchange between females and their calves. Combining passive acoustic monitoring with deep learning models, and extending temporal monitoring and increasing species detectability, we demonstrated that the approach can be used to identify manatee core habitats according to seasonality. The combined method represents a reliable, cost‐effective, scalable ecological monitoring technique that can be integrated into long‐term, standardized survey protocols of aquatic species. It can considerably benefit the monitoring of inaccessible regions, such as the Amazonian freshwater systems, which are facing immediate threats from increased hydropower construction.","PeriodicalId":21132,"journal":{"name":"Remote Sensing in Ecology and Conservation","volume":null,"pages":null},"PeriodicalIF":5.5,"publicationDate":"2024-10-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142561964","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Juliette Picard, Maïalicah M. Nungi‐Pambu Dembi, Nicolas Barbier, Guillaume Cornu, Pierre Couteron, Eric Forni, Gwili Gibbon, Felix Lim, Pierre Ploton, Robin Pouteau, Paul Tresson, Tom van Loon, Gaëlle Viennois, Maxime Réjou‐Méchain
Tropical moist forests are not the homogeneous green carpet often illustrated in maps or considered by global models. They harbour a complex mixture of forest types organized at different spatial scales that can now be mapped more accurately thanks to remote sensing products and artificial intelligence. In this study, we built a large‐scale vegetation map of the north of Congo and assessed the environmental drivers of the main forest types, their structure, their floristic and functional composition and their faunistic composition. To build the map, we used Sentinel‐2 satellite images and recent deep learning architectures. We tested the effect of topographically determined water availability on vegetation type distribution by linking the map with a water drainage depth proxy (HAND, height above the nearest drainage index). We also described vegetation type structure and composition (floristic, functional and associated fauna) by linking the map with data from large inventories and derived from satellite images. We found that water drainage depth is a major driver of forest type distribution and that the different forest types are characterized by different structure, composition and functions, bringing new insights into their origins and successional dynamics. We discuss not only the crucial role of soil–water depth, but also the importance of consistently reproducing such maps through time to develop accurate monitoring of tropical forest types and functions, and we provide insights on particular forest types (Marantaceae forests and monodominant Gilbertiodendron forests) on which future studies should focus. Under the current context of global change, which is expected to trigger major forest structural and compositional changes in the tropics, an appropriate monitoring strategy for the spatio‐temporal dynamics of forest types and their associated floristic and faunistic composition would considerably help anticipate detrimental shifts.
{"title":"Combining satellite and field data reveals Congo's forest types structure, functioning and composition","authors":"Juliette Picard, Maïalicah M. Nungi‐Pambu Dembi, Nicolas Barbier, Guillaume Cornu, Pierre Couteron, Eric Forni, Gwili Gibbon, Felix Lim, Pierre Ploton, Robin Pouteau, Paul Tresson, Tom van Loon, Gaëlle Viennois, Maxime Réjou‐Méchain","doi":"10.1002/rse2.419","DOIUrl":"https://doi.org/10.1002/rse2.419","url":null,"abstract":"Tropical moist forests are not the homogeneous green carpet often illustrated in maps or considered by global models. They harbour a complex mixture of forest types organized at different spatial scales that can now be more accurately mapped thanks to remote sensing products and artificial intelligence. In this study, we built a large‐scale vegetation map of the North of Congo and assessed the environmental drivers of the main forest types, their forest structure, their floristic and functional compositions and their faunistic composition. To build the map, we used Sentinel‐2 satellite images and recent deep learning architectures. We tested the effect of topographically determined water availability on vegetation type distribution by linking the map with a water drainage depth proxy (HAND, height above the nearest drainage index). We also described vegetation type structure and composition (floristic, functional and associated fauna) by linking the map with data from large inventories and derived from satellite images. We found that water drainage depth is a major driver of forest type distribution and that the different forest types are characterized by different structure, composition and functions, bringing new insights about their origins and successional dynamics. We discuss not only the crucial role of soil–water depth, but also the importance of consistently reproducing such maps through time to develop an accurate monitoring of tropical forest types and functions, and we provide insights on peculiar forest types (Marantaceae forests and monodominant <jats:italic>Gilbertiodendron</jats:italic> forests) on which future studies should focus more. Under the current context of global change, expected to trigger major forest structural and compositional changes in the tropics, an appropriate monitoring strategy of the spatio‐temporal dynamics of forest types and their associated floristic and faunistic composition would considerably help anticipate detrimental shifts.","PeriodicalId":21132,"journal":{"name":"Remote Sensing in Ecology and Conservation","volume":null,"pages":null},"PeriodicalIF":5.5,"publicationDate":"2024-10-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142430421","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Sarah Smith‐Tripp, Nicholas C. Coops, Christopher Mulverhill, Joanne C. White, Sarah Gergel
Western North America has seen a recent dramatic increase in large and often high‐severity wildfires. After forest fire, understanding patterns of structural recovery is important, as recovery patterns impact critical ecosystem services. Continuous forest monitoring provided by satellite observations is particularly beneficial for capturing the pivotal post‐fire period when forest recovery begins. However, it is challenging to optimize optical satellite imagery to both interpolate current and extrapolate future forest structure and composition. We identified a need to understand how early spectral dynamics (5 years post‐fire) inform patterns of structural recovery after fire disturbance. To create these structural patterns, we collected metrics of forest structure using high‐density Remotely Piloted Aircraft System (RPAS) lidar (light detection and ranging). We employed a space‐for‐time substitution in the highly fire‐disturbed forests of interior British Columbia. In this region, we collected RPAS lidar and corresponding field plot data 5, 8, 11, 12, and 16 years post‐fire to predict structural attributes relevant to management, including percent bare ground, the proportion of coniferous trees, stem density, and basal area. We compared forest structural attributes with unique early spectral responses, or trajectories, derived from Landsat time series data in the 5 years after fire. A total of eight unique spectral recovery trajectories were identified from the spectral responses of seven vegetation indices (NBR, NDMI, NDVI, TCA, TCB, TCG, and TCW), and these described five distinct patterns of structural recovery captured with RPAS lidar. Two structural patterns covered more than 80% of the study area. Both patterns had strong coniferous regrowth, but one had a higher basal area with more bare ground, whereas the other had a high stem density but a low basal area and a higher deciduous proportion. Our approach highlights the ability to use early spectral responses to capture unique spectral trajectories and their associated distinct structural recovery patterns.
{"title":"Early spectral dynamics are indicative of distinct growth patterns in post‐wildfire forests","authors":"Sarah Smith‐Tripp, Nicholas C. Coops, Christopher Mulverhill, Joanne C. White, Sarah Gergel","doi":"10.1002/rse2.420","DOIUrl":"https://doi.org/10.1002/rse2.420","url":null,"abstract":"Western North America has seen a recent dramatic increase in large and often high‐severity wildfires. After forest fire, understanding patterns of structural recovery is important, as recovery patterns impact critical ecosystem services. Continuous forest monitoring provided by satellite observations is particularly beneficial to capture the pivotal post‐fire period when forest recovery begins. However, it is challenging to optimize optical satellite imagery to both interpolate current and extrapolate future forest structure and composition. We identified a need to understand how early spectral dynamics (5 years post‐fire) inform patterns of structural recovery after fire disturbance. To create these structural patterns, we collected metrics of forest structure using high‐density Remotely Piloted Aircraft (RPAS) lidar (light detection and ranging). We employed a space‐for‐time substitution in the highly fire‐disturbed forests of interior British Columbia. In this region, we collected RPAS lidar and corresponding field plot data 5‐, 8‐, 11‐,12‐, and 16‐years postfire to predict structural attributes relevant to management, including the percent bare ground, the proportion of coniferous trees, stem density, and basal area. We compared forest structural attributes with unique early spectral responses, or trajectories, derived from Landsat time series data 5 years after fire. A total of eight unique spectral recovery trajectories were identified from spectral responses of seven vegetation indices (NBR, NDMI, NDVI, TCA, TCB, TCG, and TCW) that described five distinct patterns of structural recovery captured with RPAS lidar. Two structural patterns covered more than 80% of the study area. Both patterns had strong coniferous regrowth, but one had a higher basal area with more bare ground and the other pattern had a high stem density, but a low basal area and a higher deciduous proportion. Our approach highlights the ability to use early spectral responses to capture unique spectral trajectories and their associated distinct structural recovery patterns.","PeriodicalId":21132,"journal":{"name":"Remote Sensing in Ecology and Conservation","volume":null,"pages":null},"PeriodicalIF":5.5,"publicationDate":"2024-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142245852","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Rodrigo V. Leite, Cibele Amaral, Christopher S. R. Neigh, Diogo N. Cosenza, Carine Klauberg, Andrew T. Hudak, Luiz Aragão, Douglas C. Morton, Shane Coffield, Tempest McCabe, Carlos A. Silva
Managing fuels is a key strategy for mitigating the negative impacts of wildfires on people and the environment. The use of satellite‐based Earth observation data has become an important tool for managers to optimize fuel treatment planning at regional scales. Fortunately, several new sensors have been launched in the last few years, providing novel opportunities to enhance fuel characterization. Herein, we summarize the potential improvements in fuel characterization at large scale (i.e., hundreds to thousands of km²) with high spatial and spectral resolution arising from the use of new spaceborne instruments with near‐global, freely available data. We identified sensors at spatial resolutions suitable for fuel treatment planning, featuring: lidar data for characterizing vegetation structure; hyperspectral sensors for retrieving chemical compounds and species composition; and dense time series derived from multispectral and synthetic aperture radar sensors for mapping phenology and moisture dynamics. We also highlight future hyperspectral and radar missions that will deliver valuable and complementary information for a new era of fuel load characterization from space. The data volume being generated may still challenge usability for a diverse group of stakeholders. Seamless cyberinfrastructure and community engagement are paramount to guarantee the use of these cutting‐edge datasets for fuel monitoring and wildland fire management across the world.
{"title":"Leveraging the next generation of spaceborne Earth observations for fuel monitoring and wildland fire management","authors":"Rodrigo V. Leite, Cibele Amaral, Christopher S. R. Neigh, Diogo N. Cosenza, Carine Klauberg, Andrew T. Hudak, Luiz Aragão, Douglas C. Morton, Shane Coffield, Tempest McCabe, Carlos A. Silva","doi":"10.1002/rse2.416","DOIUrl":"https://doi.org/10.1002/rse2.416","url":null,"abstract":"Managing fuels is a key strategy for mitigating the negative impacts of wildfires on people and the environment. The use of satellite‐based Earth observation data has become an important tool for managers to optimize fuel treatment planning at regional scales. Fortunately, several new sensors have been launched in the last few years, providing novel opportunities to enhance fuel characterization. Herein, we summarize the potential improvements in fuel characterization at large scale (i.e., hundreds to thousands of km<jats:sup>2</jats:sup>) with high spatial and spectral resolution arising from the use of new spaceborne instruments with near‐global, freely‐available data. We identified sensors at spatial resolutions suitable for fuel treatment planning, featuring: lidar data for characterizing vegetation structure; hyperspectral sensors for retrieving chemical compounds and species composition; and dense time series derived from multispectral and synthetic aperture radar sensors for mapping phenology and moisture dynamics. We also highlight future hyperspectral and radar missions that will deliver valuable and complementary information for a new era of fuel load characterization from space. The data volume that is being generated may still challenge the usability by a diverse group of stakeholders. Seamless cyberinfrastructure and community engagement are paramount to guarantee the use of these cutting‐edge datasets for fuel monitoring and wildland fire management across the world.","PeriodicalId":21132,"journal":{"name":"Remote Sensing in Ecology and Conservation","volume":null,"pages":null},"PeriodicalIF":5.5,"publicationDate":"2024-08-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141998774","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Jenny Bueno, Sarah E. Lester, Joshua L. Breithaupt, Sandra Brooke
The eastern oyster (Crassostrea virginica) is a coastal foundation species currently under threat from anthropogenic activities, both globally and in the Apalachicola Bay region of north Florida. Oysters provide numerous ecosystem services, and it is important to establish efficient and reliable methods for their effective monitoring and management. Traditional monitoring techniques, such as quadrat density sampling, can be labor‐intensive, destructive of both oysters and reefs, and may be spatially limited. In this study, we demonstrate how unoccupied aerial systems (UAS) can be used to efficiently generate high‐resolution geospatial oyster reef condition data over large areas. These data, with appropriate ground truthing and minimal destructive sampling, can be used to effectively monitor the size and abundance of oyster clusters on intertidal reefs. Utilizing structure‐from‐motion photogrammetry techniques to create three‐dimensional topographic models, we reconstructed the distribution, spatial density and size of oyster clusters on intertidal reefs in Apalachicola Bay. Ground truthing revealed 97% accuracy for cluster presence detection in the UAS products, and we confirmed that live oysters are predominantly located within clusters, supporting the use of cluster features to estimate oyster population status. We found a significant positive relationship between cluster size and live oyster counts. These findings allowed us to extract clusters from the geospatial products and predict live oyster abundance and spatial density on 138 reefs covering 138 382 m² across two locations. Oyster densities varied between sites, with higher live oyster densities occurring at one site within the Apalachicola Bay bounds and lower densities in areas adjacent to Apalachicola Bay. Repeated monitoring at one site in 2022 and 2023 revealed a relatively stable oyster density over time. This study demonstrated the successful application of high‐resolution drone imagery combined with cluster sampling, providing a repeatable method for mapping and monitoring to inform conservation, restoration and management strategies for intertidal oyster populations.
{"title":"The application of unoccupied aerial systems (UAS) for monitoring intertidal oyster density and abundance","authors":"Jenny Bueno, Sarah E. Lester, Joshua L. Breithaupt, Sandra Brooke","doi":"10.1002/rse2.417","DOIUrl":"https://doi.org/10.1002/rse2.417","url":null,"abstract":"The eastern oyster (<jats:italic>Crassostrea virginica</jats:italic>) is a coastal foundation species currently under threat from anthropogenic activities both globally and in the Apalachicola Bay region of north Florida. Oysters provide numerous ecosystem services, and it is important to establish efficient and reliable methods for their effective monitoring and management. Traditional monitoring techniques, such as quadrat density sampling, can be labor‐intensive, destructive of both oysters and reefs, and may be spatially limited. In this study, we demonstrate how unoccupied aerial systems (UAS) can be used to efficiently generate high‐resolution geospatial oyster reef condition data over large areas. These data, with appropriate ground truthing and minimal destructive sampling, can be used to effectively monitor the size and abundance of oyster clusters on intertidal reefs. Utilizing structure‐from‐motion photogrammetry techniques to create three‐dimensional topographic models, we reconstructed the distribution, spatial density and size of oyster clusters on intertidal reefs in Apalachicola Bay. Ground truthing revealed 97% accuracy for cluster presence detection by UAS products and we confirmed that live oysters are predominately located within clusters, supporting the use of cluster features to estimate oyster population status. We found a positive significant relationship between cluster size and live oyster counts. These findings allowed us to extract clusters from geospatial products and predict live oyster abundance and spatial density on 138 reefs covering 138 382 m<jats:sup>2</jats:sup> over two locations. Oyster densities varied between sites, with higher live oyster densities occurring at one site within the Apalachicola Bay bounds, and lower oyster densities in areas adjacent to Apalachicola Bay. Repeated monitoring at one site in 2022 and 2023 revealed a relatively stable oyster density over time. This study demonstrated the successful application of high‐resolution drone imagery combined with cluster sampling, providing a repeatable method for mapping and monitoring to inform conservation, restoration and management strategies for intertidal oyster populations.","PeriodicalId":21132,"journal":{"name":"Remote Sensing in Ecology and Conservation","volume":null,"pages":null},"PeriodicalIF":5.5,"publicationDate":"2024-08-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141980647","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Chiara Aquino, Edward T. A. Mitchard, Iain M. McNicol, Harry Carstairs, Andrew Burt, Beisit L. P. Vilca, Sylvia Mayta, Mathias Disney
Selective logging is known to be widespread in the tropics but is currently very poorly mapped, in part because there are few quantitative data on which satellite sensor characteristics and analysis methods are best at detecting it. To improve this, we used data from the Tropical Forest Degradation Experiment (FODEX) plots in the southern Peruvian Amazon, where different numbers of trees had been removed from four plots of 1 ha each, carefully inventoried by hand and terrestrial laser scanning before and after the logging to give a range of biomass loss (∆AGB) values. We conducted a comparative study of six multispectral optical satellite sensors at 0.3–30 m spatial resolution to find the best combination of sensor and remote sensing indicator for change detection. Spectral reflectance, the normalised difference vegetation index (NDVI) and texture parameters were extracted after radiometric calibration and image preprocessing. The strength of the relationships between the change in these values and field‐measured ∆AGB (computed in % ha−1) was analysed. The results demonstrate that: (a) texture measures correlate more strongly with ∆AGB than simple spectral parameters; (b) the strongest correlations are achieved for sensors with spatial resolutions in the intermediate range (1.5–10 m), with finer or coarser resolutions producing worse results; and (c) correlations are strongest when texture is computed using a moving square window between 9 and 14 m in length. Maps predicting ∆AGB showed very promising results using a NIR‐derived texture parameter for 3 m resolution PlanetScope (R² = 0.97 and root mean square error (RMSE) = 1.91% ha−1), followed by 1.5 m SPOT‐7 (R² = 0.76 and RMSE = 5.06% ha−1) and 10 m Sentinel‐2 (R² = 0.79 and RMSE = 4.77% ha−1). Our findings imply that, at least for lowland Peru, low‐ to medium‐intensity disturbance can be detected best in optical wavelengths using a texture measure derived from 3 m PlanetScope data.
{"title":"Detecting selective logging in tropical forests with optical satellite data: an experiment in Peru shows texture at 3 m gives the best results","authors":"Chiara Aquino, Edward T. A. Mitchard, Iain M. McNicol, Harry Carstairs, Andrew Burt, Beisit L. P. Vilca, Sylvia Mayta, Mathias Disney","doi":"10.1002/rse2.414","DOIUrl":"https://doi.org/10.1002/rse2.414","url":null,"abstract":"Selective logging is known to be widespread in the tropics, but is currently very poorly mapped, in part because there is little quantitative data on which satellite sensor characteristics and analysis methods are best at detecting it. To improve this, we used data from the Tropical Forest Degradation Experiment (FODEX) plots in the southern Peruvian Amazon, where different numbers of trees had been removed from four plots of 1 ha each, carefully inventoried by hand and terrestrial laser scanning before and after the logging to give a range of biomass loss (∆AGB) values. We conducted a comparative study of six multispectral optical satellite sensors at 0.3–30 m spatial resolution, to find the best combination of sensor and remote sensing indicator for change detection. Spectral reflectance, the normalised difference vegetation index (NDVI) and texture parameters were extracted after radiometric calibration and image preprocessing. The strength of the relationships between the change in these values and field‐measured ∆AGB (computed in % ha<jats:sup>−1</jats:sup>) was analysed. The results demonstrate that: (a) texture measures correlates more with ∆AGB than simple spectral parameters; (b) the strongest correlations are achieved for those sensors with spatial resolutions in the intermediate range (1.5–10 m), with finer or coarser resolutions producing worse results, and (c) when texture is computed using a moving square window ranging between 9 and 14 m in length. Maps predicting ∆AGB showed very promising results using a NIR‐derived texture parameter for 3 m resolution PlanetScope (<jats:italic>R</jats:italic><jats:sup>2</jats:sup> = 0.97 and root mean square error (RMSE) = 1.91% ha<jats:sup>−1</jats:sup>), followed by 1.5 m SPOT‐7 (<jats:italic>R</jats:italic><jats:sup>2</jats:sup> = 0.76 and RMSE = 5.06% ha<jats:sup>−1</jats:sup>) and 10 m Sentinel‐2 (<jats:italic>R</jats:italic><jats:sup>2</jats:sup> = 0.79 and RMSE = 4.77% ha<jats:sup>−1</jats:sup>). Our findings imply that, at least for lowland Peru, low‐medium intensity disturbance can be detected best in optical wavelengths using a texture measure derived from 3 m PlanetScope data.","PeriodicalId":21132,"journal":{"name":"Remote Sensing in Ecology and Conservation","volume":null,"pages":null},"PeriodicalIF":5.5,"publicationDate":"2024-07-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141862350","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Cate Ryan, Hannah L. Buckley, Craig D. Bishop, Graham Hinchliffe, Bradley C. Case
Coastal active dunes provide vital biodiversity, habitat, and ecosystem services, yet they are among the most endangered and understudied ecosystems worldwide. Therefore, monitoring the status of these systems is essential, but field vegetation surveys are time‐consuming and expensive. Remotely sensed aerial imagery offers spatially continuous, low‐cost, high‐resolution coverage, allowing for vegetation mapping across larger areas than traditional field surveys. Taking Aotearoa New Zealand as a case study, we used a nationally representative sample of coastal active dunes to classify vegetation from red‐green‐blue (RGB) high‐resolution (0.075–0.75 m) aerial imagery with object‐based image analysis. The mean overall accuracy was 0.76 across 21 beaches for aggregated classes, and key cover classes, such as sand, sandbinders, and woody vegetation, were discerned. However, differentiation among woody vegetation species on semi‐stable and stable dunes posed a challenge. We developed a national cover typology from the classification, comprising seven vegetation types. Classification tree models showed that, where human activity was higher, it was more important than geomorphic factors in influencing the relative percent cover of the different active dune cover classes. Our methods provide a quantitative approach to characterizing cover classes on active dunes at a national scale, which is relevant for conservation management, including habitat mapping and determining species occupancy, indigenous dominance, and the representativeness of remaining active dunes.
{"title":"Quantifying vegetation cover on coastal active dunes using nationwide aerial image analysis","authors":"Cate Ryan, Hannah L. Buckley, Craig D. Bishop, Graham Hinchliffe, Bradley C. Case","doi":"10.1002/rse2.410","DOIUrl":"https://doi.org/10.1002/rse2.410","url":null,"abstract":"Coastal active dunes provide vital biodiversity, habitat, and ecosystem services, yet they are one of the most endangered and understudied ecosystems worldwide. Therefore, monitoring the status of these systems is essential, but field vegetation surveys are time‐consuming and expensive. Remotely sensed aerial imagery offers spatially continuous, low‐cost, high‐resolution coverage, allowing for vegetation mapping across larger areas than traditional field surveys. Taking Aotearoa New Zealand as a case study, we used a nationally representative sample of coastal active dunes to classify vegetation from red‐green‐blue (RGB) high‐resolution (0.075–0.75 m) aerial imagery with object‐based image analysis. The mean overall accuracy was 0.76 across 21 beaches for aggregated classes, and key cover classes, such as sand, sandbinders, and woody vegetation, were discerned. However, differentiation among woody vegetation species on semi‐stable and stable dunes posed a challenge. We developed a national cover typology from the classification, comprising seven vegetation types. Classification tree models showed that where human activity was higher, it was more important than geomorphic factors in influencing the relative percent cover of the different active dune cover classes. Our methods provide a quantitative approach to characterizing the cover classes on active dunes at a national scale, which are relevant for conservation management, including habitat mapping, determining species occupancy, indigenous dominance, and the representativeness of remaining active dunes.","PeriodicalId":21132,"journal":{"name":"Remote Sensing in Ecology and Conservation","volume":null,"pages":null},"PeriodicalIF":5.5,"publicationDate":"2024-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141631631","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Mimi Arandjelovic, Colleen R. Stephens, Paula Dieguez, Nuria Maldonado, Gaëlle Bocksberger, Marie‐Lyne Després‐Einspenner, Benjamin Debetencourt, Vittoria Estienne, Ammie K. Kalan, Maureen S. McCarthy, Anne‐Céline Granjon, Veronika Städele, Briana Harder, Lucia Hacker, Anja Landsmann, Laura K. Lynn, Heidi Pfund, Zuzana Ročkaiová, Kristeena Sigler, Jane Widness, Heike Wilken, Antonio Buzharevski, Adeelia S. Goffe, Kristin Havercamp, Lydia L. Luncz, Giulia Sirianni, Erin G. Wessling, Roman M. Wittig, Christophe Boesch, Hjalmar S. Kühl
As camera trapping grows in popularity and application, some analytical limitations persist, including processing time and the accuracy of data annotation. Camera traps typically record still images, although videos are increasingly being collected even though they require much more time to annotate. To overcome limitations with image annotation, camera trap studies are increasingly linked to community science (CS) platforms. Here, we extend previous work on CS image annotations to camera trap videos from a challenging environment: a dense tropical forest with low visibility and high occlusion due to thick canopy cover and bushy undergrowth at camera level. Using the CS platform Chimp&See, established for classification of 599 956 video clips from tropical Africa, we assess annotation precision and accuracy by comparing classifications of 13 531 1‐min video clips by a professional ecologist (PE) with output from 1744 registered, as well as unregistered, Chimp&See community scientists. We considered 29 classification categories, including 17 species and 12 higher‐level categories in which phenotypically similar species were grouped. Overall, annotation precision was 95.4%, which increased to 98.2% when similar species were aggregated into groups. Our findings demonstrate the competence of community scientists working with camera trap videos from even challenging environments and hold great promise for future studies on animal behaviour, species interaction dynamics and population monitoring.
{"title":"Highly precise community science annotations of video camera‐trapped fauna in challenging environments","authors":"Mimi Arandjelovic, Colleen R. Stephens, Paula Dieguez, Nuria Maldonado, Gaëlle Bocksberger, Marie‐Lyne Després‐Einspenner, Benjamin Debetencourt, Vittoria Estienne, Ammie K. Kalan, Maureen S. McCarthy, Anne‐Céline Granjon, Veronika Städele, Briana Harder, Lucia Hacker, Anja Landsmann, Laura K. Lynn, Heidi Pfund, Zuzana Ročkaiová, Kristeena Sigler, Jane Widness, Heike Wilken, Antonio Buzharevski, Adeelia S. Goffe, Kristin Havercamp, Lydia L. Luncz, Giulia Sirianni, Erin G. Wessling, Roman M. Wittig, Christophe Boesch, Hjalmar S. Kühl","doi":"10.1002/rse2.402","DOIUrl":"https://doi.org/10.1002/rse2.402","url":null,"abstract":"As camera trapping grows in popularity and application, some analytical limitations persist including processing time and accuracy of data annotation. Typically images are recorded by camera traps although videos are becoming increasingly collected even though they require much more time for annotation. To overcome limitations with image annotation, camera trap studies are increasingly linked to community science (CS) platforms. Here, we extend previous work on CS image annotations to camera trap videos from a challenging environment; a dense tropical forest with low visibility and high occlusion due to thick canopy cover and bushy undergrowth at the camera level. Using the CS platform Chimp&See, established for classification of 599 956 video clips from tropical Africa, we assess annotation precision and accuracy by comparing classification of 13 531 1‐min video clips by a professional ecologist (PE) with output from 1744 registered, as well as unregistered, Chimp&See community scientists. We considered 29 classification categories, including 17 species and 12 higher‐level categories, in which phenotypically similar species were grouped. Overall, annotation precision was 95.4%, which increased to 98.2% when aggregating similar species groups together. Our findings demonstrate the competence of community scientists working with camera trap videos from even challenging environments and hold great promise for future studies on animal behaviour, species interaction dynamics and population monitoring.","PeriodicalId":21132,"journal":{"name":"Remote Sensing in Ecology and Conservation","volume":null,"pages":null},"PeriodicalIF":5.5,"publicationDate":"2024-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141452967","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Daire Carroll, Eduardo Infantes, Eva V. Pagan, Karin C. Harding
Body mass is a fundamental indicator of animal health, closely linked to survival and reproductive success. Systematic assessment of body mass for a large proportion of a population can allow early detection of changes likely to impact population growth, facilitating responsive management and a mechanistic understanding of ecological trends. One challenge with integrating body mass assessment into monitoring is sampling enough animals to detect trends and account for individual variation. Harbour seals (Phoca vitulina) are philopatric marine mammals responsive to regional environmental changes, resulting in their use as an indicator species. We present a novel method for the non‐invasive and semi‐automatic assessment of harbour seal body condition using unoccupied aerial vehicles (UAVs/drones). Morphological parameters are automatically measured in georeferenced images and used to estimate volume, which is then translated to estimated mass. Remote observations of known individuals are used to calibrate the method. We achieve a high level of accuracy (mean absolute error of 4.5 kg, or 10.5%, for all seals and 3.2 kg, or 12.7%, for pups‐of‐the‐year). We systematically apply the method to wild seals during the spring pupping season and autumn over 2 years, achieving a near‐population‐level assessment for pups on land (82.5% measured). With reference to previous mark‐recapture work linking autumn pup weights to survival, we estimate the expected probability of over‐winter survival (mean = 0.89, standard deviation = 0.08). This work marks a significant step forward for the non‐invasive assessment of body condition in pinnipeds and could provide daily estimates of body mass for thousands of individuals. It can act as an early warning of deteriorating environmental conditions and be utilized as an integrative tool for wildlife monitoring. It also enables estimation of yearly variation in demographic rates, which can be used to parameterize models of population growth with relevance for conservation and evolutionary biology.
{"title":"Approaching a population‐level assessment of body size in pinnipeds using drones, an early warning of environmental degradation","authors":"Daire Carroll, Eduardo Infantes, Eva V. Pagan, Karin C. Harding","doi":"10.1002/rse2.413","DOIUrl":"https://doi.org/10.1002/rse2.413","url":null,"abstract":"Body mass is a fundamental indicator of animal health closely linked to survival and reproductive success. Systematic assessment of body mass for a large proportion of a population can allow early detection of changes likely to impact population growth, facilitating responsive management and a mechanistic understanding of ecological trends. One challenge with integrating body mass assessment into monitoring is sampling enough animals to detect trends and account for individual variation. Harbour seals (<jats:italic>Phoca vitulina</jats:italic>) are philopatric marine mammals responsive to regional environmental changes, resulting in their use as an indicator species. We present a novel method for the non‐invasive and semi‐automatic assessment of harbour seal body condition, using unoccupied aerial vehicles (UAVs/drones). Morphological parameters are automatically measured in georeferenced images and used to estimate volume, which is then translated to estimated mass. Remote observations of known individuals are utilized to calibrate the method. We achieve a high level of accuracy (mean absolute error of 4.5 kg or 10.5% for all seals and 3.2 kg or 12.7% for pups‐of‐the‐year). We systematically apply the method to wild seals during the Spring pupping season and Autumn over 2 years, achieving a near‐population‐level assessment for pups on land (82.5% measured). With reference to previous mark‐recapture work linking Autumn pup weights to survival, we estimate mean expected probability of over‐winter survival (mean = 0.89, standard deviation = 0.08). This work marks a significant step forward for the non‐invasive assessment of body condition in pinnipeds and could provide daily estimates of body mass for thousands of individuals. It can act as an early warning for deteriorating environmental conditions and be utilized as an integrative tool for wildlife monitoring. It also enables estimation of yearly variation in demographic rates which can be utilized in parameterizing models of population growth with relevance for conservation and evolutionary biology.","PeriodicalId":21132,"journal":{"name":"Remote Sensing in Ecology and Conservation","volume":null,"pages":null},"PeriodicalIF":5.5,"publicationDate":"2024-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141452988","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"环境科学与生态学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}