Rongrong Angkaew, Naruemon Tantipisanuh, Dusit Ngoprasert, Larkin A. Powell, Wich'yanan Limparungpatthanakij, Philip D. Round, George A. Gale
Effective conservation management relies on accessing and integrating various forms of evidence regarding the potential effects of management interventions. Here, we aim to identify key management options to enhance habitat suitability and mitigate threats for grassland and farmland birds in the Central Plains of Thailand, a key area for open-country birds in the region, using a Bayesian Belief Network (BBN) approach. We selected eight at-risk passerine landbird species as focal taxa and developed up to nine scenarios to assess the potential impacts on the area of available suitable habitat for each species under different management options: a status quo scenario depicting the current situation, a future scenario in which no action is taken, and up to seven scenarios, each applying one management option. Three options focused on improving and/or maintaining habitat suitability, and the other four targeted threat mitigation. We then sought the best combination of management options based on the results of these scenarios. The models predicted that each species would respond differently to each option depending on its ecological niche. If no action is taken in the near future, the highest quality habitats for all species were predicted to decrease from the current extent, with some species facing substantial habitat loss. For example, the globally Vulnerable Manchurian reed warbler Acrocephalus tangorum was predicted to lose nearly all of its highest suitability habitat (a 93% decline). The best conservation strategy involved implementing multiple management options, with tax incentives playing a particularly important role: they were the most effective measure for four species and the second most effective for the remaining four. Species-specific responses varied; two species required fewer interventions, while others needed multiple concurrent management strategies.
For instance, the highest suitability areas for the Manchurian reed warbler and Oriental skylark Alauda gulgula reached an asymptote when two management options were applied together, whereas species like the long-tailed shrike Lanius schach required four interventions simultaneously. Our study underscores the advantages of this BBN approach for prioritizing optimal management strategies before implementation. It is adaptable to various decision-making processes and can be applied to other species and agricultural systems, particularly those lacking baseline data.
"Exploring management strategies for open-country birds: A case study from a rice-dominated landscape." Ecosphere 16(12), 2025-12-09. https://doi.org/10.1002/ecs2.70499
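The scenario comparison described in this abstract can be illustrated with a toy Bayesian Belief Network evaluated by exhaustive enumeration. This is a minimal sketch only: the node names, conditional probability table, and scenario probabilities below are invented for illustration and are not values from the study.

```python
from itertools import product

# Toy network: a management scenario sets the marginal probabilities of
# "good vegetation" and "low threat"; habitat suitability depends on both.
# The conditional probability table (CPT) below is hypothetical.
SUITABILITY_CPT = {
    (True, True): 0.9,    # good vegetation, low threat
    (True, False): 0.5,
    (False, True): 0.4,
    (False, False): 0.1,
}

def p_suitable(p_veg_good, p_threat_low):
    """P(suitability = high), marginalizing over vegetation and threat states."""
    total = 0.0
    for veg, threat in product((True, False), repeat=2):
        p_veg = p_veg_good if veg else 1.0 - p_veg_good
        p_thr = p_threat_low if threat else 1.0 - p_threat_low
        total += SUITABILITY_CPT[(veg, threat)] * p_veg * p_thr
    return total

# Hypothetical scenarios, each shifting the parent-node probabilities.
scenarios = {
    "no_action": p_suitable(0.3, 0.4),
    "tax_incentive": p_suitable(0.7, 0.6),
    "combined_options": p_suitable(0.8, 0.9),
}
```

A real application would use a BBN library with expert-elicited CPTs; the point here is only that each management scenario is a different set of evidence propagated through the same network, and the resulting suitability probabilities can be ranked across scenarios.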
Habitat modification is the main threat to biodiversity, but other causes are involved, such as climate change or pollution. To optimize conservation efforts, it is crucial to determine their respective contributions. An effective approach to disentangling causalities is to set up comparative or experimental monitoring in the field. In a temperate forest classified as an integral nature reserve and left unexploited, tree growth led to canopy closure and the progressive disappearance of semi-open woodlands and associated thermophilic species. Past logging provided a semi-experimental setting with patches contrasting in canopy closure. A long-term capture–mark–recapture study of two snake species (1995–2021), one thermophilic and the other less so, revealed which demographic parameters were affected and established a link between habitat closure and population decline. For the most thermophilic species (Hierophis viridiflavus), rapid increases in canopy cover drastically reduced the survival probability of very young individuals (neonates and snakes a few months old), likely by depriving them of essential thermophilic prey (i.e., small lizards). Juveniles and adults, which feed mainly on rodents and less on thermophilic lizards, were less affected. For the least thermophilic species (Zamenis longissimus), in which all age groups feed on rodents, habitat closure had no major effect on survival. Importantly, open woodlands remained favorable for both species. Although habitat closure was detrimental to H. viridiflavus, we predict that the survival of Z. longissimus will also drop as forest closure approaches 100%. Canopy closure and the disappearance of shrubby habitats are the main factors responsible for the decline of forest snakes. A lack of management in temperate forests may not be the best option for maintaining healthy reptile populations.
G. Billy, C. Barbraud, X. Bonnet. "Contrasted effects of temperate forest closure on snake demography." Ecosphere 16(12), 2025-12-09. https://doi.org/10.1002/ecs2.70468
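Survival estimates from a capture–mark–recapture study of this kind typically rest on models such as the Cormack–Jolly–Seber (CJS) model. The sketch below computes the likelihood of a single capture history under the simplest constant-parameter CJS model, conditioning on first capture; the study's actual age-structured model is richer, and the parameter values used here are illustrative only.

```python
def cjs_history_prob(history, phi, p):
    """Probability of one capture history (sequence of 0/1 detections) under
    a Cormack-Jolly-Seber model with constant per-interval survival phi and
    per-occasion detection probability p, conditional on first capture."""
    first = history.index(1)
    tail = history[first + 1:]
    n = len(tail)
    total = 0.0
    # Sum over d = number of intervals the animal survived after first
    # capture (d == n means it survived through the whole study).
    for d in range(n + 1):
        prob = phi ** d * (1.0 if d == n else 1.0 - phi)
        for t, obs in enumerate(tail):
            alive = t < d
            if obs and not alive:      # detected while dead: impossible
                prob = 0.0
                break
            if alive:
                prob *= p if obs else 1.0 - p
        total += prob
    return total
```

For the two-occasion history (1, 1) this reduces to phi * p, and for (1, 0) to 1 - phi * p; the two sum to one, as a conditional likelihood must.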
Felix Deiß, Axel Ssymank, Steffen Caspari, Christoph Scherber
Europe's crucial pollinators, namely bees, hoverflies, butterflies, and moths, are experiencing strong population declines, increasingly placing them on the political agenda. Because of their long recording history, well-known biology, and suitability as indicators of anthropogenic pressures, Lepidoptera (butterflies and moths) have partially been used as surrogates for pollinators, forming an important part of future European Union (EU) pollinator monitoring. Before such monitoring is in place, designing effective conservation measures relies on a sufficient data basis. Here, focusing on Germany as a representative Central European country, we assessed the geographical coverage and accuracy of butterfly and moth data in a literature review. We (1) assess the spatial coverage of scientific Lepidoptera data regarding regions and landscape types in Germany, (2) test whether geographic data accuracy is influenced by a journal's ranking and requirements, and (3) identify temporal trends in geographic data accuracy. Our review yielded 183 peer-reviewed studies sampling Lepidoptera in Germany over the last 30 years. Our findings suggest a pronounced spatial bias, with a higher concentration of studies in southern and western federal states, while important landscape classes, such as non-irrigated arable land and pastures, were severely underrepresented. Furthermore, we demonstrate that only 38% of the studies provided accurate information on the location of their sampling sites, with gradual improvements over recent years. Our results offer insights into current shortcomings in the scientific data landscape of butterflies and moths, two major pollinator groups, providing important impulses for the implementation of an EU Pollinator Monitoring Scheme in light of the EU Nature Restoration Regulation.
"Lack of spatial coordinate information for an important insect order (Lepidoptera) in a Central European country." Ecosphere 16(12), 2025-12-09. https://doi.org/10.1002/ecs2.70502
Mustapha Touray, Harun Cimen, Ibrahim Cakmak, Selcuk Hazir
Mycophagous invertebrates can significantly impact the efficacy of fungal biocontrol agents, yet the interaction between these agents and Sancassania polyphyllae (Acari: Acaridae), commonly found in soil ecosystems, remains poorly understood. Our study demonstrates that Sa. polyphyllae mites feed on fungus-infected insect cadavers as well as on the mycelium and spores of Trichoderma afroharzianum and Metarhizium brunneum in pure culture. Mite feeding activity was greater on Trichoderma than on Metarhizium pure cultures, possibly because of Metarhizium's acaricidal effects, which reduced mite activity. Furthermore, mite feeding on fungus-infected insect cadavers caused visible damage to the integument. This feeding behavior significantly impacted fungal sporulation, a key factor in biocontrol efficacy. In both the M. brunneum-infected and the Tr. afroharzianum-infected Galleria groups, mite numbers increased over time, peaking around 9–11 days post-infection before slightly declining or plateauing. Notably, the fungus-infected insect tissue consistently supported significantly higher mite numbers than the pure-culture group at several time points. In dual-culture assays, Sa. polyphyllae mites preferentially fed on Fusarium oxysporum over Tr. afroharzianum; the presence of Fusarium may thus influence mite behavior and potentially reduce their impact on Trichoderma. Consequently, Trichoderma's suppression of Fusarium in soil could significantly affect the food resources available to soil-dwelling mites like Sa. polyphyllae. Further research is needed to determine the nutritional basis of this feeding preference.
"Tri-trophic interactions of soil mite Sancassania polyphyllae (Acari: Acaridae) with fungal biocontrol agents." Ecosphere 16(12), 2025-12-06. https://doi.org/10.1002/ecs2.70469
In North America, forest ecosystems have changed drastically since European settlement due to logging, land-use changes, and altered disturbance regimes. For example, red and white pine stands declined significantly over the last three centuries, a decline attributed to their extensive harvesting during settlement. Human-induced change in fire regimes is another probable cause of the decline of pine forests that has gained attention in recent decades. However, studying red and white pine forests can be challenging because few pre-settlement pine forests remain today, as they were extensively harvested during the 19th century. During this exploitation, logs were transported via log driving, and many of them sank to the bottom of lakes. These sinker logs represent an opportunity to study pre-settlement pine forests and their natural disturbance regimes. The aim of this research was to reconstruct fire regimes from the pre-settlement period to the late 20th century (1700–1970) in eastern Canadian pine forests. To achieve this goal, 1151 submerged logs were extracted from lakes in the Témiscamingue region (Québec), 60 of which exhibited fire scars. We built a reference chronology using 140 living pines to cross-date 81 scars and were able to reconstruct fire activity since 1717. We then modeled the relative probability of fire occurrence across settlement periods using a Bayesian approach. Our results showed that the probability of fire occurrence almost doubled following the beginning of settlement (1840), highlighting the impact of intensified logging and land conversion on fire frequency. Our study is among the first to use sinker logs and a Bayesian approach to reconstruct and model preindustrial fire regimes in pine forests. This new knowledge is crucial for developing sustainable forest management practices and conservation strategies in red and white pine forests of North America.
Julie-Pascale Labrecque-Foy, Marc-André Lemay, Fabio Gennaretti, Dominique Arseneault, Miguel Montoro Girona. "Keeping logs on the past: Log driving tells the story of fire regimes in pine forests of eastern Canada." Ecosphere 16(12), 2025-12-06. https://doi.org/10.1002/ecs2.70473
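A Bayesian comparison of fire-occurrence probability between settlement periods can be sketched with a conjugate Beta-Binomial model, treating each year as a Bernoulli trial for "fire recorded." The study's actual model is more elaborate, and the fire-year counts below are invented for illustration only.

```python
import random

def posterior_params(fires, years, a=1.0, b=1.0):
    """Beta posterior for the annual fire probability given `fires` fire
    years out of `years` total, under a Beta(a, b) prior (conjugate update)."""
    return a + fires, b + years - fires

# Invented counts (not the study's data): fire years reconstructed from
# scars before vs. after the onset of settlement (1840).
pre_a, pre_b = posterior_params(fires=8, years=140)
post_a, post_b = posterior_params(fires=17, years=130)

# Monte Carlo estimate of P(post-settlement rate > pre-settlement rate).
rng = random.Random(42)
draws = 20000
p_greater = sum(
    rng.betavariate(post_a, post_b) > rng.betavariate(pre_a, pre_b)
    for _ in range(draws)
) / draws
```

With these illustrative counts the posterior mean roughly doubles across the settlement boundary (about 0.063 to 0.136), and the posterior probability that the post-settlement rate exceeds the pre-settlement rate is high, mirroring the qualitative result reported in the abstract.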
Steven N. Winter, Glen A. Sargeant, Margaret A. Wild, Erin Clancey, Kathryn P. Huyvaert, Kyle Garrison, Pilar Fernandez
Environments can shape the occurrence and extent of disease outbreaks in wildlife. We studied the effects of environmental features on the occurrence of treponeme-associated hoof disease (TAHD), an emerging infectious disease of free-ranging elk (Cervus canadensis), in southwestern Washington, USA. During the 2016–2022 harvest seasons, successful elk hunters returned mandatory harvest reports and noted the presence or absence of hoof abnormalities indicative of TAHD. We used generalized linear models and an information-theoretic approach to model selection to relate (1) the spatial distribution of hoof abnormalities to features of landscapes (land cover, topography, and soil characteristics) and (2) the temporal distribution of hoof abnormalities to precipitation during the year preceding the harvest season. The probability of hoof disease increased with soil clay content and proportion of agricultural land (88% of model weight). We found no conclusive evidence for an effect of precipitation on the occurrence of TAHD, but this could relate to relatively high annual precipitation (>140 cm) in the study area. Nevertheless, disease cases may have been negatively associated with precipitation during February–June (55% of model weight). Soils and land management practices may increase the risk of hoof disease by promoting the survival of pathogens that cause TAHD, the susceptibility of elk to infection, or the intensity of pathogen transmission among elk when congregated. Focusing on areas where the risk of disease is greatest may facilitate the detection of TAHD during surveillance. Likewise, removing infected elk and dispersing uninfected elk from areas with the greatest risk of disease may enhance the effectiveness of efforts to reduce transmission. 
Built on the knowledge that disease risk is modified by host, pathogen, and environmental factors, this study applies the epidemiological triad framework to better understand the ecology and epidemiology of an emerging infectious disease in wildlife.
"Land use and soil characteristics are associated with increased risk of treponeme-associated hoof disease in elk." Ecosphere 16(12), 2025-12-04. https://doi.org/10.1002/ecs2.70470
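The "model weight" percentages quoted in this abstract come from information-theoretic model selection. Akaike weights express each candidate model's relative support and are computed from AIC differences to the best model; this is a generic sketch with hypothetical AIC values, not the study's code.

```python
import math

def akaike_weights(aics):
    """Akaike weights: each model's relative likelihood exp(-delta/2),
    normalized so the candidate set sums to 1, where delta is the AIC
    difference to the best (lowest-AIC) model."""
    best = min(aics)
    rel = [math.exp(-0.5 * (a - best)) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical AIC values for three candidate GLMs.
weights = akaike_weights([210.3, 214.3, 218.1])
```

A statement like "88% of model weight" then means the models containing a given predictor jointly account for 0.88 of these normalized weights.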
Uta Müller, Katherine Borchardt, Anna Britzman, Neal M. Williams
Current native bee declines have been attributed in part to the loss of habitat and floral resources. Mitigation approaches include establishing or enhancing bee habitat by planting wildflowers, often using seed mixes. Most assessments of plant performance in pollinator seed mixes are based on the abundance and diversity of the visitors they support, an approach that can overlook key elements of community interactions, such as supporting specialized links within plant–pollinator networks or shifts in core network position across seasons. Therefore, the selection of “candidate key pollinator resources” could usefully be extended to include an assessment of species' network roles. In two experiments, we independently assessed the performance of a set of 28 native California wildflowers, first in seed mix applications over three years and second in standardized mono-specific plantings of the same species. Within seed mixes, plant species' performance in terms of maximum floral area and phenological coverage clearly differed: certain taxa dominated, others established only in the short term, and some were not competitive at all. Only five species persisted with high performance over the full three years. Based on mono-specific plot studies quantifying species' network roles, as opposed to simple abundance-diversity metrics, we identified network core species and species supporting ecologically specialized pollinators. Both roles are characteristic of natural plant–pollinator networks and therefore could present key criteria for choosing plants with the aim of restoring mutualistic plant–pollinator networks through wildflower habitat plantings. Plant species' network roles changed over the seasons within a year, arguing for the inclusion of seasonality when choosing candidate key pollinator resources for plant mixes.
Only a subset of species in networks proved to be successful in seed mix applications where plants needed to perform under competition. The results emphasize the need to evaluate wildflower plantings and prioritize species of high performance for multiple criteria in current and future applications.
“Seed mix performance and species network roles as a framework to select candidate key resources for pollinator habitat.” Ecosphere 16(12), published 2025-12-03. https://doi.org/10.1002/ecs2.70492
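The network-role metrics described in the abstract above can be illustrated with a toy calculation. The sketch below uses hypothetical plant names and visit counts (not the study's data) to compute two common role metrics for each plant in a small plant–pollinator visitation matrix: normalized degree, and species strength sensu Bascompte et al. (2006), one way to flag candidate network-core species.

```python
# Toy plant-by-pollinator visitation counts (rows: plants, cols: pollinator
# species). All names and values are hypothetical, for illustration only.
plants = ["Phacelia", "Eschscholzia", "Lupinus"]
visits = [
    [30, 5, 0, 2],   # Phacelia
    [10, 0, 8, 0],   # Eschscholzia
    [0, 1, 0, 6],    # Lupinus
]

n_pollinators = len(visits[0])
# Column totals: each pollinator's total visits across all plants.
col_totals = [sum(row[j] for row in visits) for j in range(n_pollinators)]

# Normalized degree: fraction of pollinator species a plant interacts with.
degree = [sum(1 for v in row if v > 0) / n_pollinators for row in visits]

# Species strength: a plant's summed share of each pollinator's visits.
# High-strength plants tend to sit near the network core.
strength = [sum(row[j] / col_totals[j] for j in range(n_pollinators))
            for row in visits]

for name, d, s in zip(plants, degree, strength):
    print(f"{name}: degree={d:.2f}, strength={s:.2f}")
```

In this toy matrix the first plant both interacts with the most pollinator species and carries the largest share of their visits, so it would rank as the strongest core candidate; a dedicated package such as R's bipartite computes these and many related indices.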
Jenna E. Morris, Madison M. Laughlin, Liliana K. Rangel-Parra, Daniel C. Donato, Joshua S. Halofsky, David E. Butman, Brian J. Harvey
Biological legacies (i.e., materials that persist following disturbance; “legacies”) shape ecosystem functioning and feedbacks to future disturbances, yet how legacies are driven by pre-disturbance ecosystem state and disturbance severity is poorly understood, especially in ecosystems shaped by infrequent, severe disturbances. Focusing on wet temperate forests as an archetype of such ecosystems, we characterized live and dead aboveground biomass 2–5 years post-fire in western Washington and northwestern Oregon, USA, to ask: how do pre-fire stand age (i.e., pre-disturbance ecosystem state) and burn severity drive variability in initial post-fire legacies, specifically (1) aboveground biomass carbon and (2) fuel profiles? Dominant drivers of post-fire legacies varied by response variable: pre-disturbance ecosystem state drove total legacy amounts, while disturbance severity moderated legacy condition. Total post-fire carbon was ~3–4 times greater in mid- and late-seral stands than in young stands. In unburned and low-severity stands, >70% of post-fire total carbon was live, and canopy fuel profiles were largely indistinguishable, suggesting greater continuity of structure and function following low-severity fire. Conversely, in high-severity stands, >95% of post-fire total carbon was dead and sparse canopy fuel remained. Regardless of burn severity, most biomass present pre-fire persisted after fire, suggesting that high-carbon pre-fire stands lead to high-carbon post-fire stands (and vice versa). Persistence of legacy biomass in high-severity stands, even as it decays, will therefore buffer total ecosystem carbon storage as live carbon recovers over time. Further, all burned stands showed considerable production of black carbon in charred wood biomass, which can support ecosystem functioning and promote long-term carbon storage.
Initial post-fire fuel profiles are likely sufficient to support fire in all stands, but reburn potential may be greater in high-severity stands owing to rapid regeneration of flammable live surface vegetation and more exposed microclimatic conditions. The fuel-reduction effects of fire on the occurrence and potential behavior of subsequent fires in high-productivity systems therefore appear short-lived. Our findings demonstrate the importance of pre-disturbance ecosystem state in dictating many aspects of initial post-disturbance structure and function, with important implications for managing post-fire recovery trajectories in some of Earth's most productive and high-biomass forests.
“Pre-fire structure drives variability in post-fire aboveground carbon and fuel profiles in wet temperate forests.” Ecosphere 16(12), published 2025-12-03. https://doi.org/10.1002/ecs2.70479
Some animals use stingers to repel attackers, and some predators have evolved tolerance to such stings, enabling them to consume venomous prey. For example, social wasps, such as hornets, use modified ovipositors as venomous stingers to inject venom, which can cause intense pain in humans. The world's largest hornet, Vespa mandarinia (Hymenoptera: Vespidae), stores in its abdomen large amounts of venom that can kill mammals. Although some animals are known to prey on adult hornets, it remains unclear whether these predators avoid or tolerate their venomous stings. Adult hornets have been found in the stomach contents of some amphibian predators, including the pond frog Pelophylax nigromaculatus (Anura: Ranidae), suggesting that these predators can successfully attack and consume hornets. To examine whether frogs avoid or tolerate hornet stings, P. nigromaculatus was experimentally presented with stinging females (workers) of three Japanese hornet (Vespa) species, V. simillima, V. analis, and V. mandarinia, under laboratory conditions. Almost all frogs attacked the hornets, and the hornets were observed stinging the frogs during these attacks. Nevertheless, 93%, 87%, and 79% of the frogs ultimately consumed V. simillima, V. analis, and V. mandarinia, respectively, and hornet stings neither killed nor harmed the frogs. These results suggest a high tolerance of pond frogs to the venomous and painful stings of giant hornets. Frogs may serve as useful model organisms for investigating the physiological mechanisms underlying the intense pain and lethal effects of hornet stings in vertebrates.
Shinji Sugiura, “Pond frog as a predator of hornet workers: High tolerance to venomous stings.” Ecosphere 16(12), published 2025-12-03. https://doi.org/10.1002/ecs2.70457
Hilde Karine Wam, Annika M. Felton, Adam Felton, Robert Spitzer, Märtha Wallgren
Cervid (Cervidae) populations that are overabundant relative to their food resources are expected to show declining physiological and reproductive fitness. A proactive solution to such declines is to integrate monitoring of food resources with animal harvesting strategies, but few studies are available to guide managers on which food resources to monitor and how to do so. In this study, we used a large, rare data set that included detailed absolute measures of available food quantities and browsing intensity from field inventories to test their relationship with fitness indices of moose Alces alces in 24 management units across four regions of Norway. We found that calf body mass and calves seen per cow during the autumn hunt were strongly and positively related to the availability of tree forage, especially the species most selected by the study moose (e.g., rowan [Sorbus aucuparia] and sallow [Salix caprea]). The strength of the correlations varied between regions, apparently being stronger where moose were closer to overabundance or had a legacy of past overabundance. As expected, browsing intensity on the three most common tree species, birch (Betula spp.), rowan, and pine (Pinus sylvestris), was also strongly and negatively related to these fitness indices. We discuss how our approach to food monitoring can support management that proactively adjusts densities of moose, and possibly other cervids, to trends in food availability and browsing intensity, thereby avoiding the detrimental effects of overabundance.
“Food for fitness? Insights from 24 Norwegian moose populations for proactive monitoring and preventing overabundance.” Ecosphere 16(12), published 2025-12-03. https://doi.org/10.1002/ecs2.70476
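The core quantitative step in the moose study, relating a per-unit food-availability measure to a per-unit fitness index, can be sketched with a Pearson correlation. The numbers below are invented for illustration and are not the study's data; the function name `pearson_r` is likewise just a local helper.

```python
import math

# Hypothetical per-management-unit data (illustrative only): an index of
# selected tree forage availability (e.g., rowan and sallow biomass) and
# mean autumn calf body mass (kg) in each unit.
forage_index = [1.2, 2.5, 3.1, 0.8, 2.0, 3.6, 1.5, 2.8]
calf_mass_kg = [58.5, 64.3, 66.0, 57.5, 62.2, 68.1, 60.3, 65.2]

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# A strong positive r mirrors the reported food-fitness relationship;
# managers could track such an index alongside browsing intensity over time.
r = pearson_r(forage_index, calf_mass_kg)
print(f"Pearson r = {r:.3f}")
```

In practice the study fitted models per region rather than a single pooled correlation, so this sketch only conveys the shape of the relationship being tested, not the analysis itself.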