Tri-trophic interactions of soil mite Sancassania polyphyllae (Acari: Acaridae) with fungal biocontrol agents
Mustapha Touray, Harun Cimen, Ibrahim Cakmak, Selcuk Hazir
Ecosphere 16(12), published 2025-12-06. DOI: 10.1002/ecs2.70469

Mycophagous invertebrates can significantly impact the efficacy of fungal biocontrol agents, yet the interaction between these agents and Sancassania polyphyllae (Acari: Acaridae), commonly found in soil ecosystems, remains poorly understood. Our study demonstrates that Sa. polyphyllae mites feed on fungus-infected insect cadavers as well as on the mycelium and spores of Trichoderma afroharzianum and Metarhizium brunneum in pure culture. Mite feeding activity was greater on Trichoderma than on Metarhizium pure cultures, possibly because of Metarhizium's acaricidal effects on mite activity. Furthermore, mite feeding on fungus-infected insect cadavers caused visible damage to the integument. This feeding behavior significantly impacted fungal sporulation, a key factor in biocontrol efficacy. In both the M. brunneum-infected and the Tr. afroharzianum-infected Galleria groups, mite numbers increased over time, peaking around 9–11 days post-infection before slightly declining or plateauing. Notably, the fungus-infected insect tissue consistently supported significantly higher mite numbers than the pure-culture group at several time points. In dual-culture assays, Sa. polyphyllae mites preferentially fed on Fusarium oxysporum over Tr. afroharzianum. The presence of Fusarium may influence mite behavior and potentially reduce their impact on Trichoderma. Consequently, Trichoderma's suppression of Fusarium in soil could significantly affect the food resources available to soil-dwelling mites such as Sa. polyphyllae. Further research is needed to determine the nutritional basis of this feeding preference.
Keeping logs on the past: Log driving tells the story of fire regimes in pine forests of eastern Canada
Julie-Pascale Labrecque-Foy, Marc-André Lemay, Fabio Gennaretti, Dominique Arseneault, Miguel Montoro Girona
Ecosphere 16(12), published 2025-12-06. DOI: 10.1002/ecs2.70473

In North America, forest ecosystems have changed drastically since European settlement due to logging, land-use changes, and altered disturbance regimes. For example, red and white pine stands declined significantly in the last three centuries, a decline attributed to their extensive harvesting during settlement. Human-induced changes in fire regime are another probable cause of pine forest decline that has gained attention in recent decades. However, the study of red and white pine forests can be challenging, because few pre-settlement pine forests remain today, as they were extensively harvested during the 19th century. During this extensive exploitation of pine forests, logs were transported via log driving, and many of them sank to the bottom of lakes. These sinker logs represent an opportunity to study pre-settlement pine forests and their natural disturbance regimes. The aim of this research was to reconstruct fire regimes from the pre-settlement period to the late 20th century (1700–1970) in eastern Canadian pine forests. To achieve this goal, 1151 submerged logs were extracted from lakes in the Témiscamingue region (Québec), 60 of which exhibited fire scars. We built a reference chronology using 140 living pines to cross-date 81 scars and were able to reconstruct fire activity since 1717. We then modeled the relative probability of fire occurrence across settlement periods using a Bayesian approach. Our results showed that the probability of fire occurrence almost doubled following the beginning of settlement (1840), highlighting the impact of intensified logging and land conversion on fire frequency. Our study is among the first to use sinker logs and a Bayesian approach to reconstruct and model preindustrial fire regimes in pine forests. This new knowledge is crucial for developing sustainable forest management practices and conservation strategies in red and white pine forests in North America.
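The abstract describes modeling the relative probability of fire occurrence across settlement periods with a Bayesian approach, but gives no model details. As a purely illustrative sketch (not the authors' actual model), the idea can be shown with conjugate Beta–Binomial updating of an annual fire-occurrence probability before and after 1840; the fire-year counts below are invented for illustration:

```python
import numpy as np
from scipy import stats

def fire_posterior(fire_years, total_years, a0=1.0, b0=1.0):
    """Posterior Beta distribution for the annual probability of a
    fire-scar year, starting from a uniform Beta(1, 1) prior."""
    return stats.beta(a0 + fire_years, b0 + total_years - fire_years)

# Hypothetical counts, not the study's data.
pre = fire_posterior(fire_years=8, total_years=123)    # 1717-1839
post = fire_posterior(fire_years=17, total_years=131)  # 1840-1970

print(f"pre-settlement posterior mean:  {pre.mean():.3f}")
print(f"post-settlement posterior mean: {post.mean():.3f}")

# Monte Carlo estimate of P(post-settlement rate > pre-settlement rate).
rng = np.random.default_rng(0)
p_greater = np.mean(post.rvs(100_000, random_state=rng)
                    > pre.rvs(100_000, random_state=rng))
print(f"P(post > pre) ~ {p_greater:.3f}")
```

With these made-up counts the posterior mean roughly doubles across the 1840 boundary, mirroring the qualitative result reported above; the published analysis is more sophisticated than this two-period sketch.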
Land use and soil characteristics are associated with increased risk of treponeme-associated hoof disease in elk
Steven N. Winter, Glen A. Sargeant, Margaret A. Wild, Erin Clancey, Kathryn P. Huyvaert, Kyle Garrison, Pilar Fernandez
Ecosphere 16(12), published 2025-12-04. DOI: 10.1002/ecs2.70470

Environments can shape the occurrence and extent of disease outbreaks in wildlife. We studied the effects of environmental features on the occurrence of treponeme-associated hoof disease (TAHD), an emerging infectious disease of free-ranging elk (Cervus canadensis), in southwestern Washington, USA. During the 2016–2022 harvest seasons, successful elk hunters returned mandatory harvest reports and noted the presence or absence of hoof abnormalities indicative of TAHD. We used generalized linear models and an information-theoretic approach to model selection to relate (1) the spatial distribution of hoof abnormalities to features of landscapes (land cover, topography, and soil characteristics) and (2) the temporal distribution of hoof abnormalities to precipitation during the year preceding the harvest season. The probability of hoof disease increased with soil clay content and proportion of agricultural land (88% of model weight). We found no conclusive evidence for an effect of precipitation on the occurrence of TAHD, but this could relate to relatively high annual precipitation (>140 cm) in the study area. Nevertheless, disease cases may have been negatively associated with precipitation during February–June (55% of model weight). Soils and land management practices may increase the risk of hoof disease by promoting the survival of pathogens that cause TAHD, the susceptibility of elk to infection, or the intensity of pathogen transmission among congregated elk. Focusing on areas where the risk of disease is greatest may facilitate the detection of TAHD during surveillance. Likewise, removing infected elk and dispersing uninfected elk from areas with the greatest risk of disease may enhance the effectiveness of efforts to reduce transmission. Because disease risk is modified by factors of hosts, pathogens, and environments, this study serves as an application of the epidemiological triad framework to better understand the ecology and epidemiology of an emerging infectious disease in wildlife.
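The analysis style described above (generalized linear models compared via information-theoretic model weights) can be sketched in miniature. This is a hypothetical illustration on simulated data, not the study's dataset or code; the predictor names (clay, ag) and effect sizes are invented:

```python
import numpy as np
from scipy.optimize import minimize

def fit_logistic(X, y):
    """Maximum-likelihood logistic regression; returns (coefficients, AIC)."""
    X = np.column_stack([np.ones(len(y)), X])  # prepend intercept column
    def nll(beta):
        z = X @ beta
        return np.sum(np.logaddexp(0.0, z) - y * z)  # stable -log-likelihood
    res = minimize(nll, np.zeros(X.shape[1]), method="BFGS")
    return res.x, 2 * X.shape[1] + 2 * res.fun  # AIC = 2k - 2 ln L

# Simulated presence/absence data standing in for the harvest reports.
rng = np.random.default_rng(42)
n = 500
clay = rng.normal(size=n)  # standardized soil clay content (hypothetical)
ag = rng.normal(size=n)    # standardized proportion agricultural land
p = 1 / (1 + np.exp(-(-1.0 + 0.8 * clay + 0.5 * ag)))
y = rng.binomial(1, p)

candidates = {
    "clay + ag": np.column_stack([clay, ag]),
    "clay only": clay[:, None],
    "ag only": ag[:, None],
}
aics = {name: fit_logistic(X, y)[1] for name, X in candidates.items()}

# Akaike weights: relative support for each model within the candidate set.
deltas = {k: v - min(aics.values()) for k, v in aics.items()}
total = sum(np.exp(-d / 2) for d in deltas.values())
weights = {k: float(np.exp(-d / 2) / total) for k, d in deltas.items()}
print(weights)
```

The "88% of model weight" figures in the abstract are Akaike weights of this kind, summarizing how strongly the data favor models containing a given predictor.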
Seed mix performance and species network roles as a framework to select candidate key resources for pollinator habitat
Uta Müller, Katherine Borchardt, Anna Britzman, Neal M. Williams
Ecosphere 16(12), published 2025-12-03. DOI: 10.1002/ecs2.70492

Current native bee declines have been attributed in part to loss of habitat and floral resources. Mitigation approaches include trying to establish or enhance bee habitat by planting wildflowers, often using seed mixes. Most assessments of plant performance in pollinator seed mixes are based on the abundance and diversity of visitors they support; these metrics, however, may miss key elements of community interactions, such as supporting specialized links within plant–pollinator networks or shifts in core network position across seasons. Therefore, the selection of "candidate key pollinator resources" could be usefully extended to include assessment of species network roles. In two different experiments, we independently assessed the performance of a set of 28 native California wildflowers, first in seed mix applications over three years and second in standardized mono-specific plantings of the same plant species. Within seed mixes, plant species' performance according to maximum floral area and phenological coverage clearly differed, with certain taxa found to dominate, establish only short term, or not be competitive at all. Only a small set of five species persisted with high performance over the full three years. Based on mono-specific plot studies quantifying species network roles, as opposed to simply abundance–diversity metrics, we identified network core species and species supporting ecologically specialized pollinators. Both species roles are characteristic of natural plant–pollinator networks and therefore could present key criteria for choosing plants with the aim of restoring mutualistic plant–pollinator networks with wildflower habitat plantings. Plant species' network roles changed across seasons within a year, arguing for the inclusion of seasonality when choosing candidate key pollinator resources for plant mixes. Only a subset of species in networks proved successful in seed mix applications where plants needed to perform under competition. The results emphasize the need to evaluate wildflower plantings and to prioritize species that perform highly on multiple criteria in current and future applications.
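One common way to quantify a plant's core position in a visitation network is its species strength: the summed dependence of each pollinator on that plant (Bascompte et al. 2006). The abstract does not specify which network metrics were used, so the following is only an illustrative sketch on an invented visitation matrix:

```python
import numpy as np

# Rows = plant species, columns = pollinator species; entries = visit counts.
# The matrix is invented for illustration.
visits = np.array([
    [30, 12,  0, 5],
    [ 2,  8,  1, 0],
    [ 0,  4, 20, 9],
])

# Dependence of pollinator j on plant i: share of j's visits made to plant i.
dependence = visits / visits.sum(axis=0, keepdims=True)

# Species strength of plant i: sum of dependences across all pollinators.
# High strength indicates a core network position.
strength = dependence.sum(axis=1)
print(strength)  # one value per plant
```

Strengths across plants always sum to the number of pollinator species, since each pollinator's dependences sum to 1; the metric therefore rewards plants that many pollinators rely on heavily rather than plants that merely receive many visits.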
Pre-fire structure drives variability in post-fire aboveground carbon and fuel profiles in wet temperate forests
Jenna E. Morris, Madison M. Laughlin, Liliana K. Rangel-Parra, Daniel C. Donato, Joshua S. Halofsky, David E. Butman, Brian J. Harvey
Ecosphere 16(12), published 2025-12-03. DOI: 10.1002/ecs2.70479

Biological legacies (i.e., materials that persist following disturbance; "legacies") shape ecosystem functioning and feedbacks to future disturbances, yet how legacies are driven by pre-disturbance ecosystem state and disturbance severity is poorly understood—especially in ecosystems influenced by infrequent and severe disturbances. Focusing on wet temperate forests as an archetype of these ecosystems, we characterized live and dead aboveground biomass 2–5 years post-fire in western Washington and northwestern Oregon, USA, to ask: How do pre-fire stand age (i.e., pre-disturbance ecosystem state) and burn severity drive variability in initial post-fire legacies, specifically (1) aboveground biomass carbon and (2) fuel profiles? Dominant drivers of post-fire legacies varied by response variable, with pre-disturbance ecosystem state driving total legacy amounts and disturbance severity moderating legacy condition. Total post-fire carbon was ~3–4 times greater in mid- and late-seral stands compared to young stands. In unburned and low-severity fire stands, >70% of post-fire total carbon was live, and canopy fuel profiles were largely indistinguishable, suggesting greater continuity of structure and function following low-severity fire. Conversely, in high-severity stands, >95% of post-fire total carbon was dead and sparse canopy fuel remained. Regardless of burn severity, most biomass present pre-fire persisted following fire, suggesting high-carbon pre-fire stands lead to high-carbon post-fire stands (and vice versa). Persistence of legacy biomass in high-severity stands, even as it decays, will therefore buffer total ecosystem carbon storage as live carbon recovers over time. Further, all burned stands had considerable production of black carbon in charred wood biomass, which can support ecosystem functioning and promote long-term carbon storage. Initial post-fire fuel profiles are likely sufficient to support fire in all stands, but reburn potential may be greater in high-severity stands due to rapid regeneration of flammable live surface vegetation and more exposed microclimatic conditions. Effects of fuel reduction from fire on mediating the occurrence and potential behavior of subsequent fires in high-productivity systems therefore appear short-lived. Our findings demonstrate the importance of pre-disturbance ecosystem state in dictating many aspects of initial post-disturbance structure and function, with important implications for managing post-fire recovery trajectories in some of Earth's most productive and high-biomass forests.
Pond frog as a predator of hornet workers: High tolerance to venomous stings
Shinji Sugiura
Ecosphere 16(12), published 2025-12-03. DOI: 10.1002/ecs2.70457

Some animals use stingers to repel attackers, and some predators have evolved tolerance to such stings, enabling them to consume venomous prey. For example, social wasps, such as hornets, use modified ovipositors as venomous stingers to inject venom, which can cause intense pain in humans. The world's largest hornet, Vespa mandarinia (Hymenoptera: Vespidae), stores huge amounts of venom in its abdomen, which can kill mammals. Although some animals are known to prey on adult hornets, it remains unclear whether these predators can avoid or tolerate their venomous stings. Adult hornets have been found in the stomach contents of some amphibian predators, including the pond frog Pelophylax nigromaculatus (Anura: Ranidae), suggesting that they can successfully attack and consume hornets. To examine whether frogs avoid or tolerate hornet stings, the pond frog P. nigromaculatus was experimentally presented with stinging females (workers) of three Japanese hornet (Vespa) species—V. simillima, V. analis, and V. mandarinia—under laboratory conditions. Almost all frogs attacked the hornets, and the hornets were observed stinging the frogs during these attacks. However, 93%, 87%, and 79% of the frogs ultimately consumed V. simillima, V. analis, and V. mandarinia, respectively. Hornet stings neither killed nor harmed the frogs. These results suggest a high tolerance of pond frogs to the venomous and painful stings of giant hornets. Frogs may serve as useful model organisms for investigating the physiological mechanisms underlying the intense pain and lethal effects of hornet stings in vertebrates.
Food for fitness? Insights from 24 Norwegian moose populations for proactive monitoring and preventing overabundance
Hilde Karine Wam, Annika M. Felton, Adam Felton, Robert Spitzer, Märtha Wallgren
Ecosphere 16(12), 2025. doi:10.1002/ecs2.70476

Cervid (Cervidae) populations that are overabundant with respect to their food resources are expected to show declining physiological and reproductive fitness. A proactive solution to such declines is to integrate the monitoring of food resources with animal harvesting strategies, but few studies are available to guide managers on which food resources to monitor and how to do so. In this study, we used a large, rare data set that included detailed absolute measures of available food quantities and browsing intensity from field inventories to test their relationship with fitness indices of moose (Alces alces) in 24 management units in four regions across Norway. We found that calf body mass and calves seen per cow during the autumn hunt were strongly and positively related to the availability of tree forage, especially the species most selected for by the study moose (e.g., rowan [Sorbus aucuparia] and sallow [Salix caprea]). The strength of the correlations varied between regions, apparently being stronger where the moose were closer to being overabundant or had a legacy of past overabundance. As expected, the intensity of browsing on the three most common tree species, that is, birch (Betula spp.), rowan, and pine (Pinus sylvestris), was also strongly and negatively related to fitness. We discuss how our approach to food monitoring can facilitate management that proactively adjusts densities of moose, and possibly other cervids, to trends in food availability and browsing intensity, thereby avoiding the detrimental effects of overabundance.
Addled by antlers: Synchronous disruption to female caribou antler phenology
A. E. Love, J. A. Fox, J. G. Hendrix, S. Jackson, K. Ferraro, E. Vander Wal, Q. M. R. Webber
Ecosphere 16(12), 2025. doi:10.1002/ecs2.70484

Caribou are the only deer species in which females also grow antlers; however, female antler phenology differs from that of males. While males grow antlers over summer and hold them through the autumn rut, female caribou grow antlers in summer and carry the polished antlers through winter until parturition. The pathways associated with antler casting in females are thought to be related to hormonal changes during parturition. Yet not every female caribou is pregnant every year, and nonpregnant individuals must use an alternative cue to trigger antler casting. While populations are typically consistent in antler phenology, there are observations of individual caribou on an alternate phenological timeline, casting antlers one month prior to parturition and growing velvety antlers a few inches long during the calving period. This alternate timeline is generally observed in barren females and only in a small proportion of the population. Here we report observations of a population-wide change in antler phenology on Fogo Island, Newfoundland. We observed several individuals growing velvety antlers during calving season, earlier than in previous years, including two individuals confirmed to have been pregnant. Additionally, we observed no female caribou with polished antlers from the previous autumn, although our field observations generally show that ~20% of the adult female caribou in this population carry antlers at this time. We discuss possible disruptions to this phenology, commenting specifically on environmental cues that differed from years with the more commonly observed phenology. We found evidence suggesting that years with early antler casting had earlier spring warming, as well as an earlier reduction in snow cover, compared with previous years. Our observations suggest that changing climate patterns may affect antler growth in female caribou, causing growth to coincide with the increased nutritional and energetic demands of gestation and lactation. Specifically, earlier warming may shift the trigger for antler growth in pregnant individuals from a hormonal cue during pregnancy to an environmental cue more commonly used by barren individuals. We highlight potential research questions surrounding the resilience of antler phenology and the potential fitness consequences of disruptions to antler growth.
Phenology, food webs, and fish: The effects of shifted ice phenology across multiple trophic levels
Christopher I. Rounds, John Manske, Zachary S. Feiner, Jake R. Walsh, Catherine Polik, Gretchen J. A. Hansen
Ecosphere 16(12), 2025. doi:10.1002/ecs2.70472

Winter plays a crucial role in structuring temperate aquatic ecosystems, influencing the timing of multiple processes across trophic levels, including phytoplankton blooms, zooplankton production, and the reproduction and recruitment of fish species like walleye (Sander vitreus). Climate change is altering spring phenology in these ecosystems, leading to earlier and more variable ice breakup dates. Such disruptions can cause trophic mismatches, with potential consequences across trophic levels. In this study, we used generalized additive mixed models to analyze long-term monitoring data on lake ice-off, phytoplankton abundance (1984–2024), zooplankton abundance (1984–2024), walleye spawn timing (1939–2024), and walleye recruitment and abundance (1984–2022), examining how anomalous ice-off affects the phenology of primary producers, zooplankton, and walleye, as well as walleye abundance at multiple life stages. Our findings show that anomalously early ice-off leads to earlier peak blooms of diatom, dinoflagellate, and chrysophyte phytoplankton, as well as earlier peaks in the abundance of cyclopoid, Daphnia, and Diaphanosoma zooplankton. Additionally, we observed shifts in the timing of walleye spawning and lower walleye abundance across life stages associated with early ice-off. Phytoplankton phenology tracked anomalously early ice-off, whereas zooplankton phenology and walleye spawn timing did not. Early ice-off led to a 22% reduction in age-0 walleye recruitment and a 14% decrease in adult walleye abundance compared with the long-term average ice-off. Our results highlight the cascading effects of early ice-off on the spring phenology of temperate lakes, leading to altered food webs and decreased fish recruitment and abundance. This study underscores the importance of understanding phenological shifts in freshwater ecosystems, as they have significant implications for ecosystem management and the sustainability of fish populations in a changing climate.
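The preceding abstract reports walleye declines as signed percent differences relative to a long-term baseline. As a minimal sketch of that comparison (not the authors' code; the numbers below are hypothetical, chosen only to reproduce the reported -22% and -14% figures):

```python
def percent_change(value: float, baseline: float) -> float:
    """Signed percent difference of `value` relative to `baseline`."""
    return 100.0 * (value - baseline) / baseline

# Hypothetical index values: baseline = long-term average (scaled to 100),
# the others = early ice-off years.
print(percent_change(78.0, 100.0))  # age-0 recruitment: -22.0
print(percent_change(86.0, 100.0))  # adult abundance: -14.0
```

This is illustrative arithmetic only; the study's actual estimates come from generalized additive mixed models fit to the monitoring data.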
Seabirds mediate intraguild and competitive interactions in a shark community
Chloé A. Blandino, Yannis P. Papastamatiou, Jonathan J. Dale, Carl G. Meyer
Ecosphere 16(12), 2025. doi:10.1002/ecs2.70486

Intraguild predation (IGP) and competition significantly influence resource use among sympatric species. The presence of alternative prey (consumed only by intraguild [IG] predators, not by IG prey) may promote coexistence and profoundly affect distribution patterns. Models predict that IG predator distributions should match alternative prey distributions when alternative prey are abundant, while IG prey should risk-match by selecting safe habitats. When alternative prey are scarce, coexistence may be facilitated by a more even distribution across habitats or by other mechanisms. Based on these models, the distribution of IG prey may be indirectly mediated by the alternative prey. French Frigate Shoals atoll, Hawaii, has a predator community that includes IG predators (tiger sharks), IG prey (gray reef sharks), and competitors (gray reef and Galapagos sharks; tiger and Galapagos sharks). Tiger sharks consume alternative prey (fledgling seabirds), which occur in high abundance in the summer. We used acoustic telemetry of 128 sharks to test predictions about habitat use. As predicted by the model, tiger sharks showed strong selection for islets where albatross fledge during the summer, whereas gray reef sharks avoided these areas and used other habitats. During the winter, tiger sharks used a broader range of habitats, and gray reef sharks made greater use of islets in lagoons. Galapagos sharks overlapped more with tiger sharks but also avoided the summer islets where birds were fledging. Seabirds thus partially mediate habitat use in a shark community through their influence on a likely keystone species, the tiger shark. Our study highlights the importance of alternative prey and asymmetrical IGP in driving space-use patterns of marine predators.