Disease is a key driver of community and ecosystem structure, especially when it strikes foundation species. In the widespread marine foundation species eelgrass (Zostera marina), outbreaks of wasting disease have caused large-scale meadow collapse in the past, and the causative pathogen, Labyrinthula zosterae, is commonly found in meadows globally. Research to date has mainly focused on abiotic environmental drivers of seagrass wasting disease, but there is strong evidence from other systems that biotic interactions such as herbivory can facilitate plant diseases. How biotic interactions influence seagrass wasting disease in the field is unknown but is potentially important for understanding dynamics of this globally valuable and declining habitat. Here, we investigated links between epifaunal grazers and seagrass wasting disease using a latitudinal field study across 32 eelgrass meadows distributed from southeastern Alaska to southern California. From 2019 to 2021, we conducted annual surveys to assess eelgrass shoot density, morphology, epifauna community, and the prevalence and lesion area of wasting disease infections. We integrated field data with satellite measurements of sea surface temperature and used structural equation modeling to test the magnitude and direction of possible drivers of wasting disease. Our results show that grazing by small invertebrates was associated with a 29% increase in prevalence of wasting disease infections and that both the prevalence and lesion area of disease increased with total epifauna abundances. Furthermore, these relationships differed among taxa; disease levels increased with snail (Lacuna spp.) and idoteid isopod abundances but were not related to abundance of ampithoid amphipods. This field study across 23° of latitude suggests a prominent role for invertebrate consumers in facilitating disease outbreaks with potentially large impacts on coastal seagrass ecosystems.
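The structural equation modeling described above decomposes drivers of disease into direct and indirect paths. As a minimal sketch of the piecewise-path idea, each path can be fit as a simple least-squares regression and an indirect effect obtained as the product of path coefficients; the variable names and site values below are hypothetical illustrations, not the study's data.

```python
# Minimal sketch of a piecewise path model in the spirit of structural
# equation modeling: each path is an ordinary least-squares regression.
# All data and variable names are hypothetical, not from the study.

def ols_slope(x, y):
    """Slope and intercept of a simple least-squares regression y ~ x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    return b, my - b * mx

# Hypothetical site-level data: sea surface temperature (deg C), grazing
# intensity (fraction of leaf area grazed), disease prevalence (fraction).
sst     = [10.1, 11.3, 12.0, 13.2, 14.5, 15.1]
grazing = [0.10, 0.14, 0.18, 0.22, 0.30, 0.33]
disease = [0.05, 0.08, 0.11, 0.13, 0.19, 0.21]

# Path 1: temperature -> grazing; Path 2: grazing -> disease prevalence.
b_tg, _ = ols_slope(sst, grazing)
b_gd, _ = ols_slope(grazing, disease)

# The indirect effect of temperature on disease, routed through grazing,
# is the product of the two path coefficients.
indirect = b_tg * b_gd
print(f"temperature->grazing: {b_tg:.3f}, grazing->disease: {b_gd:.3f}, "
      f"indirect effect: {indirect:.3f}")
```

Full SEM software additionally fits all paths jointly and evaluates overall model fit; the product-of-coefficients step shown here is only the core bookkeeping behind an indirect pathway.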
Temperate streams are subsidized by inputs of leaf litter that peak in fall. Yet stream communities decompose dead leaves and integrate their energy into the aquatic food web year-round. Most studies of stream decomposition overlook long-term trajectories, which must be understood for appropriate temporal upscaling of ecosystem processes. Using mesocosms, we quantified changes in carbon, nitrogen, and phosphorus content of three leaf species during decomposition at weekly to multi-month intervals for up to a year; then, we tested how decomposition duration affected subsequent consumption by a keystone amphipod macroinvertebrate. Over a year, nitrogen and phosphorus percentages increased across all leaf species, but only the recalcitrant species maintained initial levels of absolute nitrogen and phosphorus. Prolonged decomposition barely affected or impaired amphipod consumption of labile leaf species, whereas it enhanced feeding on the recalcitrant species. Overall, we demonstrate that recalcitrant leaves may serve as longer-lasting stored resources once labile species have been consumed, and that their increasing palatability over multi-month intervals of sustained decomposition may stabilize fluctuations in the rates at which leaf litter is integrated into aquatic food webs. This yearlong perspective highlights the relevance of slow-decomposing leaves for aquatic detrital communities.
How consumer diversity determines consumption efficiency is a central issue in ecology. In the context of predation and biological control, this relationship concerns predator diversity and predation efficiency. Reduced predation efficiency can result from different predator taxa eating each other in addition to their common prey (interference due to intraguild predation). By contrast, multiple predator taxa with overlapping but complementary feeding niches can generate increased predation efficiency on their common prey (enemy complementarity). When viewed strictly from an ecological perspective, intraguild predation and enemy complementarity are opposing forces. However, from an evolutionary ecology perspective, predators facing strong intraguild predation may evolve traits that reduce their predation risk, possibly leading to niche complementarity between enemies; thus, selection from intraguild predation may lead to enemy complementarity rather than opposing it. As specialized predators that live in or on their hosts, parasitoids are subjected to intraguild predation from generalist predators that consume the parasitoids' hosts. The degree to which parasitoid–predator interactions are ruled by interference versus enemy complementarity has been debated. Here, we address this issue with field experiments in a forest community consisting of multiple species of trees, herbivorous caterpillars, parasitoids, ants, and birds. Our experiments and analyses found no interference effects, but revealed clear evidence for complementarity between parasitoids and birds (not ants). Parasitism rates by hymenopterans and dipterans were negatively associated with bird predation risk, and the variation in the strength of this negative association suggests that this enemy complementarity was due to parasitoid avoidance of intraguild predation. 
We further argue that avoidance of intraguild predation by parasitoids and other arthropod predators may explain enigmatic patterns in vertebrate–arthropod–plant food webs in a variety of terrestrial ecosystems.
Priority effects, the effects of early-arriving species on late-arriving species, are caused by niche preemption and/or niche modification. The strength of priority effects can be determined by the extent of niche preemption and/or modification by the early-arriving species; however, the strength of priority effects may also be influenced by the late-arriving species, as some species may be better adapted to deal with niche preemption and/or modification. Therefore, some combinations of species will likely lead to stronger priority effects than others. We tested priority effects for all pairwise combinations of 15 plant species, including grasses, legumes, and nonleguminous forbs, by comparing simultaneous and sequential arrival orders in a 10-week controlled pot experiment. We did this using the competitive effect and response framework, quantifying the ability to suppress a neighbor as the competitive effect and the ability to tolerate a neighbor as the competitive response. We found that when arriving simultaneously, species that caused strong competitive effects also had weaker competitive responses. When arriving sequentially, species that caused strong priority effects when arriving early also had weaker responses to priority effects when arriving late. Among plant functional groups, legumes had the weakest response to priority effects. We also measured plant functional traits related to the plant economic spectrum, which were combined into a principal components analysis (PCA) whose first axis represented a conservative-to-acquisitive trait gradient. Using the PCA species scores, we showed that the traits of both the focal and the neighboring species determined the outcome of competition. Trait dissimilarities between the focal and neighboring species were more important when species arrived sequentially than when species arrived simultaneously.
Specifically, priority effects only became weaker when the late-arriving species was more acquisitive than the early-arriving species. Together, our findings show that traits and specifically the interaction of traits between species are more important in determining competition outcomes when species arrive sequentially (i.e., with priority effects present) than when arriving simultaneously.
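The competitive effect and response framework above can be illustrated with log response ratios of biomass, one common way (though not the only one) to quantify how strongly a focal plant is suppressed by a neighbor; the species biomasses below are hypothetical.

```python
from math import log

# Hedged sketch: quantifying the competitive response of a focal species
# as a log response ratio of biomass. Values are hypothetical.

def competitive_response(biomass_alone, biomass_with_neighbor):
    """ln(biomass with neighbor / biomass alone); more negative means
    stronger suppression experienced by the focal species."""
    return log(biomass_with_neighbor / biomass_alone)

# Focal-species dry biomass (g): grown alone, with a neighbor that
# arrived earlier (sequential), and with a simultaneously arriving one.
focal_alone = 12.0
focal_seq   = 4.5   # neighbor arrived first -> priority effect
focal_simul = 7.8   # both species arrived at the same time

resp_sequential   = competitive_response(focal_alone, focal_seq)
resp_simultaneous = competitive_response(focal_alone, focal_simul)

# The extra suppression under sequential arrival is one way to isolate
# the priority-effect component from ordinary competition.
priority_effect = resp_sequential - resp_simultaneous
print(f"response (sequential): {resp_sequential:.2f}")
print(f"response (simultaneous): {resp_simultaneous:.2f}")
print(f"priority-effect component: {priority_effect:.2f}")
```

A competitive effect would be computed the same way but from the neighbor's perspective, i.e., how much the focal species reduces the neighbor's biomass.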
To limit damage from insect herbivores, plants rely on a blend of defensive mechanisms that includes partnerships with beneficial microbes, particularly those inhabiting roots. While ample evidence exists for microbially mediated resistance responses that directly target insects through changing phytotoxin and volatile profiles, we know surprisingly little about the microbial underpinnings of plant tolerance. Tolerance defenses counteract insect damage via shifts in plant physiology that reallocate resources to fuel compensatory growth, improve photosynthetic efficiency, and reduce oxidative stress. Despite being a powerful mitigator of insect damage, tolerance remains an understudied realm of plant defenses. Here, we propose a novel conceptual framework that can be broadly applied across study systems to characterize microbial impacts on the expression of tolerance defenses. We conducted a systematic review of studies quantifying the impact of rhizosphere microbial inoculants on plant tolerance to herbivory based on several measures: biomass, oxidative stress mitigation, or photosynthesis. We identified 40 studies, most of which focused on chewing herbivores (n = 31) and plant growth parameters (e.g., biomass). Next, we performed a meta-analysis investigating the impact of microbial inoculants on plant tolerance to herbivory, measured via differences in plant biomass, and compared across key microbe, insect, and plant traits. Thirty-five papers comprising 113 observations were included in this meta-analysis, with effect sizes (Hedges' d) ranging from −4.67 (susceptible) to 18.38 (overcompensation). Overall, microbial inoculants significantly reduce the cost of herbivory via plant growth promotion, with overcompensation and compensation comprising 25% of observations of microbially mediated tolerance.
The grand mean effect size of 0.99 [0.49; 1.49] indicates that the addition of a microbial inoculant increased plant biomass by ~1 SD under herbivore stress, thus improving tolerance. This effect was influenced most by microbial attributes, including functional guild and total soil community diversity. Overall, the results highlight the need for additional investigation of microbially mediated plant tolerance, particularly for sap-feeding insects and across a more comprehensive range of tolerance mechanisms. Such attention would round out our current understanding of anti-herbivore plant defenses, offer insight into the underlying mechanisms that promote resilience to insect stress, and inform the application of microbial biotechnology to support sustainable agricultural practices.
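The Hedges' d effect size used in this meta-analysis is a standardized mean difference with a small-sample bias correction. A minimal sketch of the computation, on hypothetical biomass values rather than data from the reviewed studies:

```python
from math import sqrt

# Sketch of Hedges' d: the standardized mean difference between a
# treatment and a control group, with Hedges' small-sample correction.
# Sample values are hypothetical, not from the reviewed studies.

def hedges_d(treat, ctrl):
    n1, n2 = len(treat), len(ctrl)
    m1 = sum(treat) / n1
    m2 = sum(ctrl) / n2
    # Sample variances (denominator n - 1)
    v1 = sum((x - m1) ** 2 for x in treat) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in ctrl) / (n2 - 1)
    # Pooled standard deviation
    sp = sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    # Cohen's d, then the small-sample correction factor J
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)
    return d * j

# Hypothetical plant biomass (g) under herbivory, with vs. without a
# microbial inoculant.
inoculated = [14.2, 15.1, 13.8, 16.0, 14.9]
control    = [11.0, 12.3, 10.8, 11.9, 12.5]

d = hedges_d(inoculated, control)
print(f"Hedges' d = {d:.2f}")  # positive d: inoculant improved tolerance
```

A positive d thus reads directly as "inoculated plants outperformed controls by d pooled standard deviations," which is how the grand mean of 0.99 above translates to a ~1 SD biomass gain.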
The trait-based partitioning of species plays a critical role in biodiversity–ecosystem function relationships. This niche partitioning drives and depends on community structure, yet this link remains elusive in the context of a metacommunity, where local community assembly is dictated by regional dispersal alongside local environmental conditions. Hence, elucidating the coupling of niche partitioning and community structure requires spatially explicit studies. Such studies are particularly necessary in river networks, where local habitats are highly connected by unidirectional water flow in a spatially complex network structure and frequent disturbance makes community structure strongly dependent on recolonization. Here, we show that taxonomic turnover among periphyton communities colonizing deployed bricks (microhabitats) at multiple sampling sites (local habitats) in a river network was accompanied by a turnover in traits. This niche partitioning showed a hump-shaped relationship with the richness of periphyton communities, which increased with river size. Our observations suggest that downstream dispersal along the river network increases the regional metacommunity pool, which then ensures local colonization by taxa possessing diverse traits that allow them to efficiently partition into environmentally different microhabitats. However, at the most downstream sites, the excessive dispersal of widespread generalists drove mass effects that inflated richness with taxa that co-occupied several microhabitats and swamped niche partitioning. Further, efficient niche partitioning depended on communities rich in rare taxa, an indication of the importance of specialists. Alarmingly, richness and rare taxa declined with high phosphorus concentrations and conductivity, respectively, two environmental variables that likely reflect anthropogenic activity.
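Taxonomic turnover between microhabitats, as described above, is commonly quantified with a pairwise dissimilarity index. A minimal sketch using Jaccard dissimilarity on hypothetical taxon lists for two bricks at one site:

```python
# Sketch: taxonomic turnover between two microhabitat communities as
# Jaccard dissimilarity of their taxon sets. Taxon lists are hypothetical.

def jaccard_dissimilarity(a, b):
    """1 - |A intersect B| / |A union B|;
    0 = identical communities, 1 = no shared taxa."""
    a, b = set(a), set(b)
    return 1 - len(a & b) / len(a | b)

# Periphyton genera found on two bricks (microhabitats) at the same site.
brick_1 = {"Achnanthidium", "Cocconeis", "Navicula", "Gomphonema"}
brick_2 = {"Navicula", "Gomphonema", "Nitzschia", "Fragilaria", "Cymbella"}

turnover = jaccard_dissimilarity(brick_1, brick_2)
print(f"Jaccard turnover between microhabitats: {turnover:.2f}")
```

The trait turnover reported in the study follows the same logic, but with trait categories in place of taxon identities, so that high values indicate microhabitats occupied by functionally distinct sets of colonizers.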
The exact mechanisms behind population cycles remain elusive. An ongoing debate centers on whether predation by small mustelids is necessary and sufficient to generate rodent cycles, as stipulated by the specialist predator hypothesis (SPH). Specifically, the SPH predicts that the predator should respond numerically to the abundance of its prey with a delay of approximately one year, leading to delayed density-dependence in the dynamics of the prey population. Here, we analyze the numerical response of a small mustelid, the seasonality of its interaction with rodents, and its impact on population cycles using long-term seasonal data on ermines and cyclic lemmings in the High Arctic. Our results show that the numerical response of ermines to lemming fluctuations was delayed by one year and could mediate delayed density-dependence in lemming growth rate. The impact of ermines on the growth rate of lemmings was small but mostly circumscribed to winter, a critical period when shifts in cycle phases occur and direct density-dependence seems relaxed. Our simulations of lemming populations with and without ermines suggest that these small mustelids are neither necessary nor sufficient to generate cycles per se. However, the presence of small mustelids may be necessary to prolong the low-abundance phase and delay the recovery of lemming populations, promoting the multiannual low phase typical of lemming cycles. Our study corroborates the idea that population declines of cyclic populations are best explained by direct density-dependence; however, the delayed response of specialized predators induces the multiannual low phase and leads to longer periodicities, which are typically 3–5 years in rodents.
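The role of delayed density-dependence in generating multiannual cycles can be illustrated with a standard log-linear (Gompertz) autoregressive model of order 2, a common abstraction in rodent-cycle analyses. The coefficients below are illustrative choices, not estimates from the lemming data:

```python
from math import acos, pi, sqrt

# Sketch of how delayed density-dependence can generate multiannual
# cycles, using a log-linear autoregressive model of order 2:
#   x_t = b1 * x_{t-1} + b2 * x_{t-2}
# where x is log abundance as a deviation from equilibrium, b1 captures
# direct and b2 delayed density-dependence. Coefficients are illustrative.

def simulate(b1, b2, x0=1.0, x1=1.0, steps=30):
    xs = [x0, x1]
    for _ in range(steps):
        xs.append(b1 * xs[-1] + b2 * xs[-2])
    return xs

def sign_changes(xs):
    """Crossings of the equilibrium: a rough oscillation count."""
    return sum(1 for a, b in zip(xs, xs[1:]) if a * b < 0)

with_delay    = simulate(b1=0.5, b2=-0.6)  # delayed density-dependence
without_delay = simulate(b1=0.5, b2=0.0)   # direct density-dependence only

print("equilibrium crossings with delayed DD:", sign_changes(with_delay))
print("equilibrium crossings without delayed DD:", sign_changes(without_delay))

# When the characteristic roots are complex (b1**2 + 4*b2 < 0), the
# quasi-period of the cycle is 2*pi / arccos(b1 / (2*sqrt(-b2))).
period = 2 * pi / acos(0.5 / (2 * sqrt(0.6)))
print(f"quasi-period: {period:.1f} time steps")
```

With these illustrative coefficients the model oscillates around equilibrium with a quasi-period of roughly five time steps, whereas dropping the delayed term yields a monotonic return, which is the qualitative contrast at stake in the specialist predator hypothesis.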