Background: Why humans historically began to incorporate spices into their diets is still a matter of unresolved debate. For example, a recent study (Bromham et al. 'There is little evidence that spicy food in hot countries is an adaptation to reducing infection risk', Nat Hum Behav 2021;5:878-91) did not support the most popular hypothesis that spice consumption was a practice favoured by selection in certain environments to reduce food poisoning, parasitic infections, and foodborne diseases.
Methods: Because several spices are known to have anticancer effects, we explored the hypothesis that natural selection and/or cultural evolution may have favoured spice consumption as an adaptive prophylactic response to reduce the burden of cancer pathology. We used linear models to investigate the potential relationship between age-standardized gastrointestinal cancer rates and spice consumption in 36 countries.
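As an illustration of the model class described above, the following is a minimal sketch (synthetic data, not the study's dataset; all numbers are invented) of regressing a country-level, age-standardized cancer rate on a spice-consumption index:

```python
# Illustrative sketch only: the study's data are not reproduced here, so we
# fit the same class of model (a simple linear regression) to synthetic
# country-level data. All values are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_countries = 36

spice_use = rng.uniform(0, 10, n_countries)       # hypothetical spice index
cancer_rate = 20 + rng.normal(0, 5, n_countries)  # hypothetical rate, made
                                                  # independent of spice use

res = stats.linregress(spice_use, cancer_rate)
print(f"slope = {res.slope:.3f}, p = {res.pvalue:.3f}")
```

Under the abstract's null result, the fitted slope would be statistically indistinguishable from zero for almost all gastrointestinal cancer types.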
Results: Patterns of spice consumption are not consistent with a cancer mitigation mechanism: the age-standardized rates of almost all gastrointestinal cancers were not related to spice consumption.
Conclusions: Directions other than foodborne pathogens and cancers should be explored to understand the health reasons, if any, why our ancestors developed a taste for spices.
Background and objectives: The processes by which pathogens evolve within a host dictate the efficacy of treatment strategies designed to slow antibiotic resistance evolution and influence population-wide resistance levels. The aim of this study is to describe the underlying genetic and phenotypic changes leading to antibiotic resistance within a patient who died as resistance to the available antibiotics evolved. We assess whether robust patterns of collateral sensitivity and responses to drug combinations existed that might have been leveraged to improve therapy.
Methodology: We used whole-genome sequencing of nine isolates taken from this patient over 279 days of a chronic infection with Enterobacter hormaechei, and systematically measured changes in resistance against five of the most relevant drugs considered for treatment.
Results: The entirety of the genetic change is consistent with de novo mutations and plasmid loss events, without acquisition of foreign genetic material via horizontal gene transfer. The nine isolates fall into three genetically distinct lineages, with early evolutionary trajectories being supplanted by previously unobserved multi-step evolutionary trajectories. Importantly, although the population evolved resistance to all the antibiotics used to treat the infection, no single isolate was resistant to all of them. Measurements of collateral sensitivity and of responses to combination therapy revealed inconsistent patterns across this diversifying population.
Conclusions: Translating antibiotic resistance management strategies from theoretical and laboratory data to clinical situations, such as this one, will require managing diverse populations with unpredictable resistance trajectories.
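The consistency check described above can be sketched as follows (a hypothetical example: the drug names and log2 MIC changes are invented, since the abstract does not report them):

```python
# Hypothetical sketch: given log2 fold-changes in MIC for each isolate
# across five drugs, a collateral response to a drug is "consistent" only
# if its sign is the same across all isolates that changed. All drug names
# and values below are invented placeholders.
import numpy as np

drugs = ["DrugA", "DrugB", "DrugC", "DrugD", "DrugE"]  # placeholder names
# rows: nine isolates; columns: log2 fold-change in MIC vs the first isolate
mic_change = np.array([
    [0,  0,  0,  0,  0],
    [2, -1,  0,  1,  0],
    [3, -2,  1,  0, -1],
    [3,  1, -1,  2,  0],   # sign of the DrugB response flips here
    [4,  2,  0,  1,  1],
    [1, -1,  2, -2,  0],
    [2,  0,  1,  0, -2],
    [4, -3,  0,  2,  1],
    [3,  2, -1,  1,  0],
])

for j, drug in enumerate(drugs[1:], start=1):
    signs = np.sign(mic_change[1:, j])       # exclude the baseline isolate
    nonzero = signs[signs != 0]
    consistent = bool(len(nonzero) > 0 and np.all(nonzero == nonzero[0]))
    print(f"{drug}: consistent collateral response = {consistent}")
```

In this invented example every drug shows a mixed-sign response, mirroring the abstract's finding of inconsistent patterns across the diversifying population.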
Recent research has characterized the behavioral defense against disease, focusing in particular on the detection of sickness cues, the adaptive reactions (e.g. avoidance) to these cues and the mediating role of disgust. A presumably important but less investigated part of behavioral defense is the immune response of the observer of sickness cues. Odors are intimately connected to disease and disgust, and research has shown how olfaction conveys sickness cues in both animals and humans. This study aims to test whether odorous sickness cues (i.e. disgusting odors) can trigger a preparatory immune response in humans. We show that subjective and objective disgust measures, as well as TNFα levels in saliva, increased immediately after exposure to disgusting odors in a sample of 36 individuals. Altogether, these results suggest a collaboration between behavioral mechanisms of pathogen avoidance in olfaction, mediated by the emotion of disgust, and mechanisms of pathogen elimination facilitated by inflammatory mediators. Disgusting stimuli are associated with an increased risk of infection. Here we test whether disgusting odors can trigger an immune response in the oral cavity. The results indicate an increased level of TNFα in the saliva, supporting the idea that disease cues can trigger a preparatory response in the oral cavity.
Background and objectives: Testosterone plays an important role in regulating male development, reproduction and health. Declining levels across the lifespan may reflect, or even contribute to, chronic disease and mortality in men.
Methodology: Relationships between testosterone levels and male mortality were analyzed using data from multiple samples of the cross-sectional National Health and Nutrition Examination Survey (n = 10 225). Target outcomes included known deaths from heart disease, malignant neoplasms, chronic lower respiratory diseases, cerebrovascular diseases, Alzheimer's disease, diabetes mellitus, influenza and pneumonia, kidney diseases, and accidents or unintentional injuries.
Results: Results of discrete-time hazard models revealed that lower levels of testosterone were related to higher mortality for the majority of disease categories, in either an age-dependent or age-independent fashion. Analysis of all-cause mortality (which included deaths from any known disease) also revealed greater general risk for those with lower testosterone levels. For most disease categories, the hazard associated with low testosterone was especially evident at older ages, when mortality from that particular ailment was already elevated. Notably, testosterone levels were not related to mortality risk for deaths unrelated to chronic disease (i.e. accidents and injuries).
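A discrete-time hazard model is, in essence, a logistic regression fitted to person-period data (one row per person per age interval, with a 0/1 death indicator). The sketch below fits such a model to simulated data; the variable names and effect sizes are invented and do not come from NHANES:

```python
# Sketch of a discrete-time hazard model via maximum likelihood.
# Data and effect sizes are invented for illustration only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 2000

age = rng.integers(3, 9, n).astype(float)   # age-interval index per row
tz = rng.normal(0, 1, n)                    # testosterone z-score

# Simulate deaths so that lower testosterone raises the hazard
true_logit = -5.0 + 0.5 * age - 0.4 * tz
death = rng.random(n) < 1 / (1 + np.exp(-true_logit))

X = np.column_stack([np.ones(n), age, tz])

def negloglik(beta):
    eta = X @ beta
    # Bernoulli log-likelihood with logistic link, numerically stable
    return -np.sum(death * eta - np.logaddexp(0, eta))

def grad(beta):
    p = 1 / (1 + np.exp(-(X @ beta)))
    return -X.T @ (death - p)

fit = minimize(negloglik, np.zeros(3), jac=grad, method="BFGS")
print(fit.x)  # [intercept, age effect, testosterone effect]
```

In this simulation the fitted testosterone coefficient is negative, the direction the abstract reports for most disease categories.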
Conclusions and implications: While the causal direction of relationships between testosterone and mortality risk remains unclear, these results may reflect the decline in testosterone that accompanies many disease states. Accordingly, the relationship between testosterone and male mortality may be indirect; ill individuals are expected to have both lower testosterone and higher mortality risk.
Non-pharmaceutical interventions (NPIs), such as social distancing and contact tracing, are important public health measures that can reduce pathogen transmission. In addition to playing a crucial role in suppressing transmission, NPIs influence pathogen evolution by mediating mutation supply, restricting the availability of susceptible hosts, and altering the strength of selection for novel variants. Yet it is unclear how NPIs might affect the emergence of novel variants that are able to escape pre-existing immunity (partially or fully), are more transmissible or cause greater mortality. We analyse a stochastic two-strain epidemiological model to determine how the strength and timing of NPIs affect the emergence of variants with similar or contrasting life-history characteristics to the wild type. We show that, while stronger and timelier NPIs generally reduce the likelihood of variant emergence, it is possible for more transmissible variants with high cross-immunity to have a greater probability of emerging at intermediate levels of NPIs. This is because intermediate levels of NPIs allow an epidemic of the wild type that is neither too small (which would limit the mutation supply) nor too large (which would deplete the pool of susceptible hosts), allowing a novel variant to become established in the host population. However, since one cannot predict the characteristics of a variant, the best strategy to prevent emergence is likely to be an implementation of strong, timely NPIs.
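The qualitative mechanism can be illustrated with a toy stochastic simulation (this is not the paper's two-strain model; the structure and all parameter values are simplified and invented):

```python
# Toy Gillespie simulation: a single-strain SIR epidemic in which NPIs
# scale transmission by (1 - npi), and a variant arises by mutation at
# transmission. "Emergence" is a crude threshold of 10 variant infections.
# All parameters are illustrative, not fitted to any pathogen.
import numpy as np

def simulate(npi, N=2000, beta=0.5, gamma=0.2, mu=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    S, Iw, Iv = N - 5, 5, 0        # susceptible, wild-type, variant
    emerged = False
    while Iw + Iv > 0:
        rates = np.array([
            (1 - npi) * beta * S * Iw / N,   # wild-type transmission
            (1 - npi) * beta * S * Iv / N,   # variant transmission
            gamma * Iw,                      # wild-type recovery
            gamma * Iv,                      # variant recovery
        ])
        total = rates.sum()
        if total == 0:
            break
        event = rng.choice(4, p=rates / total)
        if event == 0:
            S -= 1
            if rng.random() < mu:            # mutation at transmission
                Iv += 1
            else:
                Iw += 1
        elif event == 1:
            S -= 1; Iv += 1
        elif event == 2:
            Iw -= 1
        else:
            Iv -= 1
        if Iv >= 10:                         # crude establishment threshold
            emerged = True
            break
    return emerged

print(simulate(npi=0.0), simulate(npi=0.9))
```

With strong NPIs (npi = 0.9 here) the effective reproduction number is below one, the wild-type epidemic fizzles and no variant can establish, matching the intuition that the variant needs a large enough wild-type epidemic to supply mutations.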
Background and objectives: Sleep is a vulnerable state in which individuals are more susceptible to threat, which may have led to evolved mechanisms for increasing safety. The sentinel hypothesis proposes that brief awakenings during sleep may be a strategy for detecting and responding to environmental threats. Observations of sleep segmentation and group sentinelization in hunter-gatherer and small-scale communities support this hypothesis, but to date it has not been tested in comparisons with industrial populations characterized by more secure sleep environments.
Methodology: Here, we compare wake after sleep onset (WASO), a quantitative measure of nighttime awakenings, between two nonindustrial and two industrial populations: Hadza hunter-gatherers (n = 33), Malagasy small-scale agriculturalists (n = 38), and Hispanic (n = 1,531) and non-Hispanic White (NHW) (n = 347) Americans. We compared nighttime awakenings between these groups using actigraphically measured sleep data. We fit linear models to assess whether WASO varies across groups, controlling for sex and age.
Results: We found that WASO varies significantly by group membership and is highest in the Hadza (2.44 h) and Malagasy (1.93 h) and lowest in non-Hispanic Whites (0.69 h). Hispanics demonstrate intermediate WASO (0.86 h), significantly higher than that of NHW participants. In a supplementary analysis within the Hispanic sample, we found that WASO is significantly and positively associated with perceived neighborhood violence.
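The linear model described above can be sketched as follows. This is not the study's code: the data are synthetic, seeded only with the group means reported in the abstract, and the covariate effects and noise level are invented.

```python
# Illustrative sketch: WASO regressed on group dummies plus age and sex,
# with non-Hispanic Whites (NHW) as the reference group. Synthetic data.
import numpy as np

rng = np.random.default_rng(2)
groups = ["Hadza", "Malagasy", "Hispanic", "NHW"]
means = {"Hadza": 2.44, "Malagasy": 1.93, "Hispanic": 0.86, "NHW": 0.69}
n_per = 60  # invented sample size per group

rows, waso = [], []
for g in groups:
    for _ in range(n_per):
        age = rng.uniform(18, 70)
        sex = rng.integers(0, 2)
        rows.append((g, age, sex))
        # invented age effect and residual noise
        waso.append(means[g] + 0.004 * age + rng.normal(0, 0.3))

# Design matrix: intercept (NHW reference), group dummies, age, sex
X = np.array([
    [1.0, g == "Hadza", g == "Malagasy", g == "Hispanic", age, sex]
    for g, age, sex in rows
], dtype=float)
beta, *_ = np.linalg.lstsq(X, np.array(waso), rcond=None)
print(beta[1:4])  # group contrasts in hours of WASO, relative to NHW
```

The fitted contrasts recover the ordering in the abstract: Hadza above Malagasy, both well above the Hispanic-vs-NHW difference.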
Conclusions and implications: Consistent with principles central to evolutionary medicine, we propose that evolved mechanisms to increase vigilance during sleep may now be mismatched with relatively safer environments, and in part responsible for driving poor sleep health.
Objectives/aims: Prolonged infections of immunocompromised individuals have been proposed as a crucial source of new variants of SARS-CoV-2 during the COVID-19 pandemic. In principle, sustained within-host antigenic evolution in immunocompromised hosts could allow novel immune escape variants to emerge more rapidly, but little is known about how and when immunocompromised hosts play a critical role in pathogen evolution.
Materials and methods: Here, we use a simple mathematical model to understand the effects of immunocompromised hosts on the emergence of immune escape variants in the presence and absence of epistasis.
Conclusions: We show that when the pathogen does not have to cross a fitness valley for immune escape to occur (no epistasis), immunocompromised individuals have no qualitative effect on antigenic evolution (although they may accelerate immune escape if within-host evolutionary dynamics are faster in immunocompromised individuals). But if a fitness valley exists between immune escape variants at the between-host level (epistasis), then persistent infections of immunocompromised individuals allow mutations to accumulate, thereby facilitating rather than simply speeding up antigenic evolution. Our results suggest that better genomic surveillance of infected immunocompromised individuals and better global health equality, including improving access to vaccines and treatments for individuals who are immunocompromised (especially in lower- and middle-income countries), may be crucial to preventing the emergence of future immune escape variants of SARS-CoV-2.
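The fitness-valley intuition can be illustrated with a toy within-host Wright-Fisher simulation (this is not the paper's model; population size, mutation rate and selection cost are all invented): escape requires two mutations, the single mutant is deleterious, and longer infections give the population more time to cross the valley.

```python
# Toy Wright-Fisher sketch of valley crossing under epistasis: wild type,
# a deleterious single mutant (cost s) and a neutral double "escape"
# mutant. We estimate how often the escape genotype appears within an
# infection of a given length. All parameters are illustrative.
import numpy as np

def p_escape(generations, N=10_000, mu=1e-3, s=0.5, reps=200, seed=0):
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(reps):
        n0, n1, n2 = N, 0, 0          # wild type, single, double mutant
        for _ in range(generations):
            # selection: single mutants pay cost s, double mutants do not
            w = np.array([n0, n1 * (1 - s), n2], dtype=float)
            if w.sum() == 0:
                break
            n0, n1, n2 = rng.multinomial(N, w / w.sum())
            # mutation: wild type -> single, single -> double
            m01 = rng.binomial(n0, mu)
            m12 = rng.binomial(n1, mu)
            n0, n1, n2 = n0 - m01, n1 + m01 - m12, n2 + m12
            if n2 > 0:                # escape genotype has appeared
                hits += 1
                break
    return hits / reps

short, long_ = p_escape(10), p_escape(200)
print(short, long_)  # longer infections cross the valley far more often
```

In this sketch the long (persistent) infection produces the escape genotype in almost every replicate, while the short infection rarely does, mirroring the role the abstract ascribes to prolonged infections of immunocompromised hosts under epistasis.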
CD44 is an extracellular matrix receptor implicated in cancer progression. CD44 increases the invasibility of skin fibroblasts (SF) and endometrial stromal fibroblasts (ESF) by cancer and trophoblast cells. We reasoned that the evolution of CD44 expression can affect both the fetal-maternal interaction, through CD44 in ESF, and vulnerability to malignant cancer, through expression in SF. We studied the evolution of CD44 expression in mammalian SF and ESF and demonstrate that higher CD44 expression evolved in the human lineage. Isoform expression in cattle and humans is very similar, suggesting that differences in invasibility are not due to the nature of the expressed isoforms. We then asked whether the concerted increase in gene expression in both cell types is due to shared regulatory mechanisms or to cell type-specific factors. Reporter gene experiments with cells and cis-regulatory elements from human and cattle show that the difference in CD44 expression is due to cis effects as well as cell type-specific trans effects. These results suggest that the concerted expression increase is likely due to selection acting on both cell types, because evolutionary change in cell type-specific factors requires selection on cell type-specific functions. This scenario implies that the malignancy-enhancing effects of elevated CD44 expression in humans likely evolved as a side effect of positive selection on a yet unidentified other function of CD44. A possible candidate is the anti-fibrotic effect of CD44, but there are no reliable data showing that humans and primates are less fibrotic than other mammals.