Respiration has been shown to impact memory retrieval, yet the neural dynamics underlying this effect remain unclear. Here, we investigated how respiration shapes both behavioral and neural expressions of memory retrieval by reanalyzing an existing dataset in which scalp electroencephalography (EEG) and respiration were recorded while participants (N = 18, 15 females) performed an episodic memory task. Our results reveal that respiration influences retrieval-related power fluctuations in the α/β band and concomitant memory reactivation. Specifically, we found that both of these key neural signatures of successful remembering were comodulated during exhalation, and that the strength of the interaction between respiration and reactivation processes was associated with memory performance. Together, these findings suggest that respiration may act as a scaffold for episodic memory retrieval in humans by coordinating the neural conditions that support effective remembering.
Predictive cues significantly influence perception through associative learning. However, it is unknown whether the underlying neural circuits are conserved across domains. We investigated how associative learning influences the perceived intensity and valence of pain and hedonic taste and whether expectancy-based modulation varies with aversiveness or modality. Sixty participants (37 females, 23 males) were randomly assigned to receive painful heat, unpleasant liquid saline, or pleasant liquid sucrose during fMRI scanning. Following conditioning, cues initially associated with low- or high-intensity outcomes were intermittently followed by stimuli calibrated to elicit medium-intensity ratings. Learned cues modulated expectations and subjective outcomes similarly across domains. Consistent with this, the orbitofrontal cortex exhibited domain-general anticipatory activation. Cue effects on perceived intensity and valence were mediated by the left anterior insula and the thalamus, respectively, regions closely overlapping those identified in prior studies of pain expectancy (Atlas et al., 2010). Pain specificity was evident in responses to variations in stimulus intensity, whether assessed with univariate or multivariate approaches, but there was minimal evidence of specificity by modality or aversiveness in cue effects on medium trials. These findings suggest that shared neural circuits mediate the effects of learned expectations on perception, linking pain with other domains of affective processing and perception.
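For readers unfamiliar with the mediation approach mentioned above, the following is a minimal single-mediator sketch on synthetic data, not the study's fMRI analysis pipeline: the variables cue, brain, and rating and their effect sizes are illustrative assumptions, and the indirect effect is estimated as the product of the cue-to-mediator and mediator-to-outcome regression coefficients.

```python
# Minimal single-mediator sketch on synthetic data (not the study's fMRI pipeline).
# X = cue (low/high expectation), M = simulated ROI signal, Y = perceived intensity.
import numpy as np

rng = np.random.default_rng(1)
n = 200
cue = rng.integers(0, 2, n).astype(float)                     # X: 0 = low cue, 1 = high cue
brain = 0.8 * cue + rng.standard_normal(n)                    # M: hypothetical mediator signal
rating = 0.5 * brain + 0.2 * cue + rng.standard_normal(n)     # Y: perceived intensity

def slopes(y, X):
    """OLS slope coefficients, fitting an intercept that is dropped from the output."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

a = slopes(brain, cue)[0]                                     # path a: cue -> mediator
b, c_prime = slopes(rating, np.column_stack([brain, cue]))    # paths b and c' (direct effect)
c = slopes(rating, cue)[0]                                    # total effect
print(f"indirect (a*b) = {a * b:.2f}, direct (c') = {c_prime:.2f}, total (c) = {c:.2f}")
```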
We rely on working memory (WM) to organize, store, and process the perpetual stream of incoming information. Efficient encoding and processing in WM require a framework that (1) separates individual memory items while accurately maintaining their temporal rank and (2) updates the sequence by discarding no-longer-needed items and accommodating newly arriving ones. To investigate the computational mechanisms underlying this functional implementation of WM, we analyzed neural information representation in both a recurrent neural network (RNN) model and human subjects (n = 28, 18 males) performing the same N-back WM task, which necessitates continuous encoding and updating of memory items. We discovered that an orthogonal-rotational dynamical framework facilitates memory encoding and updating, allowing both the RNN and the brain to organize memory items efficiently. In the RNN model, we identified an orthogonal coding space in which each memory item occupies a subspace corresponding to its ordinal rank. A rotational operation dynamically transfers information across these subspaces, updating memory while preserving the items' internal order. Overall, this orthogonal-rotational framework enables the network to store information in a "first in, first out" manner. Remarkably, we also observed similar orthogonal-rotational dynamics in EEG signals recorded from the prefrontal areas of human participants engaged in the same task. These findings suggest a novel mechanism underlying the brain's ability to efficiently organize an information stream for "online" processing and indicate that this strategy may be utilized by both biological and artificial neural networks for optimal information storage and updating.
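To make the proposed framework concrete, the following is a toy sketch, not the authors' RNN or EEG analysis: each ordinal rank of a 2-back buffer is assigned its own orthogonal subspace of a state vector, and a block-shift operator stands in for the rotation that transfers items between rank subspaces, yielding "first in, first out" storage. The dimensions, the operator construction, and all variable names are assumptions made for illustration.

```python
# Toy "orthogonal-rotational" buffer (not the authors' RNN): each ordinal rank
# of a 2-back task occupies its own orthogonal subspace of the state vector,
# and a block-shift operator (standing in for the rotation) moves items one
# rank deeper at every encoding step, so storage is "first in, first out".
import numpy as np

rng = np.random.default_rng(0)
n_back, dim = 2, 4                      # 2-back task; 4-d code per item (assumed)
state_dim = (n_back + 1) * dim          # subspaces for ranks 0 (newest) .. n_back (oldest)

# Shift operator: contents of rank k move to rank k + 1; the oldest item is discarded.
R = np.zeros((state_dim, state_dim))
for k in range(n_back):
    R[(k + 1) * dim:(k + 2) * dim, k * dim:(k + 1) * dim] = np.eye(dim)

def encode(item, state):
    """Push existing items one rank deeper, then write the new item into rank 0."""
    state = R @ state
    state[:dim] = item
    return state

def read_rank(state, rank):
    """Read out the item currently held at the given ordinal rank."""
    return state[rank * dim:(rank + 1) * dim]

# Stream three items through the buffer; the rank-2 subspace then holds the item
# presented two steps ago, as required for the 2-back comparison.
items = [rng.standard_normal(dim) for _ in range(3)]
state = np.zeros(state_dim)
for item in items:
    state = encode(item, state)
assert np.allclose(read_rank(state, n_back), items[0])
```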
Regular rhythmic activity typically produces stereotypical synaptic responses, masking dynamics due to short-term synaptic plasticity (STP). Inputs containing multiple frequencies (e.g., Poisson-like inputs) unveil canonical STP effects, in which facilitation or depression, respectively, favors high- or low-frequency inputs, and a mix of both favors intermediate frequencies. Notably, regular activity with multiple oscillatory components can produce synaptic responses that cannot readily be inferred from canonical STP responses. In the responses of rhythmically activated muscles of lobsters (Homarus americanus) of either sex, slow modulation of bursting inputs, consisting of periodic changes in burst frequency and spike number, is amplified by dynamic neuromuscular synapses. Using a simple STP model, we demonstrate that facilitation enhances the difference (contrast) between responses to strong and weak bursts, whereas depression diminishes it. Nonintuitively, such changes in contrast imply that high-pass filtering enhances low-frequency components of the modulated bursting, whereas low-pass filtering attenuates them. For mixtures of facilitation and depression, our modeling results suggest a complex dependence of the readout of slow modulation on overall release probability and on the recovery times of vesicle depletion and calcium accumulation. Notably, these effects are reduced when the recovery time of STP exceeds the burst period and thereby allows a memory of prior activity to persist across consecutive bursts. Additionally, with memory across bursts, response contrast does not change proportionally with input contrast and depends on the number of bursts per slow modulation cycle. Finally, a biophysical model of a postsynaptic cell demonstrates that simple subthreshold voltage-gated conductances can substantially contribute to the readout of low-frequency modulation.
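To illustrate the contrast effect described above, the following is a minimal sketch based on a Tsodyks-Markram-style synapse, not the authors' specific STP model; the parameter values and burst definitions are assumptions chosen only to show that, relative to a static synapse (contrast of roughly 0.45 for these bursts), facilitation increases and depression decreases the contrast between responses to strong and weak bursts.

```python
# Tsodyks-Markram-style synapse (not the authors' model), summing release over a burst.
import numpy as np

def burst_response(spike_times, U, tau_f, tau_d):
    """Summed release (u * x) over one burst, starting from the resting state."""
    u, x, last_t, total = U, 1.0, None, 0.0
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            u = U + (u - U) * np.exp(-dt / tau_f)      # facilitation decays toward U
            x = 1.0 + (x - 1.0) * np.exp(-dt / tau_d)  # depleted resources recover toward 1
        release = u * x
        total += release
        x -= release             # vesicle depletion
        u += U * (1.0 - u)       # calcium-driven facilitation increment
        last_t = t
    return total

weak = np.arange(3) * 0.02       # "weak" burst: 3 spikes at 50 Hz (assumed)
strong = np.arange(8) * 0.02     # "strong" burst: 8 spikes at 50 Hz (assumed)

# A static synapse would give a contrast of (8 - 3) / (8 + 3), about 0.45.
for label, pars in [("facilitating", dict(U=0.1, tau_f=0.5, tau_d=0.05)),
                    ("depressing",  dict(U=0.7, tau_f=0.01, tau_d=0.5))]:
    r_w = burst_response(weak, **pars)
    r_s = burst_response(strong, **pars)
    contrast = (r_s - r_w) / (r_s + r_w)
    print(f"{label}: weak={r_w:.2f}, strong={r_s:.2f}, contrast={contrast:.2f}")
```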
Neurons in the central nucleus of the inferior colliculus exhibit spatial receptive fields due to underlying neural sensitivity to acoustic cues that covary with sound source location, including interaural time difference (ITD), interaural level difference (ILD), and the spectral shape and average acoustic gain within each ear. While neural sensitivity to individual cues is generally known, what remains unknown is how individual cues contribute to a neuron's receptive field when all cues are combined and how these contributions vary with the neuron's characteristic frequency (CF). In the present study, broadband noise stimuli were presented to awake rabbits of both sexes in virtual acoustic space using each rabbit's own head-related transfer functions. Contributions of each cue to the azimuth tuning curve (i.e., the receptive field within the front horizontal plane) were assessed by manipulating the transfer functions to fix some cues while allowing others to vary naturally with azimuth. On average, firing rates of low-CF neurons (<2.8 kHz) were determined by the combination of ITD and one or more of ILD and the individual-ear acoustic gains, whereas rates of high-CF neurons (>2.8 kHz) were largely determined by ILD alone, by the contralateral-ear spectrum together with ILD, or by a combination of ITD and non-ITD cues, depending on whether the source was ipsilateral to the recording site, contralateral to it, or straight ahead, respectively. The CF transition coincided with the acoustic frequency above which the range of ILDs rapidly expands. Despite these CF-dependent differences in the contributions of localization cues, rate sensitivity to azimuth was, on average, the same across the tonotopic axis.
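To make the cue-fixing manipulation concrete, the following is a minimal sketch using a hypothetical head-related impulse-response pair, not the study's measured rabbit HRTFs: it extracts broadband ITD and ILD and then "fixes" ILD by rescaling one ear's response, leaving ITD and spectral shape to vary naturally; the sample rate, function names, and toy impulse responses are assumptions.

```python
# Extract broadband ITD and ILD from a (hypothetical) head-related impulse-response
# pair, and "fix" ILD by rescaling one ear while ITD and spectral shape vary naturally.
import numpy as np

fs = 48000  # sample rate in Hz (assumed)

def itd_ild(hrir_left, hrir_right):
    """Broadband ITD (s, cross-correlation lag) and ILD (dB, right minus left)."""
    xcorr = np.correlate(hrir_right, hrir_left, mode="full")
    lag = int(np.argmax(xcorr)) - (len(hrir_left) - 1)
    itd = lag / fs
    ild = 10 * np.log10(np.sum(hrir_right ** 2) / np.sum(hrir_left ** 2))
    return itd, ild

def fix_ild(hrir_left, hrir_right, target_ild_db=0.0):
    """Rescale the right-ear HRIR so the broadband ILD equals target_ild_db."""
    _, ild = itd_ild(hrir_left, hrir_right)
    gain = 10 ** ((target_ild_db - ild) / 20)
    return hrir_left, hrir_right * gain

# Toy example: right-ear response earlier and louder than the left-ear response.
left = np.zeros(64); left[10] = 0.5
right = np.zeros(64); right[6] = 1.0
print(itd_ild(left, right))             # nonzero ITD, positive ILD (~6 dB)
print(itd_ild(*fix_ild(left, right)))   # ILD driven to ~0 dB, ITD unchanged
```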
Humans and other primates can robustly report whether they have seen specific images before, even when those images are extremely similar to ones they have previously seen. Multiple lines of evidence suggest that pattern separation computations in the hippocampus (HC) contribute to this behavior by shaping the fidelity of visual memory. However, it is unclear whether HC uniquely determines memory fidelity or whether computations in other brain areas also contribute. To investigate, we recorded neural signals from inferotemporal cortex (ITC) and HC of two rhesus monkeys (1 male, 1 female) as they performed a memory task in which they judged whether images were novel or exactly repeated in the presence of lure images spanning a range of visual similarities. We found behavioral evidence for sharpening, reflected as memory performance that was nonlinearly transformed relative to a benchmark defined by visual representations in ITC. As expected, we found that behavioral sharpening aligned with visual memory representations in HC. Surprisingly, and unaccounted for by HC pattern separation proposals, we also found neural correlates of behavioral sharpening in ITC. These results, coupled with further analysis of the data, suggest that ITC contributes to shaping the fidelity of visual memory in the transformation from visual processing to memory storage and signaling.

