Pub Date: 2023-04-25 | DOI: 10.1101/2022.06.30.498324
Laurent Caplette, K. Jerbi, F. Gosselin
When we fixate an object, visual information is continuously received on the retina. Several studies observed behavioral oscillations in perceptual sensitivity across such stimulus time, and these fluctuations have been linked to brain oscillations. However, whether specific brain areas show oscillations across stimulus time (i.e., different time points of the stimulus being more or less processed, in a rhythmic fashion) has not been investigated. Here, we revealed random areas of face images at random moments across time and recorded the brain activity of male and female human participants using MEG while they performed two recognition tasks. This allowed us to quantify how each snapshot of visual information coming from the stimulus is processed across time and across the brain. Oscillations across stimulus time (rhythmic sampling) were mostly visible in early visual areas, at theta, alpha, and low beta frequencies. We also found that they contributed to brain activity more than previously investigated rhythmic processing (oscillations in the processing of a single snapshot of visual information). Nonrhythmic sampling was also visible at later latencies across the visual cortex, either in the form of a transient processing of early stimulus time points or of a sustained processing of the whole stimulus. Our results suggest that successive cycles of ongoing brain oscillations process stimulus information incoming at successive moments. Together, these results advance our understanding of the oscillatory neural dynamics associated with visual processing and show the importance of considering the temporal dimension of stimuli when studying visual recognition. SIGNIFICANCE STATEMENT Several behavioral studies have observed oscillations in perceptual sensitivity over the duration of stimulus presentation, and these fluctuations have been linked to brain oscillations. However, oscillations across stimulus time in the brain have not been studied. 
Here, we developed an MEG paradigm to quantify how visual information received at each moment during fixation is processed through time and across the brain. We showed that different snapshots of a stimulus are distinctly processed in many brain areas and that these fluctuations are oscillatory in early visual areas. Oscillations across stimulus time were more prevalent than previously studied oscillations across processing time. These results increase our understanding of how neural oscillations interact with the visual processing of temporal stimuli.
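Detecting rhythmic sampling of this kind amounts to spectral analysis of a "sampling curve" — how strongly information presented at each stimulus moment contributes to later brain activity. The sketch below is illustrative only (synthetic data, not the authors' pipeline): it synthesizes a sampling curve with a theta-band rhythm and recovers its peak frequency with a discrete Fourier transform.

```python
import numpy as np

# Hypothetical sampling curve: contribution of a snapshot presented at time t
# to later brain activity (arbitrary units). We synthesize one with a 5 Hz
# (theta) rhythm plus noise to illustrate the analysis.
fs = 60.0                            # presentation rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)        # 1 s of stimulus time
rng = np.random.default_rng(0)
curve = 1.0 + 0.5 * np.sin(2 * np.pi * 5 * t) + 0.1 * rng.standard_normal(t.size)

# Amplitude spectrum of the mean-centered curve; the dominant bin marks
# the frequency of rhythmic sampling.
amps = np.abs(np.fft.rfft(curve - curve.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
peak = freqs[np.argmax(amps)]
print(f"peak sampling frequency: {peak:.1f} Hz")  # ~5 Hz (theta) by construction
```

In practice a significance threshold (e.g., from permutations of the curve) would be needed before calling a spectral peak an oscillation.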
Rhythmic Information Sampling in the Brain during Visual Recognition. The Journal of Neuroscience, pp. 4487–4497.
Pub Date: 2023-04-19 | DOI: 10.1101/2023.04.19.537469
Alexander Enge, Franziska Süß, Rasha Abdel Rahman
Does our perception of an object change once we discover what function it serves? We showed human participants (n = 48, 31 females and 17 males) pictures of unfamiliar objects either together with keywords matching their function, leading to semantically informed perception, or together with nonmatching keywords, resulting in uninformed perception. We measured event-related potentials to investigate at which stages in the visual processing hierarchy these two types of object perception differed from one another. We found that semantically informed compared with uninformed perception was associated with larger amplitudes in the N170 component (150-200 ms), reduced amplitudes in the N400 component (400-700 ms), and a late decrease in alpha/beta band power. When the same objects were presented once more without any information, the N400 and event-related power effects persisted, and we also observed enlarged amplitudes in the P1 component (100-150 ms) in response to objects for which semantically informed perception had taken place. Consistent with previous work, this suggests that obtaining semantic information about previously unfamiliar objects alters aspects of their lower-level visual perception (P1 component), higher-level visual perception (N170 component), and semantic processing (N400 component, event-related power). Our study is the first to show that such effects occur instantly after semantic information has been provided for the first time, without requiring extensive learning. SIGNIFICANCE STATEMENT There has been a long-standing debate about whether or not higher-level cognitive capacities, such as semantic knowledge, can influence lower-level perceptual processing in a top-down fashion. Here we could show, for the first time, that information about the function of previously unfamiliar objects immediately influences cortical processing within less than 200 ms. 
Of note, this influence does not require training or experience with the objects and related semantic information. Therefore, our study is the first to show effects of cognition on perception while ruling out the possibility that prior knowledge merely acts by preactivating or altering stored visual representations. Instead, this knowledge seems to alter perception online, thus providing a compelling case against the impenetrability of perception by cognition.
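The component effects reported above (P1, N170, N400) are conventionally quantified as the mean voltage of the trial-averaged waveform within a latency window. A minimal sketch with synthetic single-trial data (channel, window, and effect sizes are illustrative assumptions, not the study's values):

```python
import numpy as np

fs = 500.0                                  # sampling rate, Hz (assumed)
times = np.arange(-0.1, 0.8, 1 / fs)        # epoch from -100 to 800 ms
rng = np.random.default_rng(3)
n_trials = 80

# Simulate trials carrying a negative deflection peaking near 170 ms
# (a stand-in for the N170) buried in noise.
erp_shape = -4.0 * np.exp(-((times - 0.17) ** 2) / (2 * 0.02**2))
trials = erp_shape + 2.0 * rng.standard_normal((n_trials, times.size))

evoked = trials.mean(axis=0)                # trial average (the ERP)
window = (times >= 0.15) & (times < 0.20)   # N170 window, 150-200 ms
n170_amplitude = evoked[window].mean()      # mean-amplitude measure
print(f"N170 mean amplitude: {n170_amplitude:.2f} uV")  # negative deflection
```

Condition effects such as "larger N170 for semantically informed perception" are then differences between these window means across conditions.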
Instant Effects of Semantic Information on Visual Perception. The Journal of Neuroscience, pp. 4896–4906.
Pub Date: 2023-04-19 | DOI: 10.1523/JNEUROSCI.twij.43.16.2023
This Week in The Journal. The Journal of Neuroscience, p. 2818.
Pub Date: 2023-04-16 | DOI: 10.1101/2023.04.16.537084
Anna M Lipkin, K. Bender
Neurons are remarkably polarized structures: dendrites spread and branch to receive synaptic inputs while a single axon extends and transmits action potentials (APs) to downstream targets. Neuronal polarity is maintained by the axon initial segment (AIS), a region between the soma and axon proper that is also the site of action potential (AP) generation. This polarization between dendrites and axons extends to inhibitory neurotransmission. In adulthood, the neurotransmitter GABA hyperpolarizes dendrites but instead depolarizes axons. These differences in function collide at the AIS. Multiple studies have shown that GABAergic signaling in this region can share properties of either the mature axon or mature dendrite, and that these properties evolve over a protracted period encompassing periadolescent development. Here, we explored how developmental changes in GABAergic signaling affect AP initiation. We show that GABA at the axon initial segment inhibits action potential initiation in layer (L)2/3 pyramidal neurons in prefrontal cortex from mice of either sex across GABA reversal potentials observed in periadolescence. These actions occur largely through current shunts generated by GABAA receptors and changes in voltage-gated channel properties that affected the number of channels that could be recruited for AP electrogenesis. These results suggest that GABAergic neurons targeting the axon initial segment provide an inhibitory “veto” across the range of GABA polarity observed in normal adolescent development, regardless of GABAergic synapse reversal potential. Significance Statement GABA receptors are a major class of neurotransmitter receptors in the brain. Typically, GABA receptors inhibit neurons by allowing influx of negatively charged chloride ions into the cell. However, there are cases where local chloride concentrations promote chloride efflux through GABA receptors. 
Such conditions exist early in development in neocortical pyramidal cell axon initial segments (AISs), where action potentials (APs) initiate. Here, we examined how chloride efflux in early development interacts with mechanisms that support action potential initiation. We find that this efflux, despite moving membrane potential closer to action potential threshold, is nevertheless inhibitory. Thus, GABA at the axon initial segment is likely to be inhibitory for action potential initiation independent of whether chloride flows out or into neurons via these receptors.
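Whether GABA hyperpolarizes or depolarizes a compartment comes down to the chloride reversal potential, given by the Nernst equation. The sketch below computes it for two intracellular chloride levels; the concentrations are illustrative textbook values, not measurements from this paper.

```python
import math

def nernst_mV(z, conc_out_mM, conc_in_mM, temp_C=35.0):
    """Equilibrium potential in mV for an ion of valence z (Nernst equation)."""
    R, F = 8.314, 96485.0                 # J/(mol*K), C/mol
    T = temp_C + 273.15
    return 1000.0 * (R * T) / (z * F) * math.log(conc_out_mM / conc_in_mM)

# Mature dendrite: low internal Cl- keeps E_Cl below rest -> GABA hyperpolarizes.
e_cl_mature = nernst_mV(z=-1, conc_out_mM=120.0, conc_in_mM=6.0)
# Immature/axonal compartment: higher internal Cl- shifts E_Cl depolarized,
# so Cl- flows out through GABA-A receptors -> depolarizing GABA.
e_cl_elevated = nernst_mV(z=-1, conc_out_mM=120.0, conc_in_mM=20.0)
print(f"E_Cl: low internal Cl- ~ {e_cl_mature:.0f} mV, "
      f"elevated internal Cl- ~ {e_cl_elevated:.0f} mV")
```

The study's point is that, across this whole range of reversal potentials, the shunting conductance of GABA-A receptors at the AIS still vetoes spike initiation.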
Axon Initial Segment GABA Inhibits Action Potential Generation throughout Periadolescent Development. The Journal of Neuroscience, pp. 6357–6368.
Pub Date: 2023-04-12 | DOI: 10.1101/2022.11.25.517931
Frauke Kraus, Sarah Tune, J. Obleser, Björn Herrmann
Cognitive demand is thought to modulate two often used, but rarely combined, measures: pupil size and neural α (8–12 Hz) oscillatory power. However, it is unclear whether these two measures capture cognitive demand in a similar way under complex audiovisual-task conditions. Here we recorded pupil size and neural α power (using electroencephalography), while human participants of both sexes concurrently performed a visual multiple object-tracking task and an auditory gap detection task. Difficulties of the two tasks were manipulated independent of each other. Participants' performance decreased in accuracy and speed with increasing cognitive demand. Pupil size increased with increasing difficulty for both the auditory and the visual task. In contrast, α power showed diverging neural dynamics: parietal α power decreased with increasing difficulty in the visual task, but not with increasing difficulty in the auditory task. Furthermore, independent of task difficulty, within-participant trial-by-trial fluctuations in pupil size were negatively correlated with α power. Difficulty-induced changes in pupil size and α power, however, did not correlate, which is consistent with their different cognitive-demand sensitivities. Overall, the current study demonstrates that the dynamics of the neurophysiological indices of cognitive demand and associated effort are multifaceted and potentially modality-dependent under complex audiovisual-task conditions. SIGNIFICANCE STATEMENT Pupil size and oscillatory α power are associated with cognitive demand and effort, but their relative sensitivity under complex audiovisual-task conditions is unclear, as is the extent to which they share underlying mechanisms. Using an audiovisual dual-task paradigm, we show that pupil size increases with increasing cognitive demands for both audition and vision. 
In contrast, changes in oscillatory α power depend on the respective task demands: parietal α power decreases with visual demand but not with auditory task demand. Hence, pupil size and α power show different sensitivity to cognitive demands, perhaps suggesting partly different underlying neural mechanisms.
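The reported trial-by-trial coupling is a within-participant Pearson correlation between per-trial pupil size and per-trial alpha power. A minimal sketch with simulated trials (the coupling strength and the shared "arousal" factor are assumptions for illustration, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 200

# Hypothetical shared latent fluctuation (e.g., arousal) driving both measures
# in opposite directions, as the negative correlation in the paper suggests.
latent = rng.standard_normal(n_trials)
pupil = latent + 0.5 * rng.standard_normal(n_trials)
alpha_power = -latent + 0.5 * rng.standard_normal(n_trials)

# Trial-by-trial Pearson correlation within a participant.
r = np.corrcoef(pupil, alpha_power)[0, 1]
print(f"pupil-alpha trial-by-trial correlation: r = {r:.2f}")  # negative
```

Note the paper's second finding fits this picture: the trial-wise correlation can be robustly negative even while difficulty-induced *changes* in the two measures fail to correlate, since those changes may reflect different mechanisms.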
Neural α Oscillations and Pupil Size Differentially Index Cognitive Demand under Competing Audiovisual Task Conditions. The Journal of Neuroscience, pp. 4352–4364.
Pub Date: 2023-04-12 | DOI: 10.1523/JNEUROSCI.twij.43.15.2023
This Week in The Journal. The Journal of Neuroscience, p. 2630.
Pub Date: 2023-04-11 | DOI: 10.1101/2023.04.10.536301
Takaki Yahiro, N. Kataoka, Kazuhiro Nakamura
Thermoregulatory behavior in homeothermic animals is an innate behavior to defend body core temperature from environmental thermal challenges in coordination with autonomous thermoregulatory responses. In contrast to the progress in understanding the central mechanisms of autonomous thermoregulation, those of behavioral thermoregulation remain poorly understood. We have previously shown that the lateral parabrachial nucleus (LPB) mediates cutaneous thermosensory afferent signaling for thermoregulation. To understand the thermosensory neural network for behavioral thermoregulation, in the present study, we investigated the roles of ascending thermosensory pathways from the LPB in avoidance behavior from innocuous heat and cold in male rats. Neuronal tracing revealed two segregated groups of LPB neurons projecting to the median preoptic nucleus (MnPO), a thermoregulatory center (LPB→MnPO neurons), and those projecting to the central amygdaloid nucleus (CeA), a limbic emotion center (LPB→CeA neurons). While LPB→MnPO neurons include separate subgroups activated by heat or cold exposure of rats, LPB→CeA neurons were only activated by cold exposure. By selectively inhibiting LPB→MnPO or LPB→CeA neurons using tetanus toxin light chain or chemogenetic or optogenetic techniques, we found that LPB→MnPO transmission mediates heat avoidance, whereas LPB→CeA transmission contributes to cold avoidance. In vivo electrophysiological experiments showed that skin cooling-evoked thermogenesis in brown adipose tissue requires not only LPB→MnPO neurons but also LPB→CeA neurons, providing a novel insight into the central mechanism of autonomous thermoregulation. Our findings reveal an important framework of central thermosensory afferent pathways to coordinate behavioral and autonomous thermoregulation and to generate the emotions of thermal comfort and discomfort that drive thermoregulatory behavior. 
SIGNIFICANCE STATEMENT Coordination of behavioral and autonomous thermoregulation is important for maintaining thermal homeostasis in homeothermic animals. However, the central mechanism of thermoregulatory behaviors remains poorly understood. We have previously shown that the lateral parabrachial nucleus (LPB) mediates ascending thermosensory signaling that drives thermoregulatory behavior. In this study, we found that one pathway from the LPB to the median preoptic nucleus mediates heat avoidance, whereas the other pathway from the LPB to the central amygdaloid nucleus is required for cold avoidance. Surprisingly, both pathways are required for skin cooling-evoked thermogenesis in brown adipose tissue, an autonomous thermoregulatory response. This study provides a central thermosensory network that coordinates behavioral and autonomous thermoregulation and generates thermal comfort and discomfort that drive thermoregulatory behavior.
Two Ascending Thermosensory Pathways from the Lateral Parabrachial Nucleus That Mediate Behavioral and Autonomous Thermoregulation. The Journal of Neuroscience, pp. 5221–5240.
Pub Date: 2023-04-10 | DOI: 10.1101/2022.05.24.493162
A. Gogliettino, Sasidhar S. Madugula, Lauren E. Grosberg, R. Vilkhu, Jeff B Brown, Huy Nguyen, Alexandra Kling, P. Hottowy, W. Dąbrowski, A. Sher, A. Litke, E. Chichilnisky
Electrical stimulation of retinal ganglion cells (RGCs) with electronic implants provides rudimentary artificial vision to people blinded by retinal degeneration. However, current devices stimulate indiscriminately and therefore cannot reproduce the intricate neural code of the retina. Recent work has demonstrated more precise activation of RGCs using focal electrical stimulation with multielectrode arrays in the peripheral macaque retina, but it is unclear how effective this can be in the central retina, which is required for high-resolution vision. This work probes the neural code and effectiveness of focal epiretinal stimulation in the central macaque retina, using large-scale electrical recording and stimulation ex vivo. The functional organization, light response properties, and electrical properties of the major RGC types in the central retina were mostly similar to the peripheral retina, with some notable differences in density, kinetics, linearity, spiking statistics, and correlations. The major RGC types could be distinguished by their intrinsic electrical properties. Electrical stimulation targeting parasol cells revealed similar activation thresholds and reduced axon bundle activation in the central retina, but lower stimulation selectivity. Quantitative evaluation of the potential for image reconstruction from electrically evoked parasol cell signals revealed higher overall expected image quality in the central retina. An exploration of inadvertent midget cell activation suggested that it could contribute high spatial frequency noise to the visual signal carried by parasol cells. These results support the possibility of reproducing high-acuity visual signals in the central retina with an epiretinal implant. SIGNIFICANCE STATEMENT Artificial restoration of vision with retinal implants is a major treatment for blindness. 
However, present-day implants do not provide high-resolution visual perception, in part because they do not reproduce the natural neural code of the retina. Here, we demonstrate the level of visual signal reproduction that is possible with a future implant by examining how accurately responses to electrical stimulation of parasol retinal ganglion cells can convey visual signals. Although the precision of electrical stimulation in the central retina was diminished relative to the peripheral retina, the quality of expected visual signal reconstruction in parasol cells was greater. These findings suggest that visual signals could be restored with high fidelity in the central retina using a future retinal implant.
{"title":"High-Fidelity Reproduction of Visual Signals by Electrical Stimulation in the Central Primate Retina","authors":"A. Gogliettino, Sasidhar S. Madugula, Lauren E. Grosberg, R. Vilkhu, Jeff B Brown, Huy Nguyen, Alexandra Kling, P. Hottowy, W. Dąbrowski, A. Sher, A. Litke, E. Chichilnisky","doi":"10.1101/2022.05.24.493162","DOIUrl":"https://doi.org/10.1101/2022.05.24.493162","url":null,"abstract":"Electrical stimulation of retinal ganglion cells (RGCs) with electronic implants provides rudimentary artificial vision to people blinded by retinal degeneration. However, current devices stimulate indiscriminately and therefore cannot reproduce the intricate neural code of the retina. Recent work has demonstrated more precise activation of RGCs using focal electrical stimulation with multielectrode arrays in the peripheral macaque retina, but it is unclear how effective this can be in the central retina, which is required for high-resolution vision. This work probes the neural code and effectiveness of focal epiretinal stimulation in the central macaque retina, using large-scale electrical recording and stimulation ex vivo. The functional organization, light response properties, and electrical properties of the major RGC types in the central retina were mostly similar to the peripheral retina, with some notable differences in density, kinetics, linearity, spiking statistics, and correlations. The major RGC types could be distinguished by their intrinsic electrical properties. Electrical stimulation targeting parasol cells revealed similar activation thresholds and reduced axon bundle activation in the central retina, but lower stimulation selectivity. Quantitative evaluation of the potential for image reconstruction from electrically evoked parasol cell signals revealed higher overall expected image quality in the central retina. 
An exploration of inadvertent midget cell activation suggested that it could contribute high spatial frequency noise to the visual signal carried by parasol cells. These results support the possibility of reproducing high-acuity visual signals in the central retina with an epiretinal implant. SIGNIFICANCE STATEMENT Artificial restoration of vision with retinal implants is a major treatment for blindness. However, present-day implants do not provide high-resolution visual perception, in part because they do not reproduce the natural neural code of the retina. Here, we demonstrate the level of visual signal reproduction that is possible with a future implant by examining how accurately responses to electrical stimulation of parasol retinal ganglion cells can convey visual signals. Although the precision of electrical stimulation in the central retina was diminished relative to the peripheral retina, the quality of expected visual signal reconstruction in parasol cells was greater. These findings suggest that visual signals could be restored with high fidelity in the central retina using a future retinal implant.","PeriodicalId":22786,"journal":{"name":"The Journal of Neuroscience","volume":"21 1","pages":"4625 - 4641"},"PeriodicalIF":0.0,"publicationDate":"2023-04-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"77771349","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
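The abstract above evaluates expected image quality by reconstructing visual stimuli from electrically evoked parasol cell signals. A minimal sketch of that kind of linear reconstruction is shown below, assuming a toy linear encoding model with random spatial filters and least-squares decoding; the cell counts, filter shapes, and noise level are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_cells, n_images = 64, 32, 500

# Assumed linear encoding: each cell responds through a random spatial filter.
encoding = rng.normal(size=(n_cells, n_pixels))
images = rng.normal(size=(n_images, n_pixels))
responses = images @ encoding.T + 0.1 * rng.normal(size=(n_images, n_cells))

# Fit linear decoding filters by least squares: image ~ responses @ W.
W, *_ = np.linalg.lstsq(responses, images, rcond=None)

# Evaluate expected reconstruction quality on held-out stimuli.
test = rng.normal(size=(100, n_pixels))
test_resp = test @ encoding.T + 0.1 * rng.normal(size=(100, n_cells))
recon = test_resp @ W
corr = np.corrcoef(recon.ravel(), test.ravel())[0, 1]
print(recon.shape, round(corr, 2))
```

Because the toy population has fewer cells than pixels, reconstruction is necessarily partial; the pixel-wise correlation quantifies how much of the stimulus the evoked signals can convey, analogous in spirit to the paper's expected-image-quality evaluation.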
Pub Date : 2023-04-05 DOI: 10.1523/JNEUROSCI.twij.43.14.2023
Title: This Week in The Journal. The Journal of Neuroscience, p. 2439.
Pub Date : 2023-03-31 DOI: 10.1101/2023.03.31.535014
Siyi Li, Xue-ling Zeng, Zhujun Shao, Qing Yu
Humans constantly receive massive amounts of information, both perceived from the external environment and imagined from the internal world. To function properly, the brain needs to correctly identify the origin of information being processed. Recent work has suggested common neural substrates for perception and imagery. However, it has remained unclear how the brain differentiates between external and internal experiences with shared neural codes. Here we tested this question in human participants (male and female) by systematically investigating the neural processes underlying the generation and maintenance of visual information from voluntary imagery, veridical perception, and illusion. The inclusion of illusion allowed us to differentiate between objective and subjective internality: while illusion has an objectively internal origin and can be viewed as involuntary imagery, it is also subjectively perceived as having an external origin like perception. Combining fMRI, eye-tracking, multivariate decoding, and encoding approaches, we observed superior orientation representations in parietal cortex during imagery compared with perception, and conversely in early visual cortex. This imagery dominance gradually developed along a posterior-to-anterior cortical hierarchy from early visual to parietal cortex, emerged in the early epoch of imagery and sustained into the delay epoch, and persisted across varied imagined contents. Moreover, representational strength of illusion was more comparable to imagery in early visual cortex, but more comparable to perception in parietal cortex, suggesting content-specific representations in parietal cortex differentiate between subjectively internal and external experiences, as opposed to early visual cortex. These findings together support a domain-general engagement of parietal cortex in internally generated experience. SIGNIFICANCE STATEMENT How does the brain differentiate between imagined and perceived experiences? 
Combining fMRI, eye-tracking, multivariate decoding, and encoding approaches, the current study revealed enhanced stimulus-specific representations in visual imagery originating from parietal cortex, supporting the subjective experience of imagery. This neural principle was further validated by evidence from visual illusion, wherein illusion resembled perception and imagery at different levels of cortical hierarchy. Our findings provide direct evidence for the critical role of parietal cortex as a domain-general region for content-specific imagery, and offer new insights into the neural mechanisms underlying the differentiation between subjectively internal and external experiences.
{"title":"Neural Representations in Visual and Parietal Cortex Differentiate between Imagined, Perceived, and Illusory Experiences","authors":"Siyi Li, Xue-ling Zeng, Zhujun Shao, Qing Yu","doi":"10.1101/2023.03.31.535014","DOIUrl":"https://doi.org/10.1101/2023.03.31.535014","url":null,"abstract":"Humans constantly receive massive amounts of information, both perceived from the external environment and imagined from the internal world. To function properly, the brain needs to correctly identify the origin of information being processed. Recent work has suggested common neural substrates for perception and imagery. However, it has remained unclear how the brain differentiates between external and internal experiences with shared neural codes. Here we tested this question in human participants (male and female) by systematically investigating the neural processes underlying the generation and maintenance of visual information from voluntary imagery, veridical perception, and illusion. The inclusion of illusion allowed us to differentiate between objective and subjective internality: while illusion has an objectively internal origin and can be viewed as involuntary imagery, it is also subjectively perceived as having an external origin like perception. Combining fMRI, eye-tracking, multivariate decoding, and encoding approaches, we observed superior orientation representations in parietal cortex during imagery compared with perception, and conversely in early visual cortex. This imagery dominance gradually developed along a posterior-to-anterior cortical hierarchy from early visual to parietal cortex, emerged in the early epoch of imagery and sustained into the delay epoch, and persisted across varied imagined contents. 
Moreover, representational strength of illusion was more comparable to imagery in early visual cortex, but more comparable to perception in parietal cortex, suggesting content-specific representations in parietal cortex differentiate between subjectively internal and external experiences, as opposed to early visual cortex. These findings together support a domain-general engagement of parietal cortex in internally generated experience. SIGNIFICANCE STATEMENT How does the brain differentiate between imagined and perceived experiences? Combining fMRI, eye-tracking, multivariate decoding, and encoding approaches, the current study revealed enhanced stimulus-specific representations in visual imagery originating from parietal cortex, supporting the subjective experience of imagery. This neural principle was further validated by evidence from visual illusion, wherein illusion resembled perception and imagery at different levels of cortical hierarchy. Our findings provide direct evidence for the critical role of parietal cortex as a domain-general region for content-specific imagery, and offer new insights into the neural mechanisms underlying the differentiation between subjectively internal and external experiences.","PeriodicalId":22786,"journal":{"name":"The Journal of Neuroscience","volume":"44 1","pages":"6508 - 6524"},"PeriodicalIF":0.0,"publicationDate":"2023-03-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"85292799","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
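The study above relies on multivariate decoding of orientation from cortical activity patterns. As a hedged illustration of what such decoding involves, the sketch below classifies simulated multivoxel patterns with a nearest-centroid decoder; the voxel count, number of orientation bins, tuning templates, and noise level are all assumptions made for the toy example, not the study's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)
n_voxels, n_trials = 50, 200
orientations = rng.integers(0, 4, size=n_trials)  # 4 orientation bins

# Each orientation evokes a distinct mean voxel pattern plus trial noise.
templates = rng.normal(size=(4, n_voxels))
patterns = templates[orientations] + 0.8 * rng.normal(size=(n_trials, n_voxels))

train, test = np.arange(0, 150), np.arange(150, n_trials)

# Fit: average the training patterns per orientation; predict test trials
# by assigning each pattern to its nearest class centroid.
centroids = np.stack([patterns[train][orientations[train] == k].mean(0)
                      for k in range(4)])
dists = np.linalg.norm(patterns[test][:, None, :] - centroids[None], axis=2)
pred = dists.argmin(1)
accuracy = (pred == orientations[test]).mean()
print(accuracy)
```

Decoding accuracy well above the 25% chance level indicates that the patterns carry orientation information; comparing such accuracies (or representational strengths) across regions is the logic behind the parietal-versus-early-visual comparisons in the abstract.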