{"title":"Ants integrate proprioception as well as visual context and efference copies to make robust predictions","authors":"Océane Dauzere-Peres, Antoine Wystrach","doi":"10.1038/s41467-024-53856-4","DOIUrl":null,"url":null,"abstract":"<p>Forward models are mechanisms enabling an agent to predict the sensory outcomes of its actions. They can be implemented through efference copies: copies of motor signals inhibiting the expected sensory stimulation, literally canceling the perceptual outcome of the predicted action. In insects, efference copies are known to modulate optic flow detection for flight control in flies. Here we investigate whether forward models account for the detection of optic flow in walking ants, and how the latter is integrated for locomotion control. We mounted <i>Cataglyphis velox</i> ants in a virtual reality setup and manipulated the relationship between the ants’ movements and the optic flow perceived. Our results show that ants compute predictions of the optic flow expected according to their own movements. However, the prediction is not solely based on efference copies, but involves proprioceptive feedbacks and is fine-tuned by the panorama’s visual structure. Mismatches between prediction and perception are computed for each eye, and error signals are integrated to adjust locomotion through the modulation of internal oscillators. Our work reveals that insects’ forward models are non-trivial and compute predictions based on multimodal information.</p>","PeriodicalId":19066,"journal":{"name":"Nature Communications","volume":"260 1","pages":""},"PeriodicalIF":14.7000,"publicationDate":"2024-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Nature Communications","FirstCategoryId":"103","ListUrlMain":"https://doi.org/10.1038/s41467-024-53856-4","RegionNum":1,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MULTIDISCIPLINARY SCIENCES","Score":null,"Total":0}
Abstract
Forward models are mechanisms enabling an agent to predict the sensory outcomes of its actions. They can be implemented through efference copies: copies of motor signals inhibiting the expected sensory stimulation, literally canceling the perceptual outcome of the predicted action. In insects, efference copies are known to modulate optic flow detection for flight control in flies. Here we investigate whether forward models account for the detection of optic flow in walking ants, and how the latter is integrated for locomotion control. We mounted Cataglyphis velox ants in a virtual reality setup and manipulated the relationship between the ants’ movements and the optic flow perceived. Our results show that ants compute predictions of the optic flow expected according to their own movements. However, the prediction is not solely based on efference copies, but involves proprioceptive feedback and is fine-tuned by the panorama’s visual structure. Mismatches between prediction and perception are computed for each eye, and error signals are integrated to adjust locomotion through the modulation of internal oscillators. Our work reveals that insects’ forward models are non-trivial and compute predictions based on multimodal information.
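The loop described in the abstract, a prediction of optic flow built from an efference copy, proprioceptive feedback and the panorama's visual structure, compared against the perceived flow at each eye, with the resulting error modulating internal oscillators, can be made concrete with a minimal sketch. The sketch below is not the authors' model; every function name, weight and gain is an illustrative assumption introduced only to show the structure of the computation.

```python
# Minimal conceptual sketch of a forward-model loop for optic-flow control.
# NOT the authors' model: all function names, weights, and gains are
# illustrative assumptions.

def predicted_flow(efference_copy, proprioception, scene_gain):
    """Predict the rotational optic flow expected at one eye.

    The prediction combines the efference copy with proprioceptive feedback
    and is scaled by a gain reflecting the panorama's visual structure.
    """
    return scene_gain * (0.6 * efference_copy + 0.4 * proprioception)  # assumed weights

def prediction_error(perceived_flow, efference_copy, proprioception, scene_gain):
    """Mismatch between perceived and predicted flow, computed per eye."""
    return perceived_flow - predicted_flow(efference_copy, proprioception, scene_gain)

def oscillator_update(phase, error_left, error_right, gain=0.1):
    """Integrate the two monocular error signals to bias an internal
    left/right oscillator that drives steering."""
    return phase + gain * (error_left + error_right)

# Example: in a virtual-reality mismatch the perceived flow exceeds the
# prediction, so a nonzero error biases the oscillator (the ant corrects its turn).
err_left = prediction_error(perceived_flow=1.5, efference_copy=1.0,
                            proprioception=1.0, scene_gain=1.0)
err_right = prediction_error(perceived_flow=1.5, efference_copy=1.0,
                             proprioception=1.0, scene_gain=1.0)
new_phase = oscillator_update(phase=0.0, error_left=err_left, error_right=err_right)
print(new_phase)  # 0.1 under these illustrative numbers
```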
About the journal:
Nature Communications, an open-access journal, publishes high-quality research spanning all areas of the natural sciences. Papers featured in the journal showcase significant advances relevant to specialists in each respective field. With a 2-year impact factor of 16.6 (2022) and a median time of 8 days from submission to the first editorial decision, Nature Communications is committed to rapid dissemination of research findings. As a multidisciplinary journal, it welcomes contributions from biological, health, physical, chemical, Earth, social, mathematical, applied, and engineering sciences, aiming to highlight important breakthroughs within each domain.