Lateral Connections Improve Generalizability of Learning in a Simple Neural Network

IF 2.7 | CAS Region 4, Computer Science | JCR Q3, COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | Neural Computation | Pub Date: 2024-03-21 | DOI: 10.1162/neco_a_01640
Garrett Crutcher
{"title":"侧向连接提高了简单神经网络学习的通用性","authors":"Garrett Crutcher","doi":"10.1162/neco_a_01640","DOIUrl":null,"url":null,"abstract":"To navigate the world around us, neural circuits rapidly adapt to their environment learning generalizable strategies to decode information. When modeling these learning strategies, network models find the optimal solution to satisfy one task condition but fail when introduced to a novel task or even a different stimulus in the same space. In the experiments described in this letter, I investigate the role of lateral gap junctions in learning generalizable strategies to process information. Lateral gap junctions are formed by connexin proteins creating an open pore that allows for direct electrical signaling between two neurons. During neural development, the rate of gap junctions is high, and daughter cells that share similar tuning properties are more likely to be connected by these junctions. Gap junctions are highly plastic and get heavily pruned throughout development. I hypothesize that they mediate generalized learning by imprinting the weighting structure within a layer to avoid overfitting to one task condition. To test this hypothesis, I implemented a feedforward probabilistic neural network mimicking a cortical fast spiking neuron circuit that is heavily involved in movement. Many of these cells are tuned to speeds that I used as the input stimulus for the network to estimate. When training this network using a delta learning rule, both a laterally connected network and an unconnected network can estimate a single speed. However, when asking the network to estimate two or more speeds, alternated in training, an unconnected network either cannot learn speed or optimizes to a singular speed, while the laterally connected network learns the generalizable strategy and can estimate both speeds. These results suggest that lateral gap junctions between neurons enable generalized learning, which may help explain learning differences across life span.","PeriodicalId":54731,"journal":{"name":"Neural Computation","volume":null,"pages":null},"PeriodicalIF":2.7000,"publicationDate":"2024-03-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Lateral Connections Improve Generalizability of Learning in a Simple Neural Network\",\"authors\":\"Garrett Crutcher\",\"doi\":\"10.1162/neco_a_01640\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"To navigate the world around us, neural circuits rapidly adapt to their environment learning generalizable strategies to decode information. When modeling these learning strategies, network models find the optimal solution to satisfy one task condition but fail when introduced to a novel task or even a different stimulus in the same space. In the experiments described in this letter, I investigate the role of lateral gap junctions in learning generalizable strategies to process information. Lateral gap junctions are formed by connexin proteins creating an open pore that allows for direct electrical signaling between two neurons. During neural development, the rate of gap junctions is high, and daughter cells that share similar tuning properties are more likely to be connected by these junctions. Gap junctions are highly plastic and get heavily pruned throughout development. I hypothesize that they mediate generalized learning by imprinting the weighting structure within a layer to avoid overfitting to one task condition. 
To test this hypothesis, I implemented a feedforward probabilistic neural network mimicking a cortical fast spiking neuron circuit that is heavily involved in movement. Many of these cells are tuned to speeds that I used as the input stimulus for the network to estimate. When training this network using a delta learning rule, both a laterally connected network and an unconnected network can estimate a single speed. However, when asking the network to estimate two or more speeds, alternated in training, an unconnected network either cannot learn speed or optimizes to a singular speed, while the laterally connected network learns the generalizable strategy and can estimate both speeds. These results suggest that lateral gap junctions between neurons enable generalized learning, which may help explain learning differences across life span.\",\"PeriodicalId\":54731,\"journal\":{\"name\":\"Neural Computation\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":2.7000,\"publicationDate\":\"2024-03-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neural Computation\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10535096/\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Computation","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10535096/","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

To navigate the world around us, neural circuits rapidly adapt to their environment, learning generalizable strategies to decode information. When modeling these learning strategies, network models find the optimal solution to satisfy one task condition but fail when introduced to a novel task, or even to a different stimulus in the same space. In the experiments described in this letter, I investigate the role of lateral gap junctions in learning generalizable strategies to process information. Lateral gap junctions are formed by connexin proteins that create an open pore, allowing direct electrical signaling between two neurons. During neural development, the rate of gap junction formation is high, and daughter cells that share similar tuning properties are more likely to be connected by these junctions. Gap junctions are highly plastic and are heavily pruned throughout development. I hypothesize that they mediate generalized learning by imprinting the weighting structure within a layer, avoiding overfitting to one task condition. To test this hypothesis, I implemented a feedforward probabilistic neural network mimicking a cortical fast-spiking neuron circuit that is heavily involved in movement. Many of these cells are tuned to speed, which I used as the input stimulus for the network to estimate. When trained with a delta learning rule, both a laterally connected network and an unconnected network can estimate a single speed. However, when the network is asked to estimate two or more speeds, alternated during training, an unconnected network either cannot learn speed at all or optimizes for a single speed, while the laterally connected network learns the generalizable strategy and can estimate both speeds. These results suggest that lateral gap junctions between neurons enable generalized learning, which may help explain learning differences across the life span.
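
The abstract describes the model only at a high level: a population of speed-tuned units feeding a linear readout trained with a delta rule on alternating target speeds, with or without lateral coupling inside the layer. The Python sketch below is a minimal illustration of that setup under stated assumptions; the unit count, Gaussian tuning curves, and gap-junction-like coupling kernel are hypothetical choices for illustration, not the paper's implementation, and the toy is not guaranteed to reproduce the reported generalization effect.

import numpy as np

rng = np.random.default_rng(0)

# Population of speed-tuned input units with Gaussian tuning curves,
# loosely analogous to the speed-tuned fast-spiking cells in the abstract.
n_units = 50
preferred = np.linspace(0.0, 10.0, n_units)  # hypothetical preferred speeds
sigma = 1.5                                  # hypothetical tuning width

def population_response(speed):
    # Noisy population activity evoked by a stimulus moving at `speed`.
    r = np.exp(-0.5 * ((speed - preferred) / sigma) ** 2)
    return r + rng.normal(0.0, 0.05, n_units)

def lateral_coupling_matrix(strength=0.3, width=1.0):
    # Gap-junction-like coupling: each unit shares activity with similarly
    # tuned neighbours (assumed Gaussian falloff in preferred speed).
    d = np.abs(preferred[:, None] - preferred[None, :])
    c = strength * np.exp(-0.5 * (d / width) ** 2)
    np.fill_diagonal(c, 1.0)
    return c / c.sum(axis=1, keepdims=True)  # row-normalise the coupling

def train(speeds, coupling=None, steps=5000, lr=0.01):
    # Delta-rule training of a linear readout; the target speeds alternate
    # from step to step, as in the alternated-training condition.
    w = np.zeros(n_units)
    for step in range(steps):
        s = speeds[step % len(speeds)]
        r = population_response(s)
        if coupling is not None:
            r = coupling @ r           # spread activity laterally
        w += lr * (s - w @ r) * r      # delta rule: lr * error * input
    return w

def estimate(w, speed, coupling=None):
    r = population_response(speed)
    if coupling is not None:
        r = coupling @ r
    return w @ r

speeds = [3.0, 7.0]                    # two alternating target speeds
C = lateral_coupling_matrix()
w_plain, w_lateral = train(speeds), train(speeds, coupling=C)
for s in speeds:
    print(f"target {s}: unconnected -> {estimate(w_plain, s):.2f}, "
          f"lateral -> {estimate(w_lateral, s):.2f}")

In this toy, the row-normalised coupling smooths the input representation across similarly tuned units; the letter's hypothesis is that such within-layer sharing imprints the weighting structure enough to keep more than one speed decodable.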
Source Journal

Neural Computation (Engineering & Technology – Computer Science: Artificial Intelligence)
CiteScore: 6.30
Self-citation rate: 3.40%
Articles per year: 83
Review time: 3.0 months
Journal Introduction: Neural Computation is uniquely positioned at the crossroads between neuroscience and TMCS and welcomes the submission of original papers from all areas of TMCS, including: Advanced experimental design; Analysis of chemical sensor data; Connectomic reconstructions; Analysis of multielectrode and optical recordings; Genetic data for cell identity; Analysis of behavioral data; Multiscale models; Analysis of molecular mechanisms; Neuroinformatics; Analysis of brain imaging data; Neuromorphic engineering; Principles of neural coding, computation, circuit dynamics, and plasticity; Theories of brain function.
Latest Articles in This Journal

Associative Learning and Active Inference.
Deep Nonnegative Matrix Factorization with Beta Divergences.
KLIF: An Optimized Spiking Neuron Unit for Tuning Surrogate Gradient Function.
ℓ1-Regularized ICA: A Novel Method for Analysis of Task-Related fMRI Data.
Latent Space Bayesian Optimization With Latent Data Augmentation for Enhanced Exploration.