Quantifying information stored in synaptic connections rather than in firing patterns of neural networks.

ArXiv Pub Date: 2024-11-26
Xinhao Fan, Shreesh P Mysore
{"title":"量化存储在突触连接中的信息,而不是神经网络的放电模式。","authors":"Xinhao Fan, Shreesh P Mysore","doi":"","DOIUrl":null,"url":null,"abstract":"<p><p>A cornerstone of our understanding of both biological and artificial neural networks is that they store information in the strengths of connections among the constituent neurons. However, in contrast to the well-established theory for quantifying information encoded by the firing patterns of neural networks, little is known about quantifying information encoded by its synaptic connections. Here, we develop a theoretical framework using continuous Hopfield networks as an exemplar for associative neural networks, and data that follow mixtures of broadly applicable multivariate log-normal distributions. Specifically, we analytically derive the Shannon mutual information between the data and singletons, pairs, triplets, quadruplets, and arbitrary n-tuples of synaptic connections within the network. Our framework corroborates well-established insights about storage capacity of, and distributed coding by, neural firing patterns. Strikingly, it discovers synergistic interactions among synapses, revealing that the information encoded jointly by all the synapses exceeds the 'sum of its parts'. Taken together, this study introduces an interpretable framework for quantitatively understanding information storage in neural networks, one that illustrates the duality of synaptic connectivity and neural population activity in learning and memory.</p>","PeriodicalId":93888,"journal":{"name":"ArXiv","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-11-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11623702/pdf/","citationCount":"0","resultStr":"{\"title\":\"Quantifying information stored in synaptic connections rather than in firing patterns of neural networks.\",\"authors\":\"Xinhao Fan, Shreesh P Mysore\",\"doi\":\"\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>A cornerstone of our understanding of both biological and artificial neural networks is that they store information in the strengths of connections among the constituent neurons. However, in contrast to the well-established theory for quantifying information encoded by the firing patterns of neural networks, little is known about quantifying information encoded by its synaptic connections. Here, we develop a theoretical framework using continuous Hopfield networks as an exemplar for associative neural networks, and data that follow mixtures of broadly applicable multivariate log-normal distributions. Specifically, we analytically derive the Shannon mutual information between the data and singletons, pairs, triplets, quadruplets, and arbitrary n-tuples of synaptic connections within the network. Our framework corroborates well-established insights about storage capacity of, and distributed coding by, neural firing patterns. Strikingly, it discovers synergistic interactions among synapses, revealing that the information encoded jointly by all the synapses exceeds the 'sum of its parts'. 
Taken together, this study introduces an interpretable framework for quantitatively understanding information storage in neural networks, one that illustrates the duality of synaptic connectivity and neural population activity in learning and memory.</p>\",\"PeriodicalId\":93888,\"journal\":{\"name\":\"ArXiv\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-11-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11623702/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ArXiv\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ArXiv","FirstCategoryId":"1085","ListUrlMain":"","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

A cornerstone of our understanding of both biological and artificial neural networks is that they store information in the strengths of connections among the constituent neurons. However, in contrast to the well-established theory for quantifying information encoded by the firing patterns of neural networks, little is known about quantifying information encoded by their synaptic connections. Here, we develop a theoretical framework using continuous Hopfield networks as an exemplar for associative neural networks, and data that follow mixtures of broadly applicable multivariate log-normal distributions. Specifically, we analytically derive the Shannon mutual information between the data and singletons, pairs, triplets, quadruplets, and arbitrary n-tuples of synaptic connections within the network. Our framework corroborates well-established insights about the storage capacity of, and distributed coding by, neural firing patterns. Strikingly, it discovers synergistic interactions among synapses, revealing that the information encoded jointly by all the synapses exceeds the 'sum of its parts'. Taken together, this study introduces an interpretable framework for quantitatively understanding information storage in neural networks, one that illustrates the duality of synaptic connectivity and neural population activity in learning and memory.
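As a reading aid, the central quantity named in the abstract can be written in standard information-theoretic notation. The symbols below are illustrative shorthand, not the paper's own notation: D denotes the data, w_ij a single synaptic weight, and W_S an n-tuple of weights indexed by a set S of connections. The Shannon mutual information between the data and such a tuple is

    I(D;\, W_S) \;=\; H(W_S) \;-\; H(W_S \mid D)

and the reported synergy is the statement that all synapses W taken jointly encode more information about the data than the single synapses contribute separately:

    I(D;\, W) \;>\; \sum_{(i,j)} I(D;\, w_{ij})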
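The paper's exemplar system, a continuous Hopfield network, is standard enough to sketch in a few lines. The snippet below is a minimal textbook-style implementation under our own assumptions (Hebbian storage, a tanh nonlinearity, Euler integration, a fixed gain), not the authors' model or analysis; the log-normal pattern entries merely echo the distributional family the abstract mentions.

import numpy as np

# Minimal continuous Hopfield sketch (textbook form, not the paper's model):
# states evolve as dx/dt = -x + GAIN * W @ tanh(x), with Hebbian weights
# built from stored patterns.

rng = np.random.default_rng(0)
N, P = 100, 5    # neurons, stored patterns
GAIN = 4.0       # feedback gain, large enough that patterns become attractors

# Stand-in for the paper's data model: log-normal entries (the paper uses
# mixtures of multivariate log-normal distributions), centered for storage.
patterns = rng.lognormal(mean=0.0, sigma=0.5, size=(P, N))
patterns -= patterns.mean(axis=1, keepdims=True)

W = (patterns.T @ patterns) / N    # Hebbian synaptic matrix
np.fill_diagonal(W, 0.0)           # no self-connections

def recall(x0, steps=500, dt=0.1):
    """Euler-integrate dx/dt = -x + GAIN * W @ tanh(x) from state x0."""
    x = x0.copy()
    for _ in range(steps):
        x += dt * (-x + GAIN * (W @ np.tanh(x)))
    return x

# Cue with a noisy version of the first stored pattern; after relaxation,
# the overlap with that pattern should dominate the others.
cue = patterns[0] + 0.3 * rng.standard_normal(N)
overlaps = patterns @ np.tanh(recall(cue)) / N
print(np.round(overlaps, 3))

Running this prints five overlap values; with these sizes the first (cued) entry is typically the largest, which is the associative-recall behavior the paper takes as its starting point.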
