Reverse engineering neural networks from many partial recordings

E. Arani, Sofia Triantafillou, Konrad Paul Kording
DOI: 10.32470/CCN.2018.1037-0
Journal: arXiv: Neurons and Cognition
Published: 2019-07-02 (Journal Article)
Citations: 0

Abstract

Much of neuroscience aims at reverse engineering the brain, but we only record a small number of neurons at a time. We do not currently know if reverse engineering the brain requires us to simultaneously record most neurons, or if multiple recordings from smaller subsets suffice. This question is made even more important by the development of novel techniques that allow recording from selected subsets of neurons, e.g. using optical techniques. To get at this question, we analyze a neural network, trained on the MNIST dataset, using only partial recordings, and characterize how the quality of our reverse engineering depends on the number of simultaneously recorded "neurons". We find that reverse engineering of the nonlinear neural network is meaningfully possible if a sufficiently large number of neurons is simultaneously recorded, but that this number can be considerably smaller than the total number of neurons. Moreover, recording many times from small random subsets of neurons yields surprisingly good performance. Application in neuroscience suggests that, to approximate the I/O function of an actual neural system, we need to record from a much larger number of neurons. The kind of scaling analysis we perform here can, and arguably should, be used to calibrate approaches that can dramatically scale up the size of recorded data sets in neuroscience.
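The recording scheme the abstract describes can be sketched in a minimal toy model. This is an illustrative assumption, not the paper's method: the paper reverse engineers a nonlinear MNIST-trained network, whereas the sketch below uses a single linear layer so that pooling many partial recordings provably recovers each "neuron's" input weights by per-neuron least squares.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_neurons, n_trials, n_recorded = 20, 50, 400, 10

W_true = rng.normal(size=(n_neurons, n_in))   # ground-truth connectivity
X = rng.normal(size=(n_trials, n_in))         # stimulus on each trial
H = X @ W_true.T                              # responses of all neurons

# On each trial, only `n_recorded` randomly chosen neurons are observed.
mask = np.zeros((n_trials, n_neurons), dtype=bool)
for t in range(n_trials):
    mask[t, rng.choice(n_neurons, size=n_recorded, replace=False)] = True

# Reverse engineer neuron i using only the trials where it was recorded.
W_est = np.zeros_like(W_true)
for i in range(n_neurons):
    obs = mask[:, i]                          # ~80 of 400 trials on average
    W_est[i], *_ = np.linalg.lstsq(X[obs], H[obs, i], rcond=None)

err = float(np.abs(W_est - W_true).max())
print(err)
```

Even though only 10 of 50 neurons are recorded per trial, each neuron accumulates roughly 80 observations across the 400 trials, which is more than the 20 unknown weights per neuron, so recovery succeeds. This mirrors the abstract's point that many recordings from small random subsets can substitute for recording everything at once.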