A general-purpose organic gel computer that learns by itself

Pathik Sahoo, Pushpendra Singh, Komal Saxena, Subrata Ghosh, Ravindra P. Singh, R. Benosman, Jonathan P. Hill, Tomonobu Nakayama, A. Bandyopadhyay
Neuromorphic Computing and Engineering, published 2023-11-27. DOI: 10.1088/2634-4386/ad0fec

Abstract

To build energy-minimized superstructures, self-assembling molecules explore an astronomical number of options, colliding at ∼10⁹ molecules s⁻¹. Thus far, no computer has fully exploited this process to optimize choices and execute advanced computational theories solely by synthesizing supramolecules. To realize this, we first remotely rewrote the problem in a language that supramolecular synthesis comprehends. An all-chemical neural network then synthesizes one helical nanowire for each periodic event. These nanowires self-assemble into gel fibers that map intricate relations between periodic events in any data type, and the output is read instantly from an optical hologram. For each problem, the number of self-assembling layers, i.e. the neural network depth, is optimized to chemically simulate theories that discover invariants for learning. Synthesis alone then solves classification and feature-learning problems instantly with single-shot training. The reusable gel opens the way to general-purpose computing that chemically invents suitable models for problem-specific unsupervised learning. Irrespective of problem complexity, computing time and power remain fixed, and the gel promises a toxic-hardware-free world.

One-sentence summary: fractally coupled deep learning networks revisit Rosenblatt's 1950s theorem on deep learning networks.
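The abstract's "one helical nanowire for one periodic event" and "single-shot training" can be loosely pictured, in conventional software terms, as a classifier that stores exactly one prototype per class from a single example and assigns new inputs to the nearest prototype. The sketch below is purely an analogy under that assumption — the paper's computer is chemical and holographic, not digital, and all names here are hypothetical, not from the paper.

```python
# Toy analogy of single-shot training: one stored prototype per class
# (loosely analogous to one nanowire per periodic event). Not the
# authors' method; a hypothetical software illustration only.
import math

def train_single_shot(examples):
    """Store one prototype vector per class, each from a single example."""
    return {label: vec for label, vec in examples.items()}

def classify(prototypes, x):
    """Return the class whose prototype is nearest (Euclidean) to x."""
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    return min(prototypes, key=lambda label: dist(prototypes[label], x))

# One example per class is the entire "training" step.
prototypes = train_single_shot({"A": (0.0, 0.0), "B": (1.0, 1.0)})
print(classify(prototypes, (0.1, 0.2)))  # nearest to class "A"
```

In this analogy, "training" is a single synthesis-like step with no iterative weight updates, which is what makes the computing time independent of problem complexity in the digital caricature; the chemical system, per the abstract, achieves this by physically self-assembling the mapping.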