Brains as naturally emerging Turing machines

J. Weng
{"title":"Brains as naturally emerging turing machines","authors":"J. Weng","doi":"10.1109/IJCNN.2015.7280838","DOIUrl":null,"url":null,"abstract":"It has been shown that a Developmental Network (DN) can learn any Finite Automaton (FA) [29] but FA is not a general purpose automaton by itself. This theoretical paper presents that the controller of any Turing Machine (TM) is equivalent to an FA. It further models a motivation-free brain - excluding motivation e.g., emotions - as a TM inside a grounded DN - DN with the real world. Unlike a traditional TM, the TM-in-DN uses natural encoding of input and output and uses emergent internal representations. In Artificial Intelligence (AI) there are two major schools, symbolism and connectionism. The theoretical result here implies that the connectionist school is at least as powerful as the symbolic school also in terms of the general-purpose nature of TM. Furthermore, any TM simulated by the DN is grounded and uses natural encoding so that the DN autonomously learns any TM directly from natural world without a need for a human to encode its input and output. This opens the door for the DN to fully autonomously learn any TM, from a human teacher, reading a book, or real world events. The motivated version of DN [31] further enables a DN to go beyond action-supervised learning - so as to learn based on pain-avoidance, pleasure seeking, and novelty seeking [31].","PeriodicalId":6539,"journal":{"name":"2015 International Joint Conference on Neural Networks (IJCNN)","volume":"11 1","pages":"1-8"},"PeriodicalIF":0.0000,"publicationDate":"2015-07-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 International Joint Conference on Neural Networks (IJCNN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.2015.7280838","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

It has been shown that a Developmental Network (DN) can learn any Finite Automaton (FA) [29], but an FA is not a general-purpose automaton by itself. This theoretical paper shows that the controller of any Turing Machine (TM) is equivalent to an FA. It further models a motivation-free brain (excluding motivation, e.g., emotions) as a TM inside a grounded DN, i.e., a DN interacting with the real world. Unlike a traditional TM, the TM-in-DN uses natural encoding of input and output and uses emergent internal representations. In Artificial Intelligence (AI) there are two major schools, symbolism and connectionism. The theoretical result here implies that the connectionist school is at least as powerful as the symbolic school, also in terms of the general-purpose nature of the TM. Furthermore, any TM simulated by the DN is grounded and uses natural encoding, so the DN autonomously learns any TM directly from the natural world without a need for a human to encode its input and output. This opens the door for the DN to fully autonomously learn any TM from a human teacher, from reading a book, or from real-world events. The motivated version of the DN [31] further enables a DN to go beyond action-supervised learning, so as to learn based on pain avoidance, pleasure seeking, and novelty seeking [31].
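The abstract's central step, that a TM's controller is just an FA, can be made concrete with a short sketch. The Python example below is not from the paper; the names (CONTROLLER, run_tm) and the binary-increment machine are illustrative assumptions. It shows that the controller is a finite transition table over finite states and symbols, which is the kind of finite-state mapping the paper argues a DN can learn, while the unbounded part of the computation lives only on the tape.

```python
# A minimal sketch (not from the paper): a Turing Machine whose controller is
# explicitly a finite transition table, i.e. a Finite Automaton. The machine
# increments a binary number written on the tape.

from collections import defaultdict

# Controller = FA transition table: (state, read symbol) -> (write, move, next state).
# Both the state set and the tape alphabet are finite; only the tape is unbounded.
CONTROLLER = {
    ("seek_end", "0"): ("0", +1, "seek_end"),
    ("seek_end", "1"): ("1", +1, "seek_end"),
    ("seek_end", "_"): ("_", -1, "carry"),   # hit the blank: move left, start carrying
    ("carry",    "1"): ("0", -1, "carry"),   # 1 + carry -> 0, keep carrying
    ("carry",    "0"): ("1",  0, "halt"),    # 0 + carry -> 1, done
    ("carry",    "_"): ("1",  0, "halt"),    # overflow into a new leading 1
}

def run_tm(tape_str, state="seek_end", blank="_", max_steps=10_000):
    """Simulate the TM; each step only consults the finite controller."""
    tape = defaultdict(lambda: blank, enumerate(tape_str))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        write, move, state = CONTROLLER[(state, tape[head])]
        tape[head] = write
        head += move
    lo, hi = min(tape), max(tape)
    return "".join(tape[i] for i in range(lo, hi + 1)).strip(blank)

if __name__ == "__main__":
    print(run_tm("1011"))  # prints "1100" (11 + 1 = 12 in binary)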