
Proceedings of the 9th International Conference on Neural Information Processing, 2002 (ICONIP '02): Latest Literature

Mapping from the spike domain to the rate-based domain
G. Hernández, P. Munro, J. Rubin
The dependence of synaptic plasticity on both presynaptic activity and postsynaptic activity, as postulated by Hebb, has been clearly and repeatedly demonstrated in the laboratory. Traditionally, "activity" has been measured by counting spikes in a short time window to get an average firing rate. Recent experiments reveal functional synaptic changes that depend on the precise timing of individual pairs of spikes (one presynaptic and one postsynaptic). Here, the emergence of rate-based learning rules from spike-based dependencies is introduced, through the idea of a rate map in synaptic weight space.
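The idea sketched in this abstract, recovering a rate-dependent learning rule by averaging a pair-based spike-timing rule over spike statistics, can be illustrated numerically. The following is a minimal sketch under my own assumptions (independent Poisson pre- and postsynaptic trains, an exponential STDP window with illustrative parameters a_plus, a_minus, tau); it is not the authors' rate-map construction in weight space.

```python
import numpy as np

rng = np.random.default_rng(0)

def poisson_train(rate_hz, duration_s, dt=1e-3):
    """Spike times (in seconds) of a Poisson process at the given rate."""
    n_bins = int(duration_s / dt)
    spikes = rng.random(n_bins) < rate_hz * dt
    return np.nonzero(spikes)[0] * dt

def stdp_weight_change(pre, post, a_plus=0.01, a_minus=0.012, tau=0.02):
    """Sum an exponential pair-based STDP window over all pre/post spike pairs."""
    dw = 0.0
    for t_post in post:
        dt_pairs = t_post - pre                                       # >0: pre before post
        dw += a_plus * np.exp(-dt_pairs[dt_pairs > 0] / tau).sum()    # potentiation
        dw -= a_minus * np.exp(dt_pairs[dt_pairs < 0] / tau).sum()    # depression
    return dw

def expected_dw(rate_pre, rate_post, duration=1.0, trials=200):
    """Average spike-based weight change as a function of the two firing rates."""
    changes = [stdp_weight_change(poisson_train(rate_pre, duration),
                                  poisson_train(rate_post, duration))
               for _ in range(trials)]
    return float(np.mean(changes))

# Tabulating expected_dw over a grid of rates gives a rate-based view of the
# underlying spike-based rule.
for r_pre in (5, 20, 50):
    for r_post in (5, 20, 50):
        print(r_pre, r_post, round(expected_dw(r_pre, r_post), 4))
```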
{"title":"Mapping from the spike domain to the rate-based domain","authors":"G. Hernández, P. Munro, J. Rubin","doi":"10.1109/ICONIP.2002.1198982","DOIUrl":"https://doi.org/10.1109/ICONIP.2002.1198982","url":null,"abstract":"The dependence of synaptic plasticity on both presynaptic activity and postsynaptic activity, as postulated by Hebb, has been clearly and repeatedly demonstrated in the laboratory. Traditionally, \"activity\" has been measured by counting spikes in a short time window to get an average firing rate. Recent experiments reveal functional synaptic changes that depend on the precise timing of individual pairs of spikes (one presynaptic and one postsynaptic). Here, the emergence of rate-based learning rules from spike-based dependencies is introduced, through the idea of a rate map in synaptic weight space.","PeriodicalId":146553,"journal":{"name":"Proceedings of the 9th International Conference on Neural Information Processing, 2002. ICONIP '02.","volume":"25 4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122616089","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
Convergence of the symmetrical FastICA algorithm
E. Oja
The FastICA algorithm is one of the most popular methods to solve problems in independent component analysis (ICA) and blind source separation. It has been shown experimentally that it outperforms most of the commonly used ICA algorithms in convergence speed. A rigorous convergence analysis has been presented only for the so-called one-unit case, in which just one of the rows of the separating matrix is considered. However, in the FastICA algorithm, there is also an explicit normalization step, and it may be questioned whether the extra rotation caused by the normalization will affect the convergence speed. The purpose of this paper is to show that this is not the case and the good convergence properties of the one-unit case are also shared by the full algorithm with symmetrical normalization.
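For reference, the symmetric scheme analysed here updates every row of the separating matrix with the one-unit fixed-point rule and then re-orthogonalizes the whole matrix via W <- (W W^T)^(-1/2) W, instead of deflating one row at a time. Below is a minimal numpy sketch under my own choices (whitening step, tanh contrast, fixed iteration budget, toy two-source demo).

```python
import numpy as np

def whiten(x):
    """Center and whiten the mixed signals (rows = components, columns = samples)."""
    x = x - x.mean(axis=1, keepdims=True)
    d, e = np.linalg.eigh(np.cov(x))
    return e @ np.diag(d ** -0.5) @ e.T @ x

def sym_decorrelate(w):
    """Symmetric orthogonalization: W <- (W W^T)^(-1/2) W."""
    s, u = np.linalg.eigh(w @ w.T)
    return u @ np.diag(s ** -0.5) @ u.T @ w

def fastica_symmetric(x, n_iter=200, tol=1e-8):
    """All rows are updated in parallel, then re-orthogonalized symmetrically."""
    z = whiten(x)
    n, m = z.shape
    w = sym_decorrelate(np.random.default_rng(0).standard_normal((n, n)))
    for _ in range(n_iter):
        wz = w @ z
        g, g_prime = np.tanh(wz), 1.0 - np.tanh(wz) ** 2
        w_new = sym_decorrelate((g @ z.T) / m - np.diag(g_prime.mean(axis=1)) @ w)
        delta = np.max(np.abs(np.abs(np.diag(w_new @ w.T)) - 1.0))
        w = w_new
        if delta < tol:
            break
    return w @ z, w        # estimated sources and separating matrix

# Toy demo: unmix two artificially mixed signals.
t = np.linspace(0, 8, 2000)
s = np.vstack([np.sin(2 * t), np.sign(np.cos(3 * t))])
a = np.array([[1.0, 0.6], [0.4, 1.0]])       # mixing matrix
sources, _ = fastica_symmetric(a @ s)
```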
{"title":"Convergence of the symmetrical FastICA algorithm","authors":"E. Oja","doi":"10.1109/ICONIP.2002.1202844","DOIUrl":"https://doi.org/10.1109/ICONIP.2002.1202844","url":null,"abstract":"The FastICA algorithm is one of the most popular methods to solve problems in independent component analysis (ICA) and blind source separation. It has been shown experimentally that it outperforms most of the commonly used ICA algorithms in convergence speed. A rigorous convergence analysis has been presented only for the so-called one-unit case, in which just one of the rows of the separating matrix is considered. However, in the FastICA algorithm, there is also an explicit normalization step, and it may be questioned whether the extra rotation caused by the normalization will effect the convergence speed. The purpose of this paper is to show that this is not the case and the good convergence properties of the one-unit case are also shared by the full algorithm with symmetrical normalization.","PeriodicalId":146553,"journal":{"name":"Proceedings of the 9th International Conference on Neural Information Processing, 2002. ICONIP '02.","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131121839","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 48
An experimental comparison of recurrent neural network for natural language production
H. Nakagama, S. Tanaka
We study the performance of three types of recurrent neural networks (RNNs) for the production of natural language sentences: Simple Recurrent Networks (SRN), Back-Propagation Through Time (BPTT) and Sequential Recursive Auto-Associative Memory (SRAAM). We used simple and complex grammars to compare their ability to learn and to scale up. Among them, SRAAM is found to have the highest training performance and to produce fairly complex, long sentences.
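As a point of reference for the first of the three architectures, an Elman-style Simple Recurrent Network keeps a copy of the previous hidden state and feeds it back as context input at the next time step. The sketch below is a minimal, untrained forward pass under my own assumptions (one-hot word coding, tanh hidden units, softmax output); it is not the configuration used in the paper.

```python
import numpy as np

class SimpleRecurrentNetwork:
    """Elman-style SRN: the previous hidden state is fed back as context input."""

    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.w_in = rng.normal(0.0, 0.1, (n_hidden, n_in))
        self.w_ctx = rng.normal(0.0, 0.1, (n_hidden, n_hidden))
        self.w_out = rng.normal(0.0, 0.1, (n_out, n_hidden))
        self.context = np.zeros(n_hidden)

    def step(self, x):
        # The new hidden state depends on the current input and the stored context.
        h = np.tanh(self.w_in @ x + self.w_ctx @ self.context)
        self.context = h
        logits = self.w_out @ h
        probs = np.exp(logits - logits.max())
        return probs / probs.sum()        # distribution over the next word

# Feed a one-hot-coded word sequence through the (untrained) network.
vocab = ["the", "cat", "sees", "a", "dog"]
srn = SimpleRecurrentNetwork(n_in=len(vocab), n_hidden=8, n_out=len(vocab))
for word in ["the", "cat", "sees"]:
    p_next = srn.step(np.eye(len(vocab))[vocab.index(word)])
```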
{"title":"An experimental comparison of recurrent neural network for natural language production","authors":"H. Nakagama, S. Tanaka","doi":"10.1109/ICONIP.2002.1198155","DOIUrl":"https://doi.org/10.1109/ICONIP.2002.1198155","url":null,"abstract":"We study the performance of three types of recurrent neural networks (RNN) for the production of natural language sentences: Simple Recurrent Networks (SRN), Back-Propagation Through Time (BPTT) and Sequential Recursive Auto-Associative Memory (SRAAM). We used simple and complex grammars to compare the ability of learning and being scaled up. Among them, SRAAM is found to have highest performance of training and producing fairly complex and long sentences.","PeriodicalId":146553,"journal":{"name":"Proceedings of the 9th International Conference on Neural Information Processing, 2002. ICONIP '02.","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131460286","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Organization of inhibitory synaptic circuits in layer 4 of ferret visual cortex related to direction preference maps
B. Roerig, B. Chen, J. Kao
Simple cells in layer 4 of the primary visual cortex are the first neurons in the visual pathway showing orientation and direction selective responses. The precise role of intracortical excitatory and inhibitory connections in generating these properties is still unclear. Intracortical inhibitory processes have been shown to be crucial to the generation of direction selective responses. In vivo, excitatory and inhibitory layer 4 cells differ in their receptive field properties: excitatory (regular spiking) neurons are orientation- and direction selective whereas inhibitory (fast spiking) neurons are orientation-, but poorly direction tuned. This difference in direction tuning could be due to differences in intracortical inhibitory synaptic input patterns. To address this question we have optically recorded orientation and direction maps from ferret primary visual cortex. Subsequently the imaged brain region was removed and tangential slices prepared. Whole cell patch clamp recordings from individual layer 4 neurons were done and synaptic inputs were scanned by local photolysis of caged glutamate. Postsynaptic cells were filled with biocytin and histological sections were aligned with the synaptic input maps and the optical images obtained in vivo to determine the spatial distribution of presynaptic inputs. The majority (68%) of excitatory inputs to both spiny (excitatory) and aspiny (inhibitory) stellate cells originated from cortical regions preferring the same orientation and direction as the postsynaptic cell. However, the inhibitory input patterns were significantly different for the two cell populations: excitatory layer 4 cells received two populations of inhibitory inputs, about 50% originated in iso-direction domains whereas the remaining inputs originated in cortical regions preferring the opposite direction of stimulus motion. This indicates that specific inhibitory connections originating in regions tuned to the opposite direction are important for direction tuning of cortical neurons and that differences in response properties in different populations of cortical neurons might be explained by their different intracortical connectivity patterns.
{"title":"Organization of inhibitory synaptic circuits in layer 4 of ferret visual cortex related to direction preference maps","authors":"B. Roerig, B. Chen, J. Kao","doi":"10.1109/ICONIP.2002.1202122","DOIUrl":"https://doi.org/10.1109/ICONIP.2002.1202122","url":null,"abstract":"Simple cells in layer 4 of the primary visual cortex are the first neurons in the visual pathway showing orientation and direction selective responses. The precise role of intracortical excitatory and inhibitory connections in generating these properties is still unclear. Intracortical inhibitory processes have been shown to be crucial to the generation of direction selective responses. In vivo, excitatory and inhibitory layer 4 cells differ in their receptive field properties: excitatory (regular spiking) neurons are orientation- and direction selective whereas inhibitory (fast spiking) neurons are orientation-, but poorly direction tuned. This difference in direction tuning could be due to differences in intracortical inhibitory synaptic input patterns. To address this question we have optically recorded orientation and direction maps from ferret primary visual cortex. Subsequently the imaged brain region was removed and tangential slices prepared. Whole cell patch clamp recordings from individual layer 4 neurons were done and synaptic inputs were scanned by local photolysis of caged glutamate. Postsynaptic cells were filled with biocytin and histological sections were aligned with the synaptic input maps and the optical images obtained in vivo to determine the spatial distribution of presynaptic inputs. The majority (68%) of excitatory inputs to both spiny (excitatory) and aspiny (inhibitory) stellate cells originated from cortical regions preferring the same orientation and direction as the postsynaptic cell. However, the inhibitory input patterns were significantly different for the two cell populations: excitatory layer 4 cells received two populations of inhibitory inputs, about 50% originated in iso-direction domains whereas the remaining inputs originated in cortical regions preferring the opposite direction of stimulus motion. This indicates that specific inhibitory connections originating in regions tuned to the opposite direction are important for direction tuning of cortical neurons and that differences in response properties in different populations of cortical neurons might be explained by their different intracortical connectivity patterns.","PeriodicalId":146553,"journal":{"name":"Proceedings of the 9th International Conference on Neural Information Processing, 2002. ICONIP '02.","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131492138","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
A novel artificial neural network trained using evolutionary algorithms for reinforcement learning
A. Reddipogu, G. Maxwell, C. MacLeod, M. Simpson
This paper discusses the development of a novel pattern recognition system using artificial neural networks (ANNs) and evolutionary algorithms for reinforcement learning (EARL). The network is based on the neuronal interactions involved in the identification of prey and predators in toads. The distributed neural network (DNN) is capable of recognizing and classifying various features. The lateral inhibition between the output neurons helps the network in the classification process, similar to the gate in a gating network. The results obtained are compared with standard neural network architectures and learning algorithms.
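The two mechanisms named here, lateral inhibition among the output neurons and evolutionary weight search in place of gradient training, can be sketched generically. The code below is a toy illustration under my own assumptions (winner-take-all inhibition, a simple (1+1)-style mutation loop, a correct-classification reward); it is not the EARL system or the toad-inspired DNN itself.

```python
import numpy as np

rng = np.random.default_rng(1)

def forward(w_hidden, w_out, x):
    """Small feedforward net whose output units compete via lateral inhibition."""
    h = np.tanh(w_hidden @ x)
    o = w_out @ h
    winner = np.argmax(o)          # winner-take-all: inhibit every other output
    out = np.zeros_like(o)
    out[winner] = 1.0
    return out

def fitness(params, data, labels):
    """Reinforcement-style reward: +1 for every correctly classified pattern."""
    w_hidden, w_out = params
    preds = np.array([np.argmax(forward(w_hidden, w_out, x)) for x in data])
    return float(np.sum(preds == labels))

# Toy two-class data set.
data = rng.normal(0.0, 1.0, (40, 3))
labels = (data[:, 0] + data[:, 1] > 0).astype(int)

# (1+1)-style evolutionary loop: mutate the weights, keep the better candidate.
best = (rng.normal(0.0, 1.0, (5, 3)), rng.normal(0.0, 1.0, (2, 5)))
best_fit = fitness(best, data, labels)
for generation in range(200):
    child = tuple(w + rng.normal(0.0, 0.1, w.shape) for w in best)
    child_fit = fitness(child, data, labels)
    if child_fit >= best_fit:
        best, best_fit = child, child_fit
```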
{"title":"A novel artificial neural network trained using evolutionary algorithms for reinforcement learning","authors":"A. Reddipogu, G. Maxwell, C. MacLeod, M. Simpson","doi":"10.1109/ICONIP.2002.1199013","DOIUrl":"https://doi.org/10.1109/ICONIP.2002.1199013","url":null,"abstract":"This paper discusses the development of a novel pattern recognition system using artificial neural networks (ANNs) and evolutionary algorithms for reinforcement learning (EARL). The network is based on neuronal interactions involved in identification of prey and predator in toads. The distributed neural network (DNN) is capable of recognizing and classifying various features. The lateral inhibition between the output neurons helps the network in the classification process - similar to the gate in gating network. The results obtained are compared with standard neural network architectures and learning algorithms.","PeriodicalId":146553,"journal":{"name":"Proceedings of the 9th International Conference on Neural Information Processing, 2002. ICONIP '02.","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127593289","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
Storage and recall of dynamical patterns in neural network models of hippocampus
T. Horiguchi, H. Yokoyama
We propose a four-layered neural network model and a five-layered neural network model of the hippocampal system by extending the three-layered model of Araki and Aihara (1998), in order to introduce the effect of auto-recurrent connections in CA3 and the effect of cholinergic modulation of CA1 and CA3 from the medial septum. We investigate the storage and recall of dynamical patterns in the proposed models with and without inhibitory connections in CA3. We identify two different sequential patterns whose storage and recall succeed for the proposed models but not for the original three-layered model. We discuss the effect of acetylcholine transmitted from the medial septum on neurons in CA1 and CA3.
{"title":"Storage and recall of dynamical patterns in neural network models of hippocampus","authors":"T. Horiguchi, H. Yokoyama","doi":"10.1109/ICONIP.2002.1202190","DOIUrl":"https://doi.org/10.1109/ICONIP.2002.1202190","url":null,"abstract":"We propose a four-layered neural network model and a five-layered neural network model for the hippocampal system by extending the three-layered model given by Araki and Aihara (1998), in order to introduce an effect of auto-recurrent connections in CA3 and also an effect of cholinergic modulation in CA1 and CA3 from the medial septum. We investigate the storage and the recall of dynamical patterns for the proposed models with or without inhibitory connections in CA3. We clarify two different sequential patterns that the storage and the recall succeeded for the proposed models, but not for the original three-layered model. We discuss an effect of acetylcholine to neurons in CA1 and CA3 transmitted from the medial septum.","PeriodicalId":146553,"journal":{"name":"Proceedings of the 9th International Conference on Neural Information Processing, 2002. ICONIP '02.","volume":"248 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133515394","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
On information characteristics of sparsely encoded binary auto-associative memory
A. Frolov, D. Rachkovskij, D. Húsek
A sparsely encoded Willshaw-like attractor neural network based on binary Hebbian synapses is investigated analytically and by computer simulations. A special inhibition mechanism which supports a constant number of active neurons at each time step is used. Informational capacity and size of attraction basins are evaluated for the single-step and the Gibson-Robinson approximations, as well as for experimental results.
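A Willshaw-type binary auto-associative memory with a fixed number k of active neurons can be written down in a few lines: storage clips a Hebbian outer product to binary synapses, and recall keeps only the k most strongly driven neurons at each step, which plays the role of the inhibition mechanism described above. The sketch below uses my own illustrative sizes (n, k, number of stored patterns) and is a simulation sketch, not the paper's analytical treatment.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, n_patterns = 1000, 20, 50      # neurons, active units per pattern, stored patterns

def sparse_pattern():
    p = np.zeros(n, dtype=np.uint8)
    p[rng.choice(n, size=k, replace=False)] = 1
    return p

patterns = [sparse_pattern() for _ in range(n_patterns)]

# Binary Hebbian storage: a synapse is switched on if its pre- and postsynaptic
# neurons are ever co-active in a stored pattern.
w = np.zeros((n, n), dtype=np.uint8)
for p in patterns:
    w |= np.outer(p, p)

def recall(cue, steps=3):
    """Iterative recall with k-winners-take-all inhibition at every time step."""
    x = cue.copy()
    for _ in range(steps):
        drive = w @ x.astype(np.int64)
        winners = np.argsort(drive)[-k:]       # keep exactly k active neurons
        x = np.zeros(n, dtype=np.uint8)
        x[winners] = 1
    return x

# Cue = a stored pattern with half of its active bits deleted.
target = patterns[0]
cue = target.copy()
cue[np.nonzero(target)[0][: k // 2]] = 0
overlap = (recall(cue) & target).sum() / k     # fraction of the target recovered
print(overlap)
```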
{"title":"On information characteristics of sparsely encoded binary auto-associative memory","authors":"A. Frolov, D. Rachkovskij, D. Húsek","doi":"10.1109/ICONIP.2002.1202168","DOIUrl":"https://doi.org/10.1109/ICONIP.2002.1202168","url":null,"abstract":"A sparsely encoded Willshaw-like attractor neural network based on binary Hebbian synapses is investigated analytically and by computer simulations. A special inhibition mechanism which supports a constant number of active neurons at each time step is used. Informational capacity and size of attraction basins are evaluated for the single-step and the Gibson-Robinson approximations, as well as for experimental results.","PeriodicalId":146553,"journal":{"name":"Proceedings of the 9th International Conference on Neural Information Processing, 2002. ICONIP '02.","volume":"66 15","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133587819","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
Fuzzy mean point clustering neural network
P. Patil, U. Kulkarni, T. Sontakke
A fuzzy mean point clustering neural network (FMPCNN), which utilizes fuzzy sets as pattern clusters, is proposed together with its learning algorithm. When verified on the Fisher Iris data, the performance of FMPCNN is found to be superior to that of Simpson's fuzzy min-max neural network and the fuzzy hyperline segment clustering neural network (FHLSCNN) proposed by Kulkarni and Sontakke.
{"title":"Fuzzy mean point clustering neural network","authors":"P. Patil, U. Kulkarni, T. Sontakke","doi":"10.1109/ICONIP.2002.1198184","DOIUrl":"https://doi.org/10.1109/ICONIP.2002.1198184","url":null,"abstract":"Fuzzy mean point clustering neural network (FMPCNN) is proposed with its learning algorithm, which utilizes fuzzy sets as pattern clusters. The performance of FMPCNN when verified with Fisher Iris data, it is found superior to Simpson's fuzzy min-max neural network and fuzzy hyperline segment clustering neural network (FHLSCNN) proposed by Kulkarni and Sontakke.","PeriodicalId":146553,"journal":{"name":"Proceedings of the 9th International Conference on Neural Information Processing, 2002. ICONIP '02.","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133763778","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 5
An approach to control aging rate of neural networks under adaptation to gradually changing context
T. Tanprasert, T. Kripruksawan
The paper presents a decayed prior sampling algorithm for integrating the existing knowledge of a supervised-learning neural network with new training data. The algorithm allows the existing knowledge to age out at a slow rate as the network is gradually retrained on consecutive sets of new samples, resembling a change of application locality within a consistent environment. The experiments are performed on a 2-dimensional partition problem, and the results convincingly confirm the effectiveness of the technique.
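One way to read the description above is as a pseudo-rehearsal scheme: before each retraining round, query the existing network on random inputs to obtain "prior" samples, mix them with the new data, and shrink the prior share geometrically so that the old knowledge ages out slowly. The sketch below is my interpretation under those assumptions (scikit-learn's MLPRegressor, a drifting toy target, decay factor 0.7); it is not the authors' decayed prior sampling algorithm as published.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def pseudo_samples(net, n, n_inputs):
    """Query the existing network on random inputs to capture its current knowledge."""
    x = rng.uniform(-1.0, 1.0, (n, n_inputs))
    return x, net.predict(x)

def target(x, shift):
    """Gradually changing context: the function to be learned drifts with `shift`."""
    return np.sin(3.0 * x[:, 0] + shift) + 0.5 * x[:, 1]

# Initial training on the original context.
net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
x0 = rng.uniform(-1.0, 1.0, (200, 2))
net.fit(x0, target(x0, shift=0.0))

decay = 0.7            # controls how quickly the old knowledge ages out
prior_share = 1.0
for step in range(1, 6):
    x_new = rng.uniform(-1.0, 1.0, (200, 2))
    y_new = target(x_new, shift=0.2 * step)

    prior_share *= decay                            # fewer prior samples each round
    x_old, y_old = pseudo_samples(net, int(prior_share * len(x_new)), 2)

    net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
    net.fit(np.vstack([x_new, x_old]), np.concatenate([y_new, y_old]))
```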
{"title":"An approach to control aging rate of neural networks under adaptation to gradually changing context","authors":"T. Tanprasert, T. Kripruksawan","doi":"10.1109/ICONIP.2002.1202154","DOIUrl":"https://doi.org/10.1109/ICONIP.2002.1202154","url":null,"abstract":"The paper presents a decayed prior sampling algorithm for integrating the existing knowledge of a supervised learning neural networks with the new training data. The algorithm allows the existing knowledge to age out in slow rate as a neural network is gradually retrained with consecutive sets of new samples, resembling the change of application locality under a consistent environment. The experiments are performed on 2-dimensional partitions problem and the results convincingly confirm the effectiveness of the technique.","PeriodicalId":146553,"journal":{"name":"Proceedings of the 9th International Conference on Neural Information Processing, 2002. ICONIP '02.","volume":"70 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133903943","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 12
Evolving connectionist systems for adaptive learning and knowledge discovery: methods, tools, applications
N. Kasabov
The paper describes what evolving processes are and presents a computational model called evolving connectionist systems (ECOS). The model is based on principles from both brain organization and genetics. The applicability of the model for dynamic modeling and knowledge discovery in the areas of brain study, bioinformatics, speech and language learning, adaptive control and adaptive decision support is discussed.
{"title":"Evolving connectionist systems for adaptive learning and knowledge discovery: methods, tools, applications","authors":"N. Kasabov","doi":"10.1109/ICONIP.2002.1198126","DOIUrl":"https://doi.org/10.1109/ICONIP.2002.1198126","url":null,"abstract":"The paper describes what evolving processes are and presents a computational model called evolving connectionist systems (ECOS). The model is based on principles from both brain organization and genetics. The applicability of the model for dynamic modeling and knowledge discovery in the areas of brain study, bioinformatics, speech and language learning, adaptive control and adaptive decision support is discussed.","PeriodicalId":146553,"journal":{"name":"Proceedings of the 9th International Conference on Neural Information Processing, 2002. ICONIP '02.","volume":"190 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133971745","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 6