{"title":"Spike-Rate Perceptrons","authors":"Xuyan Xiang, Yingchun Deng, Xiangqun Yang","doi":"10.1109/ICNC.2008.556","DOIUrl":null,"url":null,"abstract":"According to the diffusion approximation, we present a more biologically plausible so-called spike-rate perceptron based on IF model with renewal process inputs, which employs both first and second statistical representation, i.e. the means, variances and correlations of the synaptic input. We first identify the input-output relationship of the spike-rate model and apply an error minimization technique to train the model. We then show that it is possible to train these networks with a mathematically derived learning rule. We show through various examples that such perceptron, even a single neuron, is able to perform various complex non-linear tasks like the XOR problem. Here our perceptrons offer a significant advantage over classical models, in that they include both the mean and the variance of the input signal. Our ultimate purpose is to open up the possibility of carrying out a random computation in neuronal networks, by introducing second order statistics in computations.","PeriodicalId":6404,"journal":{"name":"2008 Fourth International Conference on Natural Computation","volume":"119 1","pages":"326-333"},"PeriodicalIF":0.0000,"publicationDate":"2008-10-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2008 Fourth International Conference on Natural Computation","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICNC.2008.556","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 7
Abstract
Based on the diffusion approximation, we present a more biologically plausible spike-rate perceptron built on the integrate-and-fire (IF) model with renewal-process inputs, which employs both first- and second-order statistical representations, i.e., the means, variances and correlations of the synaptic input. We first identify the input-output relationship of the spike-rate model and apply an error-minimization technique to train it. We then show that it is possible to train these networks with a mathematically derived learning rule. Through various examples we show that such perceptrons, even a single neuron, can perform complex non-linear tasks such as the XOR problem. Our perceptrons thus offer a significant advantage over classical models in that they include both the mean and the variance of the input signal. Our ultimate purpose is to open up the possibility of carrying out random computation in neuronal networks by introducing second-order statistics into the computations.
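To make the idea of a rate unit driven by both the mean and the variance of its synaptic input concrete, the sketch below implements a minimal spike-rate unit under the diffusion approximation. It is not the paper's exact model or learning rule: the transfer function is the standard Ricciardi/Brunel first-passage-time rate of a leaky IF neuron, the inputs are assumed independent and Poissonian (so the correlation term used in the paper is omitted), training uses finite-difference gradient descent on a squared rate error rather than the analytically derived rule, and all parameter values and the toy task are illustrative.

```python
# Minimal sketch of a spike-rate unit under the diffusion approximation.
# Assumptions (not from the paper): Ricciardi/Brunel LIF transfer function,
# independent Poisson inputs (no correlations), finite-difference gradient
# descent on squared rate error, illustrative parameter values.

import numpy as np
from scipy.special import erf
from scipy.integrate import quad

TAU     = 0.020   # membrane time constant (s)
TAU_REF = 0.002   # absolute refractory period (s)
V_RESET = 10.0    # reset potential (mV)
V_TH    = 20.0    # firing threshold (mV)

def lif_rate(mu, sigma):
    """Mean firing rate (Hz) of a leaky IF neuron whose summed input has
    mean mu (mV) and noise amplitude sigma (mV), via the first-passage-time
    formula under the diffusion approximation."""
    sigma = max(sigma, 1e-6)
    lo = (V_RESET - mu) / sigma
    hi = (V_TH - mu) / sigma
    if hi > 26.0:                 # integral is astronomically large here,
        return 0.0                # so the rate is numerically zero
    integrand = lambda u: np.exp(u ** 2) * (1.0 + erf(u))
    integral, _ = quad(integrand, lo, hi)
    return 1.0 / (TAU_REF + TAU * np.sqrt(np.pi) * integral)

def output_rate(weights, rates):
    """Spike-rate unit: weighted presynaptic rates (Hz) determine the mean
    and variance of the summed input, which then set the output rate."""
    mu  = TAU * np.dot(weights, rates)         # first-order statistic
    var = TAU * np.dot(weights ** 2, rates)    # second-order statistic
    return lif_rate(mu, np.sqrt(var))

def train(patterns, targets, lr=1e-5, epochs=300, eps=1e-3):
    """Squared-error minimization over the weights; gradients are estimated
    by central differences (a stand-in for the paper's derived rule)."""
    rng = np.random.default_rng(0)
    w = rng.uniform(0.5, 1.5, size=patterns.shape[1])
    for _ in range(epochs):
        for x, t in zip(patterns, targets):
            err = output_rate(w, x) - t
            grad = np.zeros_like(w)
            for i in range(w.size):
                dw = np.zeros_like(w)
                dw[i] = eps
                grad[i] = (output_rate(w + dw, x) -
                           output_rate(w - dw, x)) / (2 * eps)
            w -= lr * err * grad
    return w

if __name__ == "__main__":
    # Toy task: two input-rate patterns mapped to low/high target output rates.
    patterns = np.array([[200.0, 800.0],
                         [800.0, 200.0]])
    targets  = np.array([5.0, 40.0])
    w = train(patterns, targets)
    for x, t in zip(patterns, targets):
        print(f"input rates {x} Hz -> output {output_rate(w, x):6.2f} Hz "
              f"(target {t:5.1f} Hz)")
```

Because the variance term depends on the squared weights, the unit's output is not a monotone function of a single weighted sum of the input rates, which is the property the paper exploits (together with input correlations) to solve non-linearly separable tasks such as XOR with a single neuron.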