Vinay Kumar Reddy Chimmula, Lei Zhang, Dhanya Palliath, Abhinay Kumar
DOI: 10.1109/iCAST51195.2020.9319475
Published in: 2020 11th International Conference on Awareness Science and Technology (iCAST), 2020-12-07
Citations: 0
Improved Spiking Neural Networks with multiple neurons for digit recognition

Abstract
For more than a decade, Deep Learning, a subset of machine learning, has been used for many applications such as forecasting, data visualization, and classification. However, compared to the human brain, it consumes more energy and requires longer training periods for computation. In most cases, it is difficult to reach human-level performance. With recent technological improvements in neuroscience, and thanks to neuromorphic computing, we can now achieve higher classification efficacy, producing the desired outputs with considerably lower power consumption. The latest advancements in brain simulation technologies have provided a breakthrough for analysing and modelling brain functions. Despite these advancements, this research area remains underexplored due to a lack of coordination between neuroscientists, electronics engineers, and computer scientists. Recent progress in Spiking Neural Networks (SNNs) has led towards integrating these different fields under one roof. Biological neurons inside the human brain communicate with each other through synapses. Similarly, bio-inspired synapses in the neuromorphic model mimic biological synapses for computing. In this research, we have modelled a supervised Spiking Neural Network algorithm using Leaky Integrate-and-Fire (LIF), Izhikevich, and rectified linear neurons, and tested its spike latency under different conditions. Furthermore, these SNN models are tested on the MNIST dataset to classify handwritten digits, and the results are compared with those of a Convolutional Neural Network (CNN).
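The abstract does not include implementation details, but the spike-latency measurement it mentions can be illustrated with the standard Leaky Integrate-and-Fire dynamics it names. The sketch below is a minimal assumption-laden illustration, not the paper's implementation: all parameter values (membrane time constant, rest/threshold voltages, resistance) are generic textbook choices, and `lif_first_spike_latency` is a hypothetical helper name.

```python
# Minimal Leaky Integrate-and-Fire (LIF) neuron sketch, Euler integration.
# All parameter values are illustrative assumptions, NOT from the paper:
# tau = 20 ms, V_rest = -65 mV, V_thresh = -50 mV, R = 10 MOhm, I in nA.

def lif_first_spike_latency(current, tau=20.0, v_rest=-65.0,
                            v_thresh=-50.0, resistance=10.0,
                            dt=0.1, t_max=200.0):
    """Integrate tau * dV/dt = -(V - V_rest) + R * I from V = V_rest
    and return the time (ms) of the first threshold crossing (spike),
    or None if the input never drives V above threshold."""
    v = v_rest
    t = 0.0
    while t < t_max:
        dv = (-(v - v_rest) + resistance * current) * (dt / tau)
        v += dv
        t += dt
        if v >= v_thresh:
            return t  # latency from stimulus onset to first spike
    return None  # sub-threshold input: steady state stays below V_thresh

# Stronger input current -> shorter spike latency; a sub-threshold
# current (R*I below the 15 mV gap to threshold) never spikes.
print(lif_first_spike_latency(2.0))  # strong current, short latency
print(lif_first_spike_latency(1.6))  # weaker current, longer latency
print(lif_first_spike_latency(1.0))  # sub-threshold, no spike (None)
```

This is the kind of latency-versus-input sweep one would run to compare neuron models (LIF vs. Izhikevich vs. rectified linear) under different conditions, as the abstract describes.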