Authors: Lu Liu, Shuling Yang, D. Shi
Venue: 2019 International Conference on Machine Learning and Cybernetics (ICMLC)
Publication date: 2019-07-01
DOI: 10.1109/ICMLC48188.2019.8949229
Advanced Convolutional Neural Network With Feedforward Inhibition
A convolutional neural network is a multi-layer neural network with robust pattern-recognition ability. However, when the activation function is the sigmoid, a convolutional neural network suffers from the vanishing gradient problem. This paper first analyzes the vanishing gradient problem and then, drawing on the balance of excitation and inhibition observed in neuroscience, proposes using feed-forward inhibition to reduce activation values and remove the scale effect of the weights, so that the model converges faster while retaining its nonlinear fitting ability. The results show that the improved convolutional neural network effectively relieves the vanishing gradient problem.
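The vanishing gradient problem the abstract refers to follows from the sigmoid's derivative, which never exceeds 0.25: backpropagation multiplies one such factor per layer, so the gradient shrinks at least geometrically with depth. The sketch below illustrates this, and also the intuition behind inhibition: subtracting an inhibitory term keeps pre-activations near zero, where the sigmoid's gradient is largest. This is only an illustration of the general mechanism under that assumption; the paper's exact feed-forward inhibition formula is not given in the abstract, and the `inhibit` helper here is hypothetical.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # sigma'(x) = sigma(x) * (1 - sigma(x)); its maximum is 0.25 at x = 0
    s = sigmoid(x)
    return s * (1.0 - s)

# Backpropagation through n sigmoid layers multiplies the gradient by
# sigma'(z) <= 0.25 once per layer, so even in the BEST case (all
# pre-activations at 0) a 10-layer gradient decays by 0.25**10.
best_case_grad = 1.0
for _ in range(10):
    best_case_grad *= sigmoid_grad(0.0)
print(best_case_grad)  # 0.25**10 ≈ 9.54e-07

# With a large pre-activation the sigmoid saturates and the decay is
# far worse: sigma'(5) ≈ 0.0066 per layer.
saturated_grad = 1.0
for _ in range(10):
    saturated_grad *= sigmoid_grad(5.0)

# Hypothetical inhibition step (NOT the paper's formula): subtracting
# an inhibitory offset pulls the pre-activation back toward 0, the
# regime where sigma' is largest, which is the general idea behind
# using inhibition to keep gradients alive.
def inhibit(z, inhibition):
    return z - inhibition

inhibited_grad = 1.0
for _ in range(10):
    inhibited_grad *= sigmoid_grad(inhibit(5.0, 5.0))

print(saturated_grad < inhibited_grad)  # True
```

The comparison at the end shows why keeping activation values small helps: the per-layer gradient factor moves from the saturated tail of the sigmoid back toward its 0.25 peak, slowing the geometric decay.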