{"title":"Modified Backpropagation Algorithm with Multiplicative Calculus in Neural Networks","authors":"Serkan Ozbay","doi":"10.5755/j02.eie.34105","DOIUrl":null,"url":null,"abstract":"Backpropagation is one of the most widely used algorithms for training feedforward deep neural networks. The algorithm requires a differentiable activation function and it performs computations of the gradient proceeding backwards through the feedforward deep neural network from the last layer through to the first layer. In order to calculate the gradient at a specific layer, the gradients of all layers are combined via the chain rule of calculus. One of the biggest disadvantages of the backpropagation is that it requires a large amount of training time. To overcome this issue, this paper proposes a modified backpropagation algorithm with multiplicative calculus. Multiplicative calculus provides an alternative to the classical calculus and it defines new kinds of derivative and integral forms in multiplicative form rather than addition and subtraction forms. The performance analyzes are discussed in various case studies and the results are given comparatively with classical backpropagation algorithm. It is found that the proposed modified backpropagation algorithm converges in less time to the solution and thus provides fast training in the given case studies. It is also shown that the proposed algorithm avoids the local minima problem.","PeriodicalId":51031,"journal":{"name":"Elektronika Ir Elektrotechnika","volume":" ","pages":""},"PeriodicalIF":0.9000,"publicationDate":"2023-06-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Elektronika Ir Elektrotechnika","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.5755/j02.eie.34105","RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Abstract
Backpropagation is one of the most widely used algorithms for training feedforward deep neural networks. The algorithm requires a differentiable activation function, and it computes gradients by proceeding backwards through the network, from the last layer to the first. The gradient at a given layer is obtained by combining the gradients of the later layers via the chain rule of calculus. One of the biggest disadvantages of backpropagation is the large amount of training time it requires. To overcome this issue, this paper proposes a modified backpropagation algorithm based on multiplicative calculus. Multiplicative calculus provides an alternative to classical calculus, defining derivatives and integrals in terms of ratios and products rather than differences and sums. Performance analyses are presented for several case studies, and the results are compared with those of the classical backpropagation algorithm. The proposed modified backpropagation algorithm is found to converge to the solution in less time, and thus provides faster training in the given case studies. It is also shown that the proposed algorithm avoids the local minima problem.
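The abstract does not give the paper's exact update rule, so the following is only a minimal sketch of the general idea, under stated assumptions. Multiplicative calculus replaces the ordinary derivative with the multiplicative (geometric) derivative f*(x) = lim_{h→0} (f(x+h)/f(x))^(1/h) = exp(f'(x)/f(x)), defined for f(x) > 0. One common way to turn this into a training rule is the multiplicative gradient step w ← w · (E*(w))^(−η) = w · exp(−η · E'(w)/E(w)), in contrast to the classical additive step w ← w − η · E'(w). The function names and the toy loss below are illustrative assumptions, not taken from the paper.

```python
import math

def classical_step(w, grad, eta=0.1):
    """Classical additive update: w <- w - eta * dE/dw."""
    return w - eta * grad

def multiplicative_step(w, grad, loss, eta=0.1):
    """Multiplicative update: w <- w * (E*(w)) ** (-eta)
    = w * exp(-eta * (dE/dw) / E(w)); requires E(w) > 0."""
    return w * math.exp(-eta * grad / loss)

# Toy problem: minimise E(w) = w**2 + 1, shifted by +1 so that E > 0
# and the multiplicative derivative stays well defined.
w_add = w_mul = 2.0
for _ in range(100):
    w_add = classical_step(w_add, 2.0 * w_add)                      # E'(w) = 2w
    w_mul = multiplicative_step(w_mul, 2.0 * w_mul, w_mul**2 + 1.0)

print(w_add, w_mul)  # both approach the minimiser w = 0
```

Note that the multiplicative step rescales the weight instead of shifting it, so a positive weight remains positive; whether this particular rule matches the paper's modification cannot be determined from the abstract alone.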
Journal Introduction
The journal aims to attract original research papers featuring practical developments in the field of electronics and electrical engineering, with an emphasis on the applied rather than the theoretical, presented in as much detail as possible.
The journal publishes regular papers in the following areas, among others:
Electronics;
Electronic Measurements;
Signal Technology;
Microelectronics;
High Frequency Technology, Microwaves;
Electrical Engineering;
Renewable Energy;
Automation, Robotics;
Telecommunications Engineering.