{"title":"Fractional Stochastic Gradient Descent Based Learning Algorithm For Multi-layer Perceptron Neural Networks","authors":"A. Sadiq, N. Yahya","doi":"10.1109/ICIAS49414.2021.9642687","DOIUrl":null,"url":null,"abstract":"Neural Networks are indispensable tools in adaptive signal processing. Multi-layer perceptron (MLP) neural network is one of the most widely used neural network architecture. The performance is highly subjective to the optimization of learning parameters. In this study, we propose a learning algorithm for the training of MLP models. Conventionally back-propagation learning algorithm also termed as (BP-MLP) is used. It is a type of stochastic gradient descent algorithm where performance is governed by eigen spread of the input signal correlation matrix. In order to accelerate the performance, we design a combination of integral and fractional gradient terms. The proposed fractional back-propagation multi-layer perceptron (FBP-MLP) method is based on fractional calculus and it utilizes the concept of fractional power gradient which provides complementary information about the cost function that helps in rapid convergence. For the validation of our claim, we implemented leukemia cancer classification task and compared our method with standard BPMLP method. The proposed FBP-MLP method outperformed the conventional BP-MLP algorithm both in terms of convergence rate and test accuracy.","PeriodicalId":212635,"journal":{"name":"2020 8th International Conference on Intelligent and Advanced Systems (ICIAS)","volume":"18 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-07-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 8th International Conference on Intelligent and Advanced Systems (ICIAS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICIAS49414.2021.9642687","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Neural networks are indispensable tools in adaptive signal processing. The multi-layer perceptron (MLP) is one of the most widely used neural network architectures, and its performance is highly sensitive to the optimization of the learning parameters. In this study, we propose a learning algorithm for the training of MLP models. Conventionally, the back-propagation learning algorithm (BP-MLP) is used; it is a type of stochastic gradient descent whose performance is governed by the eigenvalue spread of the input signal correlation matrix. To accelerate convergence, we design a combination of integer-order and fractional-order gradient terms. The proposed fractional back-propagation multi-layer perceptron (FBP-MLP) method is based on fractional calculus and utilizes the concept of a fractional-power gradient, which provides complementary information about the cost function and aids rapid convergence. To validate our claim, we implemented a leukemia cancer classification task and compared our method with the standard BP-MLP method. The proposed FBP-MLP method outperformed the conventional BP-MLP algorithm in terms of both convergence rate and test accuracy.
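The abstract does not spell out the update rule, but the combined integer-order and fractional-order step it describes is commonly written, in the fractional-LMS literature this line of work builds on, as w ← w − μ∇J(w) − μ_f ∇J(w) ⊙ |w|^(1−ν) / Γ(2−ν), where ν ∈ (0, 1) is the fractional order and Γ is the gamma function; the |w|^(1−ν)/Γ(2−ν) factor comes from the Caputo fractional derivative. The sketch below is a hypothetical illustration of such a step on a toy problem, not the authors' exact FBP-MLP implementation; the function name fractional_sgd_step and the parameters lr, lr_frac, nu, and eps are assumptions introduced for the example.

```python
import numpy as np
from math import gamma

def fractional_sgd_step(w, grad, lr=0.05, lr_frac=0.05, nu=0.75, eps=1e-8):
    """One combined integer- and fractional-order gradient descent step.

    Follows the common fractional-LMS form, where the Caputo fractional
    derivative of order nu contributes a |w|**(1 - nu) / Gamma(2 - nu)
    factor to the ordinary gradient. Names and defaults are illustrative,
    not the paper's exact FBP-MLP update.
    """
    # Fractional-order term; eps keeps |w|**(1 - nu) well-behaved near w = 0.
    frac = grad * (np.abs(w) + eps) ** (1.0 - nu) / gamma(2.0 - nu)
    # Standard SGD term plus the fractional correction.
    return w - lr * grad - lr_frac * frac

if __name__ == "__main__":
    # Toy least-squares problem: J(w) = 0.5 * ||X w - y||^2 / n.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    w_true = np.array([1.0, -2.0, 0.5])
    y = X @ w_true
    w = rng.normal(size=3)
    for _ in range(300):
        grad = X.T @ (X @ w - y) / len(y)
        w = fractional_sgd_step(w, grad)
    print(w)  # approaches w_true = [1.0, -2.0, 0.5]
```

Note that both terms vanish where the ordinary gradient vanishes, so the fractional correction does not move the fixed point; it rescales the gradient elementwise by a magnitude-dependent factor, which is one reading of the "complementary information about the cost function" the abstract refers to.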