Xingwen Zhou , Zhenghao You , Weiguo Sun , Dongdong Zhao , Shi Yan
Neural Networks, Volume 181, Article 106810. DOI: 10.1016/j.neunet.2024.106810. Published 2024-10-19.
Fractional-order stochastic gradient descent method with momentum and energy for deep neural networks
In this paper, a novel fractional-order stochastic gradient descent with momentum and energy (FOSGDME) approach is proposed. Specifically, to address the difficulty that existing fractional gradient algorithms have in converging to a true extreme point, a novel fractional-order stochastic gradient descent (FOSGD) method is presented by modifying the definition of the Caputo fractional-order derivative. A FOSGD with momentum (FOSGDM) is then established by incorporating momentum information to further accelerate convergence and improve accuracy. In addition, to improve robustness and accuracy, a FOSGD with momentum and energy (FOSGDME) is established by further introducing an energy term. Extensive experimental results on the CIFAR-10 image classification dataset, obtained with ResNet and DenseNet, demonstrate that the proposed FOSGD, FOSGDM and FOSGDME algorithms are superior to their integer-order counterparts and achieve state-of-the-art performance.
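The paper's exact update rules are not reproduced in the abstract, but the general idea of a fractional-order SGD step with momentum can be sketched as follows. This is a minimal illustrative sketch only: the Caputo-style factor |θ − θ_prev|^(1−α) / Γ(2−α) is a common approximation used in the fractional-gradient literature, and the function name `fosgdm_step` and all parameter defaults are assumptions, not the authors' formulation.

```python
import math

def fosgdm_step(theta, grad, velocity, theta_prev,
                lr=0.01, alpha=0.9, beta=0.9, eps=1e-8):
    """One illustrative fractional-order SGD-with-momentum update.

    alpha is the fractional order in (0, 1); as alpha -> 1 the fractional
    factor tends to 1 and the update approaches plain SGD with momentum.
    The |theta - theta_prev|^(1 - alpha) / gamma(2 - alpha) factor is a
    Caputo-style approximation from the literature, NOT the paper's rule.
    """
    # Element-wise fractional scaling of the gradient, based on how far
    # each parameter moved since the previous iterate.
    frac = [abs(t - tp) ** (1.0 - alpha) / math.gamma(2.0 - alpha)
            for t, tp in zip(theta, theta_prev)]
    # Standard exponential-moving-average momentum on the scaled gradient;
    # eps keeps the update nonzero when theta == theta_prev.
    velocity = [beta * v + g * (f + eps)
                for v, g, f in zip(velocity, grad, frac)]
    new_theta = [t - lr * v for t, v in zip(theta, velocity)]
    return new_theta, velocity
```

A typical usage pattern would track the previous iterate alongside the velocity, e.g. minimizing f(x) = x² by feeding grad = 2x each step; the fractional factor shrinks the step when consecutive iterates are close, which is one intuition for why such schemes can settle near an extremum.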
About the journal:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.