{"title":"Optimizing ResNet50 performance using stochastic gradient descent on MRI images for Alzheimer's disease classification","authors":"Mohamed Amine Mahjoubi , Driss Lamrani , Shawki Saleh , Wassima Moutaouakil , Asmae Ouhmida , Soufiane Hamida , Bouchaib Cherradi , Abdelhadi Raihani","doi":"10.1016/j.ibmed.2025.100219","DOIUrl":null,"url":null,"abstract":"<div><div>The field of optimization is focused on the formulation, analysis, and resolution of problems involving the minimization or maximization of functions. A particular subclass of optimization problems, known as empirical risk minimization, involves fitting a model to observed data. These problems play a central role in various areas such as machine learning, statistical modeling, and decision theory, where the objective is to find a model that best approximates underlying patterns in the data by minimizing a specified loss or risk function. In deep learning (DL) systems, various optimization algorithms are utilized with the gradient descent (GD) algorithm being one of the most significant and effective. Research studies have improved the GD algorithm and developed various successful variants, including stochastic gradient descent (SGD) with momentum, AdaGrad, RMSProp, and Adam. This article provides a comparative analysis of these stochastic gradient descent algorithms based on their accuracy, loss, and training time, as well as the loss of each algorithm in generating an optimization solution. Experiments were conducted using Transfer Learning (TL) technique based on the pre-trained ResNet50 base model for image classification, with a focus on stochastic gradient (SG) for performances optimization. The case study in this work is based on a data extract from the Alzheimer's image dataset, which contains four classes such as Mild Demented, Moderate Demented, Non-Demented, and Very Mild Demented. The obtained results with the Adam and SGD momentum optimizers achieved the highest accuracy of 97.66 % and 97.58 %, respectively.</div></div>","PeriodicalId":73399,"journal":{"name":"Intelligence-based medicine","volume":"11 ","pages":"Article 100219"},"PeriodicalIF":0.0000,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Intelligence-based medicine","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2666521225000225","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The field of optimization is concerned with the formulation, analysis, and solution of problems involving the minimization or maximization of functions. A particular subclass of optimization problems, known as empirical risk minimization, involves fitting a model to observed data. These problems play a central role in areas such as machine learning, statistical modeling, and decision theory, where the objective is to find a model that best approximates the underlying patterns in the data by minimizing a specified loss or risk function. In deep learning (DL) systems, various optimization algorithms are used, with gradient descent (GD) being one of the most significant and effective. Research has improved the GD algorithm and produced several successful variants, including stochastic gradient descent (SGD) with momentum, AdaGrad, RMSProp, and Adam. This article provides a comparative analysis of these stochastic gradient descent variants in terms of their accuracy, loss, and training time, as well as the loss each algorithm incurs in reaching an optimization solution. Experiments were conducted using the Transfer Learning (TL) technique based on the pre-trained ResNet50 base model for image classification, with a focus on stochastic gradient (SG) methods for performance optimization. The case study in this work uses a data extract from the Alzheimer's image dataset, which contains four classes: Mild Demented, Moderate Demented, Non-Demented, and Very Mild Demented. The Adam and SGD-with-momentum optimizers achieved the highest accuracies of 97.66 % and 97.58 %, respectively.
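To make the experimental setup concrete, the sketch below shows one plausible way to compare the named optimizers on a frozen, ImageNet-pretrained ResNet50 backbone in Keras. It is a minimal illustration, not the authors' exact pipeline: the classifier head, image size, learning rates, and dataset loading are assumptions, and only the optimizer choices and the four-class output correspond directly to the abstract.

```python
# Minimal sketch (assumed setup, not the paper's exact code): transfer learning
# with a pre-trained ResNet50 backbone and interchangeable SGD-family optimizers.
import tensorflow as tf
from tensorflow.keras import layers, models, optimizers
from tensorflow.keras.applications import ResNet50

NUM_CLASSES = 4  # Mild Demented, Moderate Demented, Non-Demented, Very Mild Demented


def build_model(optimizer):
    # Frozen ResNet50 backbone pre-trained on ImageNet; only the new head is trained.
    base = ResNet50(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
    base.trainable = False
    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(256, activation="relu"),   # head size is an illustrative choice
        layers.Dropout(0.5),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer=optimizer,
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model


# The optimizer variants compared in the study; learning rates here are illustrative.
candidates = {
    "sgd_momentum": optimizers.SGD(learning_rate=1e-3, momentum=0.9),
    "adagrad": optimizers.Adagrad(learning_rate=1e-3),
    "rmsprop": optimizers.RMSprop(learning_rate=1e-3),
    "adam": optimizers.Adam(learning_rate=1e-3),
}

for name, opt in candidates.items():
    model = build_model(opt)
    # train_ds / val_ds would be tf.data.Dataset objects built from the MRI images, e.g.
    # via tf.keras.utils.image_dataset_from_directory on the four class folders:
    # history = model.fit(train_ds, validation_data=val_ds, epochs=20)
    print(f"Built ResNet50 transfer model with the {name} optimizer")
```

Under this kind of setup, each optimizer trains the same head on the same data, so accuracy, loss, and wall-clock training time can be compared directly, which is the comparison the abstract reports for Adam and SGD with momentum.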