{"title":"优化算法对人脸识别神经网络性能的影响","authors":"M. Ali, D. Kumar","doi":"10.55579/jaec.202264.370","DOIUrl":null,"url":null,"abstract":"Face recognition has aroused great interest in a range of industries due to its practical applications nowadays. It is a biometric method that is used to identify and certify people with unique biological traits in a reliable and timely manner. Although iris and fingerprint recognition technologies are more accurate, face recognition technology is the most common and frequently utilized since it is simple to deploy and execute and does not require any physical input from the user. This study compares Neural Networks using (SGD, Adam, or L-BFGS-B) optimizers, with different activation functions (Sigmoid, Tanh, or ReLU), and deep learning feature extraction methodologies including Squeeze Net, VGG19, or Inception model. The inception model outperforms the Squeeze Net and VGG19 in terms of accuracy. Based on the findings of the inception model, we achieved 93.6% of accuracy in a neural network with four layers and forty neurons by utilizing the SGD optimizer with the ReLU activation function. We also noticed that using the ReLU activation function with any of the three optimizers achieved the best results based on findings of the inception model, as it achieved 93.6%, 89.1%, and 94% of accuracy for each of the optimization algorithms SGD, Adam, and BFGS, respectively.This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium provided the original work is properly cited.","PeriodicalId":250655,"journal":{"name":"J. Adv. Eng. Comput.","volume":"77 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"The Impact of Optimization Algorithms on The Performance of Face Recognition Neural Networks\",\"authors\":\"M. Ali, D. Kumar\",\"doi\":\"10.55579/jaec.202264.370\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Face recognition has aroused great interest in a range of industries due to its practical applications nowadays. It is a biometric method that is used to identify and certify people with unique biological traits in a reliable and timely manner. Although iris and fingerprint recognition technologies are more accurate, face recognition technology is the most common and frequently utilized since it is simple to deploy and execute and does not require any physical input from the user. This study compares Neural Networks using (SGD, Adam, or L-BFGS-B) optimizers, with different activation functions (Sigmoid, Tanh, or ReLU), and deep learning feature extraction methodologies including Squeeze Net, VGG19, or Inception model. The inception model outperforms the Squeeze Net and VGG19 in terms of accuracy. Based on the findings of the inception model, we achieved 93.6% of accuracy in a neural network with four layers and forty neurons by utilizing the SGD optimizer with the ReLU activation function. 
We also noticed that using the ReLU activation function with any of the three optimizers achieved the best results based on findings of the inception model, as it achieved 93.6%, 89.1%, and 94% of accuracy for each of the optimization algorithms SGD, Adam, and BFGS, respectively.This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium provided the original work is properly cited.\",\"PeriodicalId\":250655,\"journal\":{\"name\":\"J. Adv. Eng. Comput.\",\"volume\":\"77 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-12-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"J. Adv. Eng. Comput.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.55579/jaec.202264.370\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"J. Adv. Eng. Comput.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.55579/jaec.202264.370","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
The Impact of Optimization Algorithms on The Performance of Face Recognition Neural Networks
Face recognition has attracted great interest across a range of industries because of its practical applications. It is a biometric method used to identify and verify people by their unique biological traits in a reliable and timely manner. Although iris and fingerprint recognition are more accurate, face recognition is the most common and frequently used technology because it is simple to deploy and execute and does not require any physical input from the user. This study compares neural networks trained with different optimizers (SGD, Adam, or L-BFGS-B), different activation functions (Sigmoid, Tanh, or ReLU), and different deep-learning feature extraction models (SqueezeNet, VGG19, or Inception). The Inception model outperforms SqueezeNet and VGG19 in terms of accuracy. Using Inception features, we achieved 93.6% accuracy with a neural network of four layers and forty neurons trained with the SGD optimizer and the ReLU activation function. We also observed that, on Inception features, the ReLU activation function gave the best results with all three optimizers, reaching 93.6%, 89.1%, and 94% accuracy with SGD, Adam, and L-BFGS-B, respectively.
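The comparison described above can be reproduced in outline with scikit-learn, whose MLPClassifier exposes exactly the three solvers (SGD, Adam, L-BFGS) and three activations (logistic/Sigmoid, Tanh, ReLU) mentioned in the abstract. The sketch below is not the authors' code: the feature matrix is a random stand-in for pre-extracted Inception embeddings, the labels are synthetic, and "four layers and forty neurons" is interpreted here as four hidden layers of forty units each, which is one possible reading of the abstract.

```python
# Minimal sketch of the optimizer/activation sweep described in the abstract.
# Assumptions (not from the paper): synthetic data standing in for Inception
# face embeddings, and a (40, 40, 40, 40) hidden-layer layout.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2048))    # stand-in for Inception feature vectors
y = rng.integers(0, 10, size=500)   # stand-in for face identity labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

# scikit-learn's solver names map onto the optimizers in the abstract:
# 'sgd' -> SGD, 'adam' -> Adam, 'lbfgs' -> the L-BFGS family.
for solver in ("sgd", "adam", "lbfgs"):
    for activation in ("logistic", "tanh", "relu"):  # Sigmoid, Tanh, ReLU
        clf = MLPClassifier(hidden_layer_sizes=(40, 40, 40, 40),
                            activation=activation,
                            solver=solver,
                            max_iter=1000,
                            random_state=0)
        clf.fit(X_train, y_train)
        print(f"{solver:6s} {activation:8s} "
              f"accuracy={clf.score(X_test, y_test):.3f}")
```

With real face embeddings in place of the random matrix, the loop above yields the kind of optimizer-by-activation accuracy table the study reports; the printed scores here are meaningless because the inputs are synthetic.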