Exploring Batch Normalization’s Impact on Dense Layers of Multiclass and Multilabel Classifiers

IF 5.0 · CAS Region 2 (Computer Science) · JCR Q1 (Computer Science, Artificial Intelligence) · International Journal of Intelligent Systems · Pub Date: 2025-02-24 · DOI: 10.1155/int/1466655
Misganaw Aguate Widneh, Amlakie Aschale Alemu, Dereje Derib Getie
{"title":"Exploring Batch Normalization’s Impact on Dense Layers of Multiclass and Multilabel Classifiers","authors":"Misganaw Aguate Widneh,&nbsp;Amlakie Aschale Alemu,&nbsp;Dereje Derib Getie","doi":"10.1155/int/1466655","DOIUrl":null,"url":null,"abstract":"<div>\n <p>Leveraging batch normalization (BN) is crucial for deep learning for quick and precise identification of objects. It is a commonly used approach to reduce the variation of input distribution using BN. However, the complex parameters computed at the classifier layer of convolutional neural network (CNN) are the reason for model overfitting and consumption of long training time. This study is proposed to make a comparative analysis of models’ performances on multiclass and multilabel classifiers with and without BN at dense layers of CNN. Consequently, for both classifications, BN layers are incorporated at the fully connected layer of CNN. To build a model, we used datasets of medical plant leaves, potato leaves, and fashion images. The pretrained models such as Mobile Net, VGG16, and Inception Net are customized (tuned) using the transfer learning technique. We made adjustments to training and model hyperparameters, including batch size, number of layers, learning rate, number of epochs, and optimizers. After several experiments on the three models, we observed that the best way to improve the model’s accuracy is by applying BN at the CNN’s dense layer. BN improved the performances of the models on both multiclass and multilabel classifications. This improvement has more significant change in the multilabel classification. Hence, using medicinal plant dataset, the model achieved accuracy of 93% and 83% for multilabel with and without BN, respectively, while achieving 99.2% and 99% for multiclass classification. The experiment also proved that the effectiveness of BN is affected on type datasets, depth of CNN, and batch sizes.</p>\n </div>","PeriodicalId":14089,"journal":{"name":"International Journal of Intelligent Systems","volume":"2025 1","pages":""},"PeriodicalIF":5.0000,"publicationDate":"2025-02-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1155/int/1466655","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Intelligent Systems","FirstCategoryId":"94","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1155/int/1466655","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Batch normalization (BN) is an important tool in deep learning for fast and accurate object recognition, and it is commonly used to reduce variation in the input distribution of a layer. However, the large number of parameters computed at the classifier layers of a convolutional neural network (CNN) can cause overfitting and long training times. This study presents a comparative analysis of model performance on multiclass and multilabel classification with and without BN at the dense layers of a CNN. For both classification settings, BN layers are incorporated into the fully connected layers of the CNN. To build the models, we used datasets of medicinal plant leaves, potato leaves, and fashion images. Pretrained models, namely MobileNet, VGG16, and InceptionNet, were customized (tuned) using transfer learning. We adjusted training and model hyperparameters, including batch size, number of layers, learning rate, number of epochs, and optimizer. After several experiments on the three models, we observed that applying BN at the CNN's dense layers was the most effective way to improve accuracy. BN improved model performance on both multiclass and multilabel classification, with the larger gain observed for multilabel classification. On the medicinal plant dataset, the model achieved multilabel accuracies of 93% with BN and 83% without it, versus 99.2% and 99% for multiclass classification. The experiments also showed that the effectiveness of BN depends on the dataset type, the depth of the CNN, and the batch size.
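As a concrete illustration of the setup the abstract describes, the following minimal TensorFlow/Keras sketch (not the authors' released code) places a BatchNormalization layer inside the dense classifier head of a frozen, pretrained VGG16 backbone. The layer sizes, learning rate, input resolution, and the NUM_CLASSES and MULTILABEL settings are illustrative assumptions, not values reported in the paper.

```python
# Minimal sketch: BN in the dense head of a transfer-learned CNN.
# All hyperparameters below are assumptions for illustration only.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

NUM_CLASSES = 5     # assumed number of classes (or labels, for multilabel)
MULTILABEL = False  # True: sigmoid + binary cross-entropy; False: softmax + categorical cross-entropy

# Pretrained backbone, frozen so transfer learning only tunes the new head.
backbone = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
backbone.trainable = False

model = models.Sequential([
    backbone,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256),
    layers.BatchNormalization(),  # BN at the dense (fully connected) layer; omit for the "without BN" baseline
    layers.Activation("relu"),
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="sigmoid" if MULTILABEL else "softmax"),
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
    loss="binary_crossentropy" if MULTILABEL else "categorical_crossentropy",
    metrics=["accuracy"],
)

# train_ds and val_ds are assumed, already-batched tf.data pipelines of (image, label) pairs.
# model.fit(train_ds, validation_data=val_ds, epochs=30)
```

In this sketch the multiclass/multilabel distinction only changes the output activation and loss; the BN placement in the dense head is identical in both settings, which is the comparison the study makes.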


Source Journal

International Journal of Intelligent Systems (Engineering & Technology – Computer Science: Artificial Intelligence)
CiteScore: 11.30
Self-citation rate: 14.30%
Articles published: 304
Review time: 9 months
Journal description: The International Journal of Intelligent Systems serves as a forum for individuals interested in tapping into the vast theories based on intelligent systems construction. With its peer-reviewed format, the journal explores several fascinating editorials written by today's experts in the field. Because new developments are being introduced each day, there's much to be learned: examination, analysis creation, information retrieval, man–computer interactions, and more. The International Journal of Intelligent Systems uses charts and illustrations to demonstrate these ground-breaking issues, and encourages readers to share their thoughts and experiences.
Latest articles from this journal

Consistency Regularization Semisupervised Learning for PolSAR Image Classification
Strategies to Mitigate Model Drift of a Machine Learning Prediction Model for Acute Kidney Injury in General Hospitalization
Exploring Batch Normalization's Impact on Dense Layers of Multiclass and Multilabel Classifiers
Modeling and Recognition of Retinal Blood Vessels Tortuosity in ROP Plus Disease: A Hybrid Segmentation–Classification Scheme
Intelligent Sensing and Identification of Spectrum Anomalies With Alpha-Stable Noise