Research on Lightweight Few-Shot Learning Algorithm Based on Convolutional Block Attention Mechanism

IF 0.8 · Q4 (COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE) · International Journal of Computational Intelligence and Applications · Pub Date: 2023-04-11 · DOI: 10.1142/s1469026823500207
Pang Qi, Yu Yanan, Haile Haftom Berihu
Citations: 0

Abstract

Few-shot learning solves new learning tasks from only a handful of labeled samples. However, current few-shot learning algorithms mostly use ResNet as the backbone, which entails a large number of model parameters. To address this problem, a lightweight backbone named DenseAttentionNet, based on the Convolutional Block Attention Mechanism, is proposed and compared with ResNet-12 in terms of parameter count and few-shot classification accuracy. Building on DenseAttentionNet, a few-shot learning algorithm called Meta-DenseAttention is then presented to balance model size against classification performance. Dense connections and the attention mechanism are combined, for the first time, to satisfy the requirement of fewer parameters while still achieving strong classification results. The experimental results show that DenseAttentionNet not only reduces the number of parameters by 55% relative to the ResNet-12 benchmark but also outperforms other classic backbones in classification accuracy. In addition, Meta-DenseAttention achieves 56.57% (5-way 1-shot) and 72.73% (5-way 5-shot) accuracy on miniImageNet with only 3.6 M parameters. These results demonstrate that the proposed few-shot learning algorithm maintains classification accuracy while remaining lightweight.
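The paper's key building block, the Convolutional Block Attention Mechanism, reweights a feature map first per channel and then per spatial location. As a rough, dependency-light illustration (not the authors' implementation), the sketch below implements both stages in NumPy; the 1x1 mix of pooled maps stands in for the 7x7 convolution the original mechanism uses in its spatial branch, and all weights are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def channel_attention(x, w1, w2):
    """Channel attention for one feature map x of shape (C, H, W).
    w1: (C//r, C) and w2: (C, C//r) form the shared bottleneck MLP
    applied to both the average- and max-pooled channel descriptors."""
    avg = x.mean(axis=(1, 2))                        # (C,)
    mx = x.max(axis=(1, 2))                          # (C,)
    shared = lambda v: w2 @ np.maximum(w1 @ v, 0.0)  # ReLU bottleneck
    scale = sigmoid(shared(avg) + shared(mx))        # (C,) in (0, 1)
    return x * scale[:, None, None]

def spatial_attention(x, w):
    """Spatial attention: pool over channels, then mix the two maps.
    A 1x1 mix (w: shape (2,)) replaces the 7x7 conv for simplicity."""
    avg = x.mean(axis=0)                             # (H, W)
    mx = x.max(axis=0)                               # (H, W)
    scale = sigmoid(w[0] * avg + w[1] * mx)          # (H, W) in (0, 1)
    return x * scale[None, :, :]

def cbam_block(x, w1, w2, w_sp):
    # Channel attention first, then spatial attention, as in CBAM.
    return spatial_attention(channel_attention(x, w1, w2), w_sp)

C, H, W, r = 8, 4, 4, 2
x = rng.standard_normal((C, H, W))
w1 = rng.standard_normal((C // r, C)) * 0.1
w2 = rng.standard_normal((C, C // r)) * 0.1
out = cbam_block(x, w1, w2, np.array([0.5, 0.5]))
print(out.shape)  # (8, 4, 4): attention rescales activations but preserves shape
```

Because both attention stages produce multiplicative gates in (0, 1), the block can only attenuate activations, never amplify them; stacking such blocks inside densely connected layers is what the abstract describes as combining dense connections with attention.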
Source journal: International Journal of Computational Intelligence and Applications
CiteScore: 2.90 · Self-citation rate: 0.00% · Articles per year: 25
Journal description: The International Journal of Computational Intelligence and Applications, IJCIA, is a refereed journal dedicated to the theory and applications of computational intelligence (artificial neural networks, fuzzy systems, evolutionary computation and hybrid systems). The main goal of this journal is to provide the scientific community and industry with a vehicle whereby ideas using two or more conventional and computational intelligence based techniques could be discussed. The IJCIA welcomes original works in areas such as neural networks, fuzzy logic, evolutionary computation, pattern recognition, hybrid intelligent systems, symbolic machine learning, statistical models, image/audio/video compression and retrieval.
Latest articles in this journal:
- Software Effort Estimation Based on Ensemble Extreme Gradient Boosting Algorithm and Modified Jaya Optimization Algorithm
- Soybean Leaf Diseases Recognition Based on Generative Adversarial Network and Transfer Learning
- A Study of Digital Museum Collection Recommendation Algorithm Based on Improved Fuzzy Clustering Algorithm
- Efficiency in Orchid Species Classification: A Transfer Learning-Based Approach
- Research on Fault Detection for Microservices Based on Log Information and Social Network Mechanism Using BiLSTM-DCNN Model