ICFD: An Incremental Learning Method Based on Data Feature Distribution

Yunzhe Zhu, Yusong Tan, Xiaoling Li, Qingbo Wu, Xueqin Ning
{"title":"ICFD: An Incremental Learning Method Based on Data Feature Distribution","authors":"Yunzhe Zhu, Yusong Tan, Xiaoling Li, Qingbo Wu, Xueqin Ning","doi":"10.1109/SmartWorld-UIC-ATC-ScalCom-DigitalTwin-PriComp-Metaverse56740.2022.00103","DOIUrl":null,"url":null,"abstract":"Neural network models have achieved great success in numerous disciplines in recent years, including image segmentation, object identification, and natural language processing (NLP). Incremental learning in these fields focuses on training models in a continuous data stream. As time goes by, more new data becomes available, and old data may become unavailable owing to resource constraints such as storage. As a result, when new data is continually arriving, the performance of the neural network model on the old data sample sometimes decreases significantly, a phenomenon known as catastrophic forgetting. Many corresponding strategies have been proposed to mitigate the catastrophic forgetting of neural network models, which are based on parameter regularization, data replay, and parameter isolation. This paper proposes an incremental learning method based on data feature distribution (ICFD). The method uses Gaussian distribution to generate features from old data to train neural network models based on the phenomenon that feature vectors obey multi-dimensional Gaussian distribution in feature space. This method avoids storing a large number of original samples, and the generated old class features contain more sample information. This method combines data playback and parameter regularization in concrete implementation. The experimental results of ICFD on the CIFAR-100 demonstrate that when the incremental step is 5, the average incremental accuracy is increased by 10.4%. When the incremental step is 10, the average incremental accuracy is improved by 8.1%.","PeriodicalId":43791,"journal":{"name":"Scalable Computing-Practice and Experience","volume":null,"pages":null},"PeriodicalIF":0.9000,"publicationDate":"2022-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Scalable Computing-Practice and Experience","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SmartWorld-UIC-ATC-ScalCom-DigitalTwin-PriComp-Metaverse56740.2022.00103","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, SOFTWARE ENGINEERING","Score":null,"Total":0}
引用次数: 0

Abstract

Neural network models have achieved great success in numerous fields in recent years, including image segmentation, object detection, and natural language processing (NLP). Incremental learning in these fields focuses on training models on a continuous data stream: over time, new data keeps arriving, while old data may become unavailable owing to resource constraints such as storage. As a result, as new data continually arrives, the performance of a neural network model on old data often degrades significantly, a phenomenon known as catastrophic forgetting. Many strategies have been proposed to mitigate catastrophic forgetting in neural network models, based on parameter regularization, data replay, and parameter isolation. This paper proposes an incremental learning method based on data feature distribution (ICFD). Building on the observation that feature vectors approximately follow a multivariate Gaussian distribution in feature space, the method fits Gaussian distributions to old-class features and samples from them to generate training features for the neural network model. This avoids storing large numbers of original samples, while the generated old-class features retain more sample information. The concrete implementation combines data replay with parameter regularization. Experiments on CIFAR-100 show that ICFD improves the average incremental accuracy by 10.4% with an incremental step of 5, and by 8.1% with an incremental step of 10.
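The core mechanism the abstract describes lends itself to a short illustration: fit one multivariate Gaussian per class in feature space while that class's data is still available, then sample pseudo-features from the stored Gaussians in later incremental steps instead of replaying raw images. The sketch below is a minimal, hypothetical rendering of that idea, not the authors' implementation; the class and method names are invented for illustration.

```python
# Minimal sketch of Gaussian feature replay (hypothetical; not the paper's code).
import numpy as np

class GaussianFeatureMemory:
    """Stores one multivariate Gaussian per old class in feature space,
    in place of the raw samples that incremental learning must discard."""

    def __init__(self):
        self.stats = {}  # class_id -> (mean vector, covariance matrix)

    def fit_class(self, class_id, features):
        # features: (n_samples, feature_dim) array from the network's
        # feature extractor, collected while the class data is available.
        mean = features.mean(axis=0)
        cov = np.cov(features, rowvar=False)
        self.stats[class_id] = (mean, cov)

    def sample(self, class_id, n):
        # Draw n pseudo-feature vectors for replay after the original
        # samples of this class have been dropped.
        mean, cov = self.stats[class_id]
        return np.random.multivariate_normal(mean, cov, size=n)
```

Under this reading, each incremental step would train the classifier head on a mix of real features from the new classes and sampled pseudo-features from all stored old classes, with a parameter-regularization term (for example, a penalty keeping weights close to the previous model's) added to the loss, consistent with the abstract's statement that the implementation combines data replay with parameter regularization. The per-class storage cost is one mean vector and one covariance matrix rather than a buffer of exemplar images.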
Source journal
Scalable Computing-Practice and Experience (Computer Science, Software Engineering)
CiteScore: 2.00 | Self-citation rate: 0.00% | Annual article count: 10
Journal description: The area of scalable computing has matured and reached a point where new issues and trends require a professional forum. SCPE provides this avenue by publishing original refereed papers that address the present as well as the future of parallel and distributed computing. The journal focuses on algorithm development, implementation and execution on real-world parallel architectures, and the application of parallel and distributed computing to the solution of real-life problems.
Latest articles from this journal
- A Deep LSTM-RNN Classification Method for Covid-19 Twitter Review Based on Sentiment Analysis
- Flexible English Learning Platform using Collaborative Cloud-Fog-Edge Networking
- Computer Malicious Code Signal Detection based on Big Data Technology
- Analyzing Spectator Emotions and Behaviors at Live Sporting Events using Computer Vision and Sentiment Analysis Techniques
- Spacecraft Test Data Integration Management Technology based on Big Data Platform