Distance-Forward Learning: Enhancing the Forward-Forward Algorithm Towards High-Performance On-Chip Learning

Yujie Wu, Siyuan Xu, Jibin Wu, Lei Deng, Mingkun Xu, Qinghao Wen, Guoqi Li
arXiv:2408.14925 · arXiv - CS - Neural and Evolutionary Computing · Published 2024-08-27
Citations: 0

Abstract

The Forward-Forward (FF) algorithm was recently proposed as a local learning method to address the limitations of backpropagation (BP), offering biological plausibility along with memory-efficient and highly parallelized computational benefits. However, it suffers from suboptimal performance and poor generalization, largely due to inadequate theoretical support and a lack of effective learning strategies. In this work, we reformulate FF using distance metric learning and propose a distance-forward algorithm (DF) to improve FF performance in supervised vision tasks while preserving its local computational properties, making it competitive for efficient on-chip learning. To achieve this, we reinterpret FF through the lens of centroid-based metric learning and develop a goodness-based N-pair margin loss to facilitate the learning of discriminative features. Furthermore, we integrate layer-collaboration local update strategies to reduce information loss caused by greedy local parameter updates. Our method surpasses existing FF models and other advanced local learning approaches, with accuracies of 99.7% on MNIST, 88.2% on CIFAR-10, 59% on CIFAR-100, 95.9% on SVHN, and 82.5% on ImageNette, respectively. Moreover, it achieves comparable performance with less than 40% memory cost compared to BP training, while exhibiting stronger robustness to multiple types of hardware-related noise, demonstrating its potential for online learning and energy-efficient computation on neuromorphic chips.
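To make the two core ingredients of the abstract concrete, the sketch below illustrates (a) the FF-style "goodness" of a layer (sum of squared activations) and (b) a nearest-centroid classification rule, which is the centroid-based metric-learning view the paper builds on. This is a minimal illustrative sketch, not the paper's implementation: the function names (`goodness`, `centroid`, `nearest_centroid`) and the tiny two-class example are assumptions for illustration only, and the actual DF method additionally trains each layer with an N-pair margin loss and layer-collaboration updates.

```python
import math

def goodness(activations):
    """FF-style goodness of a layer: the sum of squared activations."""
    return sum(a * a for a in activations)

def centroid(vectors):
    """Mean feature vector of one class (its centroid)."""
    n, dim = len(vectors), len(vectors[0])
    return [sum(v[i] for v in vectors) / n for i in range(dim)]

def nearest_centroid(x, centroids):
    """Classify x by the closest class centroid (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    return min(centroids, key=lambda label: dist(x, centroids[label]))

# Toy usage with two hypothetical classes in a 2-D feature space.
features = {"cat": [[1.0, 0.0], [0.9, 0.1]],
            "dog": [[0.0, 1.0], [0.1, 0.9]]}
cents = {label: centroid(vs) for label, vs in features.items()}
```

Under this view, learning discriminative features means pulling same-class activations toward their centroid and pushing other classes' centroids away, which each layer can do locally without backpropagating gradients from the output.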