A single fast Hebbian-like process enabling one-shot class addition in deep neural networks without backbone modification

Kazufumi Hosoda, Keigo Nishida, S. Seno, Tomohiro Mashita, Hideki Kashioka, Izumi Ohzawa
Frontiers in Neuroscience · Published 2024-06-12 · DOI: 10.3389/fnins.2024.1344114

Abstract

One-shot learning, the ability to learn a new concept from a single instance, is a distinctive brain function that has garnered substantial interest in machine learning. While modeling physiological mechanisms poses challenges, advancements in artificial neural networks have led to performances in specific tasks that rival human capabilities. Proposing one-shot learning methods with these advancements, especially those involving simple mechanisms, not only enhances technological development but also contributes to neuroscience by providing functionally valid hypotheses. Among the simplest methods for one-shot class addition with deep learning image classifiers is "weight imprinting," which uses the neural activity evoked by a single new-class image as the corresponding new synaptic weights. Despite its simplicity, its relevance to neuroscience is ambiguous, and it often interferes with the original image classification, which is a significant drawback in practical applications. This study introduces a novel interpretation in which a part of the weight imprinting process aligns with the Hebbian rule. We show that a single Hebbian-like process enables pre-trained deep learning image classifiers to perform one-shot class addition without any modification to the original classifier's backbone. Using non-parametric normalization to mimic the brain's fast Hebbian plasticity significantly reduces the interference observed in previous methods. Our method is one of the simplest and most practical for one-shot class addition tasks, and its reliance on a single fast Hebbian-like process contributes valuable insights to neuroscience hypotheses.
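To make the imprinting idea concrete, the following is a minimal NumPy sketch of one-shot class addition by weight imprinting, the baseline the paper builds on. All names here (`backbone`, `imprint`, the matrix shapes) are illustrative, and the sketch uses plain L2 normalization, the standard choice in weight imprinting; the paper's non-parametric normalization is a different, Hebbian-motivated step whose details are not given in this abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_feat, n_classes = 64, 32, 10
P = rng.normal(size=(d_in, d_feat))        # stand-in backbone parameters
W = rng.normal(size=(n_classes, d_feat))   # existing per-class weight rows

def backbone(x):
    """Placeholder frozen feature extractor (in practice, a pretrained deep net)."""
    return np.tanh(x @ P)

def imprint(W, x_new):
    """One-shot class addition: set the new class's weight row to the
    (normalized) activity evoked by a single new-class example.
    This is Hebbian-like in that the weight is determined by the
    activity of one forward pass; the backbone and the existing
    rows of W are left untouched."""
    z = backbone(x_new)
    z = z / np.linalg.norm(z)      # L2 normalization (standard imprinting)
    return np.vstack([W, z])       # append as an (n_classes+1)-th row

x_new = rng.normal(size=d_in)      # a single example of the new class
W2 = imprint(W, x_new)
```

After imprinting, classification proceeds as before with the enlarged weight matrix; the interference the paper addresses shows up as the new row capturing inputs that the original ten rows previously classified correctly.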