Toward Building Human-Like Sequential Memory Using Brain-Inspired Spiking Neural Models

IF 8.9 · CAS Tier 1 (Computer Science) · Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE · IEEE Transactions on Neural Networks and Learning Systems · Pub Date: 2025-03-14 · DOI: 10.1109/TNNLS.2025.3543673
Malu Zhang;Xiaoling Luo;Jibin Wu;Ammar Belatreche;Siqi Cai;Yang Yang;Haizhou Li
Journal: IEEE Transactions on Neural Networks and Learning Systems, vol. 36, no. 6, pp. 10143-10155
Article URL: https://ieeexplore.ieee.org/document/10924795/
Citations: 0

Abstract

The brain acquires and stores memories of everyday experiences in real time. It can also selectively forget information to facilitate memory updating. However, our understanding of the underlying mechanisms and of how the brain coordinates these processes remains limited, and no existing artificial intelligence model has yet matched human-level capabilities in memory storage and retrieval. This study introduces a brain-inspired spiking neural model that integrates the learning and forgetting processes of sequential memory. The proposed model closely mimics the distributed and sparse temporal coding observed in biological neural systems. It employs one-shot online learning for memory formation and uses the biologically plausible mechanisms of neural oscillation and phase precession to retrieve memorized sequences reliably. In addition, an active forgetting mechanism is integrated into the spiking neural model, enabling memory removal, flexibility, and updating. The proposed memory model not only enhances our understanding of human memory processes but also provides a robust framework for addressing temporal modeling tasks.
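The storage–retrieval–forgetting loop the abstract describes can be illustrated with a deliberately simplified, non-spiking sketch: a sequence is written in a single pass as asymmetric Hebbian transition weights (one-shot learning), recalled by cueing the first item and chaining through stored transitions, and actively forgotten by erasing just that sequence's transitions. This is a hypothetical toy, not the paper's spiking model — it omits temporal coding, neural oscillation, and phase precession entirely; all names are invented for illustration.

```python
import numpy as np

class ToySequenceMemory:
    """Toy sequential memory (illustrative only, not the paper's model):
    one-shot Hebbian storage of item transitions, cue-driven recall,
    and active forgetting that erases one sequence without disturbing others."""

    def __init__(self, n_items):
        # W[j, i] > 0 means "item i is followed by item j" in some stored sequence.
        self.W = np.zeros((n_items, n_items))

    def store(self, seq):
        # One-shot online learning: a single pass writes every transition.
        for cur, nxt in zip(seq, seq[1:]):
            self.W[nxt, cur] = 1.0

    def recall(self, cue, steps):
        # Chain through stored transitions starting from the cue item.
        out, cur = [cue], cue
        for _ in range(steps):
            drive = self.W[:, cur]
            if drive.max() <= 0:       # no stored successor: recall stops
                break
            cur = int(np.argmax(drive))
            out.append(cur)
        return out

    def forget(self, seq):
        # Active forgetting: remove exactly this sequence's transitions.
        for cur, nxt in zip(seq, seq[1:]):
            self.W[nxt, cur] = 0.0

mem = ToySequenceMemory(10)
mem.store([0, 3, 5, 2])
print(mem.recall(0, 3))    # [0, 3, 5, 2]
mem.forget([0, 3, 5, 2])
print(mem.recall(0, 3))    # [0]
```

In the paper's actual model, the recall step is driven by oscillation-gated spiking dynamics with phase precession rather than an `argmax` over weights; the sketch only shows why an explicit erase operation lets memories be removed and updated independently.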
Source Journal

IEEE Transactions on Neural Networks and Learning Systems (COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE; COMPUTER SCIENCE, HARDWARE & ARCHITECTURE)
CiteScore: 23.80
Self-citation rate: 9.60%
Articles published: 2102
Review time: 3-8 weeks
About the journal: The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.
Latest articles in this journal:

Physical Parameter Dependencies in Mechanical Reservoir Computing: Structural Analysis, Actuation, and Improved Processing. Robust Physics-Based Deep MRI Reconstruction via Diffusion Purification. Seeing Clearly and Detecting Precisely: Perceptual Enhancement and Focus Calibration for Small-Object Detection. Open Set Domain Adaptation via Known Joint Distribution Matching and Unknown Classification Risk Reformulation. GHAttack: Generative Adversarial Attacks on Heterogeneous Graph Neural Networks.