Materials, Physics, and Chemistry of Neuromorphic Computing Systems

The Journal of Physical Chemistry Letters | IF 4.6 | CAS Zone 2 (Chemistry) | JCR Q2 (CHEMISTRY, PHYSICAL) | Pub Date: 2025-04-21 | DOI: 10.1021/acs.jpclett.4c03033
Juan Bisquert
{"title":"Materials, Physics, and Chemistry of Neuromorphic Computing Systems","authors":"Juan Bisquert","doi":"10.1021/acs.jpclett.4c03033","DOIUrl":null,"url":null,"abstract":"Published as part of <i>The Journal of Physical Chemistry Letters</i> special issue “Materials, Physics, and Chemistry of Neuromorphic Computing Systems”. In the era of artificial intelligence (AI), the rapid growth of unstructured data has created an urgent need for efficient, high-speed data processing and analysis. Traditional computing systems, rooted in the von Neumann architecture, struggle to keep pace due to inherent limitations, including restricted computational speed and increasing energy consumption. These challenges stem from the separation of processing and memory units, a problem known as the von Neumann bottleneck. To address these issues, researchers have turned to neuromorphic computing, which draws inspiration from the brain’s ability to perform parallel, energy-efficient operations with remarkable processing power and adaptability. Within the field of AI, notable examples of brain-inspired advances include artificial neural networks (ANNs) and deep learning (DL) neural networks, which are ANNs with several layers that lend themselves to learned feature representations. These have surpassed humans on many tasks such as pattern recognition, game playing, machine translation, and more. The algorithms are adapted to an ever-increasing range of machine learning (ML) tasks. Spiking neural networks (SNN) are of high current interest, both from the perspective of modeling neural networks of the brain and for exporting their fast-learning capability and energy efficiency into neuromorphic hardware. The goal of neuromorphic computational systems is a powerful advancement in technology that allows devices to gather data, analyze it in real time, and autonomously take actions based on the information received. A similar concept can be found in sensor computing, where sensors not only detect stimuli but also perform data conversion and processing at the point of data collection. This capability, known as in-sensor computing, reduces the need for extensive data transfer and system complexity, allowing connected devices to process information and make decisions locally─at the edge─rather than relying on a centralized system. By enabling faster, more intelligent decision-making at the edge, neuromorphic computing can transform industries and pave the way for a more connected, efficient, and intelligent future across numerous sectors, including healthcare, agriculture, manufacturing, and smart cities. Here we present the special issue Materials, Physics and Chemistry of Neuromorphic Computing Systems. There is considerable interest in attaining memory and computation functionalities based on a neurological understanding of physical and chemical phenomena, faithfully replicated in suitable devices, by detailed control of materials and surface properties at the micro- and nanoscale. Such types of functionalities can be defined by the physical chemistry analysis of different materials properties, to reproduce biological properties such as synaptic plasticity and develop network dynamics responsible for fast learning capabilities of the brain, exploiting different stimuli such as electricity and light, in soft, hard and liquid media, for edge computation in different environments. 
This issue contains a representative set of materials platforms, including the dominant fields of halide perovskites, organic materials, metal oxides, fluidic systems, and ferroelectrics. Many papers are devoted to the synaptical and resistive switching properties of electrical memristors and transistors. Several papers combine electrical and optical properties, or use entirely optical signals, for synapsis and learning functions. Others explore the application at the network level, in ML with neural network circuits and in reservoir computing. The research and perspective papers in this issue show that the neuromorphic materials and devices area is a challenging topic where delicate phenomena as filamentary formation and phase transitions must operate in robust frameworks, providing different levels of complexity, in the response to stimuli for information processing. While many advances and examples have been reported in recent years, the gate is open for significant innovation and an enormous span of applications. This article has not yet been cited by other publications.","PeriodicalId":62,"journal":{"name":"The Journal of Physical Chemistry Letters","volume":"11 1","pages":""},"PeriodicalIF":4.6000,"publicationDate":"2025-04-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"The Journal of Physical Chemistry Letters","FirstCategoryId":"1","ListUrlMain":"https://doi.org/10.1021/acs.jpclett.4c03033","RegionNum":2,"RegionCategory":"化学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"CHEMISTRY, PHYSICAL","Score":null,"Total":0}
引用次数: 0

Abstract

Published as part of The Journal of Physical Chemistry Letters special issue “Materials, Physics, and Chemistry of Neuromorphic Computing Systems”.

In the era of artificial intelligence (AI), the rapid growth of unstructured data has created an urgent need for efficient, high-speed data processing and analysis. Traditional computing systems, rooted in the von Neumann architecture, struggle to keep pace because of inherent limitations, including restricted computational speed and increasing energy consumption. These challenges stem from the separation of processing and memory units, a problem known as the von Neumann bottleneck. To address these issues, researchers have turned to neuromorphic computing, which draws inspiration from the brain’s ability to perform parallel, energy-efficient operations with remarkable processing power and adaptability.

Within the field of AI, notable examples of brain-inspired advances include artificial neural networks (ANNs) and deep learning (DL) neural networks, which are ANNs with several layers that lend themselves to learned feature representations. These have surpassed humans on many tasks, such as pattern recognition, game playing, and machine translation, and the algorithms are being adapted to an ever-increasing range of machine learning (ML) tasks. Spiking neural networks (SNNs) are currently of high interest, both for modeling the neural networks of the brain and for exporting their fast-learning capability and energy efficiency to neuromorphic hardware.

The goal of neuromorphic computing systems is a powerful advance in technology: devices that gather data, analyze it in real time, and autonomously take action based on the information received. A similar concept is found in sensor computing, where sensors not only detect stimuli but also perform data conversion and processing at the point of data collection. This capability, known as in-sensor computing, reduces the need for extensive data transfer and lowers system complexity, allowing connected devices to process information and make decisions locally, at the edge, rather than relying on a centralized system. By enabling faster, more intelligent decision-making at the edge, neuromorphic computing can transform industries and pave the way for a more connected, efficient, and intelligent future across numerous sectors, including healthcare, agriculture, manufacturing, and smart cities.

Here we present the special issue Materials, Physics, and Chemistry of Neuromorphic Computing Systems. There is considerable interest in attaining memory and computation functionalities based on a neurological understanding of physical and chemical phenomena, faithfully replicated in suitable devices through detailed control of materials and surface properties at the micro- and nanoscale. Such functionalities can be defined by physical chemistry analysis of different material properties, to reproduce biological behaviors such as synaptic plasticity and to develop the network dynamics responsible for the fast learning capabilities of the brain, exploiting stimuli such as electricity and light, in soft, hard, and liquid media, for edge computation in different environments.

This issue contains a representative set of materials platforms, including the dominant fields of halide perovskites, organic materials, metal oxides, fluidic systems, and ferroelectrics. Many papers are devoted to the synaptic and resistive switching properties of electrical memristors and transistors.
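To connect these device concepts to the network behaviors discussed next, the following is a minimal, purely illustrative sketch of how a memristor-like conductance can act as a plastic synapse driving a leaky integrate-and-fire spiking neuron. It is not taken from any paper in this issue; the parameter values and the simple potentiation/depression rule are assumptions chosen only to make the idea concrete.

```python
import numpy as np

# All parameter values below are illustrative assumptions, not taken from the editorial.
G_MIN, G_MAX = 1e-6, 1e-4   # conductance bounds of the memristive synapse (siemens)
ETA = 0.05                  # fractional conductance change per programming pulse
V_TH = 1.0                  # firing threshold of the neuron (arbitrary units)
LEAK = 0.05                 # fraction of membrane potential lost per time step
DRIVE = 1.5                 # input scale of a fully potentiated synapse

def potentiate(g):
    """Memristor-like potentiation: each pulse moves g toward G_MAX,
    with the step shrinking as the device saturates (soft bound)."""
    return min(g + ETA * (G_MAX - g), G_MAX)

def depress(g):
    """Memristor-like depression: each pulse moves g toward G_MIN."""
    return max(g - ETA * (g - G_MIN), G_MIN)

def simulate(pre_spikes):
    """Leaky integrate-and-fire neuron driven through one plastic synapse.
    Every presynaptic spike both injects current and potentiates the device."""
    g = G_MIN
    v = 0.0
    post_spike_times = []
    for t, pre in enumerate(pre_spikes):
        if pre:
            g = potentiate(g)                  # write pulse strengthens the synapse
            w = (g - G_MIN) / (G_MAX - G_MIN)  # normalized synaptic weight in [0, 1]
            v += DRIVE * w                     # read pulse injects weighted current
        v *= (1.0 - LEAK)                      # leaky integration of the membrane potential
        if v >= V_TH:
            post_spike_times.append(t)
            v = 0.0                            # reset after firing
    return post_spike_times, g

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pre = rng.random(200) < 0.2                # random presynaptic spike train
    spikes, g_final = simulate(pre)
    print(f"postsynaptic spikes: {len(spikes)}, final conductance: {g_final:.2e} S")
```

As the synapse is repeatedly potentiated, the neuron fires more often for the same input, which is the basic plasticity-to-activity link that device-level studies of memristive synapses aim to reproduce.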
Several papers combine electrical and optical properties, or use entirely optical signals, for synaptic and learning functions. Others explore applications at the network level, in ML with neural network circuits and in reservoir computing. The research and perspective papers in this issue show that the neuromorphic materials and devices area is a challenging topic in which delicate phenomena, such as filament formation and phase transitions, must operate within robust frameworks, providing different levels of complexity in the response to stimuli for information processing. While many advances and examples have been reported in recent years, the door remains open for significant innovation and an enormous span of applications.
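As an illustration of the network-level direction mentioned above, here is a minimal echo state network, a common form of reservoir computing, trained to predict a sine wave one step ahead. The reservoir size, spectral radius, and regularization constant are assumed values chosen for illustration; in the physical reservoir computing discussed in this issue, the random recurrent network is replaced by the intrinsic nonlinear dynamics of a material or device, and only the linear readout is trained.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative hyperparameters (assumed, not from the editorial)
N_RES = 200              # number of reservoir units
SPECTRAL_RADIUS = 0.9    # scaling of the recurrent weights (echo-state property)
RIDGE = 1e-6             # ridge-regression regularization for the readout

# Fixed random input and recurrent weights; only the readout is trained
W_in = rng.uniform(-0.5, 0.5, size=N_RES)
W = rng.uniform(-0.5, 0.5, size=(N_RES, N_RES))
W *= SPECTRAL_RADIUS / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(N_RES)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in * u_t)   # nonlinear recurrent update
        states.append(x.copy())
    return np.array(states)

# Task: one-step-ahead prediction of a sine wave
t = np.arange(0, 600) * 0.1
signal = np.sin(t)
u, y = signal[:-1], signal[1:]            # input and target (shifted by one step)

X = run_reservoir(u)
washout = 50                              # discard the initial transient states
X_tr, y_tr = X[washout:], y[washout:]

# Train the linear readout with ridge regression
W_out = np.linalg.solve(X_tr.T @ X_tr + RIDGE * np.eye(N_RES), X_tr.T @ y_tr)

pred = X_tr @ W_out
print("prediction RMSE:", np.sqrt(np.mean((pred - y_tr) ** 2)))
```

The appeal for neuromorphic hardware is that the expensive part, the recurrent nonlinear dynamics, can be delegated to a physical system, while training reduces to a single linear regression on the recorded states.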
Source Journal

The Journal of Physical Chemistry Letters (CHEMISTRY, PHYSICAL; NANOSCIENCE & NANOTECHNOLOGY)
CiteScore: 9.60
Self-citation rate: 7.00%
Number of articles: 1519
Review time: 1.6 months

Journal scope: The Journal of Physical Chemistry (JPC) Letters is devoted to reporting new and original experimental and theoretical basic research of interest to physical chemists, biophysical chemists, chemical physicists, physicists, material scientists, and engineers. An important criterion for acceptance is that the paper reports a significant scientific advance and/or physical insight such that rapid publication is essential. Two issues of JPC Letters are published each month.