{"title":"Materials, Physics, and Chemistry of Neuromorphic Computing Systems","authors":"Juan Bisquert","doi":"10.1021/acs.jpclett.4c03033","DOIUrl":null,"url":null,"abstract":"Published as part of <i>The Journal of Physical Chemistry Letters</i> special issue “Materials, Physics, and Chemistry of Neuromorphic Computing Systems”. In the era of artificial intelligence (AI), the rapid growth of unstructured data has created an urgent need for efficient, high-speed data processing and analysis. Traditional computing systems, rooted in the von Neumann architecture, struggle to keep pace due to inherent limitations, including restricted computational speed and increasing energy consumption. These challenges stem from the separation of processing and memory units, a problem known as the von Neumann bottleneck. To address these issues, researchers have turned to neuromorphic computing, which draws inspiration from the brain’s ability to perform parallel, energy-efficient operations with remarkable processing power and adaptability. Within the field of AI, notable examples of brain-inspired advances include artificial neural networks (ANNs) and deep learning (DL) neural networks, which are ANNs with several layers that lend themselves to learned feature representations. These have surpassed humans on many tasks such as pattern recognition, game playing, machine translation, and more. The algorithms are adapted to an ever-increasing range of machine learning (ML) tasks. Spiking neural networks (SNN) are of high current interest, both from the perspective of modeling neural networks of the brain and for exporting their fast-learning capability and energy efficiency into neuromorphic hardware. The goal of neuromorphic computational systems is a powerful advancement in technology that allows devices to gather data, analyze it in real time, and autonomously take actions based on the information received. 
A similar concept can be found in sensor computing, where sensors not only detect stimuli but also perform data conversion and processing at the point of data collection. This capability, known as in-sensor computing, reduces the need for extensive data transfer and system complexity, allowing connected devices to process information and make decisions locally─at the edge─rather than relying on a centralized system. By enabling faster, more intelligent decision-making at the edge, neuromorphic computing can transform industries and pave the way for a more connected, efficient, and intelligent future across numerous sectors, including healthcare, agriculture, manufacturing, and smart cities. Here we present the special issue Materials, Physics and Chemistry of Neuromorphic Computing Systems. There is considerable interest in attaining memory and computation functionalities based on a neurological understanding of physical and chemical phenomena, faithfully replicated in suitable devices, by detailed control of materials and surface properties at the micro- and nanoscale. Such types of functionalities can be defined by the physical chemistry analysis of different materials properties, to reproduce biological properties such as synaptic plasticity and develop network dynamics responsible for fast learning capabilities of the brain, exploiting different stimuli such as electricity and light, in soft, hard and liquid media, for edge computation in different environments. This issue contains a representative set of materials platforms, including the dominant fields of halide perovskites, organic materials, metal oxides, fluidic systems, and ferroelectrics. Many papers are devoted to the synaptical and resistive switching properties of electrical memristors and transistors. Several papers combine electrical and optical properties, or use entirely optical signals, for synapsis and learning functions. 
Others explore the application at the network level, in ML with neural network circuits and in reservoir computing. The research and perspective papers in this issue show that the neuromorphic materials and devices area is a challenging topic where delicate phenomena as filamentary formation and phase transitions must operate in robust frameworks, providing different levels of complexity, in the response to stimuli for information processing. While many advances and examples have been reported in recent years, the gate is open for significant innovation and an enormous span of applications. This article has not yet been cited by other publications.","PeriodicalId":62,"journal":{"name":"The Journal of Physical Chemistry Letters","volume":"11 1","pages":""},"PeriodicalIF":4.6000,"publicationDate":"2025-04-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"The Journal of Physical Chemistry Letters","FirstCategoryId":"1","ListUrlMain":"https://doi.org/10.1021/acs.jpclett.4c03033","RegionNum":2,"RegionCategory":"化学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"CHEMISTRY, PHYSICAL","Score":null,"Total":0}
Citations: 0
Abstract
Published as part of The Journal of Physical Chemistry Letters special issue “Materials, Physics, and Chemistry of Neuromorphic Computing Systems”. In the era of artificial intelligence (AI), the rapid growth of unstructured data has created an urgent need for efficient, high-speed data processing and analysis. Traditional computing systems, rooted in the von Neumann architecture, struggle to keep pace due to inherent limitations, including restricted computational speed and increasing energy consumption. These challenges stem from the separation of processing and memory units, a problem known as the von Neumann bottleneck. To address these issues, researchers have turned to neuromorphic computing, which draws inspiration from the brain’s ability to perform parallel, energy-efficient operations with remarkable processing power and adaptability. Within the field of AI, notable examples of brain-inspired advances include artificial neural networks (ANNs) and deep learning (DL) neural networks, which are ANNs with several layers that lend themselves to learned feature representations. These have surpassed humans on tasks such as pattern recognition, game playing, and machine translation. The algorithms are adapted to an ever-increasing range of machine learning (ML) tasks. Spiking neural networks (SNNs) are of high current interest, both for modeling the neural networks of the brain and for exporting their fast-learning capability and energy efficiency into neuromorphic hardware. The goal of neuromorphic computing systems is technology that allows devices to gather data, analyze them in real time, and act autonomously on the information received. A similar concept can be found in sensor computing, where sensors not only detect stimuli but also perform data conversion and processing at the point of data collection. 
This capability, known as in-sensor computing, reduces the need for extensive data transfer and system complexity, allowing connected devices to process information and make decisions locally─at the edge─rather than relying on a centralized system. By enabling faster, more intelligent decision-making at the edge, neuromorphic computing can transform industries and pave the way for a more connected, efficient, and intelligent future across numerous sectors, including healthcare, agriculture, manufacturing, and smart cities. Here we present the special issue Materials, Physics, and Chemistry of Neuromorphic Computing Systems. There is considerable interest in attaining memory and computation functionalities based on a neurological understanding of physical and chemical phenomena, faithfully replicated in suitable devices, through detailed control of materials and surface properties at the micro- and nanoscale. Such functionalities can be defined through physical chemistry analysis of materials properties, to reproduce biological behaviors such as synaptic plasticity and to develop the network dynamics responsible for the brain’s fast learning, exploiting stimuli such as electricity and light, in soft, hard, and liquid media, for edge computation in different environments. This issue contains a representative set of materials platforms, including the dominant fields of halide perovskites, organic materials, metal oxides, fluidic systems, and ferroelectrics. Many papers are devoted to the synaptic and resistive switching properties of electrical memristors and transistors. Several papers combine electrical and optical properties, or use entirely optical signals, for synaptic and learning functions. Others explore applications at the network level, in ML with neural network circuits and in reservoir computing. 
The research and perspective papers in this issue show that neuromorphic materials and devices form a challenging area in which delicate phenomena, such as filament formation and phase transitions, must operate within robust frameworks, providing different levels of complexity in the response to stimuli for information processing. While many advances and examples have been reported in recent years, the door remains open for significant innovation and an enormous range of applications. This article has not yet been cited by other publications.
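The spiking neural networks mentioned in the abstract build on spiking neuron models such as the leaky integrate-and-fire (LIF) neuron, whose event-driven operation underlies the energy efficiency that neuromorphic hardware aims to exploit. The sketch below is purely illustrative: the function name and all parameter values are assumptions for demonstration, not taken from any paper in the issue.

```python
# Minimal sketch of a leaky integrate-and-fire (LIF) neuron, the basic unit of
# many spiking neural networks (SNNs). All parameters are illustrative.

def lif_run(inputs, tau=0.9, v_threshold=1.0, v_reset=0.0):
    """Simulate one LIF neuron over a sequence of input currents.

    At each step the membrane potential leaks (decays by factor tau) and
    integrates the input; when it crosses the threshold the neuron emits a
    spike (1) and resets, otherwise it emits no spike (0).
    """
    v = 0.0
    spikes = []
    for i in inputs:
        v = tau * v + i          # leak, then integrate the input current
        if v >= v_threshold:     # threshold crossing: fire a spike
            spikes.append(1)
            v = v_reset          # reset the membrane potential
        else:
            spikes.append(0)
    return spikes


# A constant subthreshold input makes the neuron fire only after the
# potential has accumulated over several steps:
print(lif_run([0.6, 0.6, 0.6]))  # → [0, 1, 0]
```

Because information is carried by sparse spike events rather than continuous activations, hardware implementations of such neurons can remain idle (and consume little energy) between spikes, which is the efficiency argument sketched in the abstract.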
About the journal:
The Journal of Physical Chemistry (JPC) Letters is devoted to reporting new and original experimental and theoretical basic research of interest to physical chemists, biophysical chemists, chemical physicists, physicists, materials scientists, and engineers. An important criterion for acceptance is that the paper reports a significant scientific advance and/or physical insight such that rapid publication is essential. Two issues of JPC Letters are published each month.