Toward Switching and Fusing Neuromorphic Computing: Vertical Bulk Heterojunction Transistors with Multi-Neuromorphic Functions for Efficient Deep Learning
Yi Zou, Di Liu, Xinyan Gan, Rengjian Yu, Xianghong Zhang, Chansong Gao, Zhenjia Chen, Chenhui Xu, Yun Ye, Yuanyuan Hu, Tailiang Guo, Huipeng Chen
DOI: 10.1002/adma.202419245
URL: https://advanced.onlinelibrary.wiley.com/doi/10.1002/adma.202419245
Journal: Advanced Materials, vol. 37, no. 27 (JCR Q1, Chemistry, Multidisciplinary; impact factor 26.8)
Publication date: 2025-04-24 (Journal Article)
Citations: 0
Abstract
The combination of artificial neural networks (ANNs) and spiking neural networks (SNNs) holds great promise for advancing artificial general intelligence (AGI). However, reported ANN and SNN computational architectures are independent and require a large number of auxiliary circuits and external algorithms for fusion training. Here, a novel vertical bulk heterojunction neuromorphic transistor (VHNT) capable of emulating both ANN and SNN computational functions is presented. TaOx-based electrochemical reactions and PDVT-10/N2200-based bulk heterojunctions are used to realize spike coding and voltage coding, respectively. Notably, the device is remarkably efficient, consuming a mere 0.84 nJ per multiply-accumulate (MAC) operation while maintaining excellent linearity. Moreover, the device can be switched between spiking-neuron and self-activation-neuron modes simply by changing the programming, without auxiliary circuits. Finally, a VHNT-based artificial spiking neural network (ASNN) fusion simulation architecture is demonstrated, achieving 95% accuracy on the Canadian Institute For Advanced Research-10 (CIFAR-10) dataset while significantly enhancing training speed and efficiency. This work proposes a novel device strategy for developing high-performance, low-power, and environmentally adaptive AGI.
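The two computational modes the abstract attributes to the VHNT can be illustrated with a toy software sketch: ANN mode performs analog multiply-accumulate (MAC) operations under voltage coding, while SNN mode behaves like a leaky integrate-and-fire (LIF) neuron under spike coding. This is purely illustrative and not the authors' model: the 0.84 nJ/MAC figure comes from the abstract, but the weights, threshold, and leak factor below are arbitrary assumptions.

```python
def ann_mode_mac(inputs, weights, energy_per_mac_nj=0.84):
    """ANN mode: one MAC per input-weight pair.

    Returns the weighted sum and an energy estimate in nJ, using the
    0.84 nJ/MAC value reported in the abstract.
    """
    total = sum(x * w for x, w in zip(inputs, weights))
    energy_nj = energy_per_mac_nj * len(inputs)
    return total, energy_nj


def snn_mode_lif(spike_train, threshold=1.0, leak=0.9):
    """SNN mode: a toy leaky integrate-and-fire neuron.

    Integrates the input at each time step with exponential leak;
    fires and resets when the membrane potential crosses threshold.
    Returns the time indices of output spikes.
    """
    membrane = 0.0
    out_spikes = []
    for t, s in enumerate(spike_train):
        membrane = membrane * leak + s   # leak, then integrate the input
        if membrane >= threshold:        # fire and reset
            out_spikes.append(t)
            membrane = 0.0
    return out_spikes


# "Switching" between modes here is just a function call; in the device it is
# done by reprogramming, without auxiliary circuits.
total, energy = ann_mode_mac([0.5, 1.0, -0.2], [0.3, 0.4, 1.0])
spikes = snn_mode_lif([0.6, 0.6, 0.0, 0.6, 0.6])
```

The membrane never carries charge past a firing event in this sketch (hard reset); the paper's device physics (electrochemical TaOx dynamics) would determine the actual integration and reset behavior.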
About the Journal
Advanced Materials, one of the world's most prestigious journals and the foundation of the Advanced portfolio, has been the home of choice for best-in-class materials science for more than 30 years. Covering this fast-growing and interdisciplinary field, the journal publishes the most important discoveries across all materials from materials scientists, chemists, physicists, and engineers, as well as health and life scientists, bringing readers the latest results and trends in modern materials-related research every week.