{"title":"Prospects and challenges of electrochemical random-access memory for deep-learning accelerators","authors":"Jinsong Cui , Haoran Liu , Qing Cao","doi":"10.1016/j.cossms.2024.101187","DOIUrl":null,"url":null,"abstract":"<div><p>The ever-expanding capabilities of machine learning are powered by exponentially growing complexity of deep neural network (DNN) models, requiring more energy and chip-area efficient hardware to carry out increasingly computational expensive model-inference and training tasks. Electrochemical random-access memories (ECRAMs) are developed specifically to implement efficient analog in-memory computing for these data-intensive workloads, showing some critical advantages over competing memory technologies mostly developed originally for digital electronics. ECRAMs possess the distinctive capability to switch between a very large number of memristive states with a high level of symmetry, small cycle-to-cycle variability, and low energy consumption; and they simultaneously exhibit good endurance, long data retention, fast switching speed up to nanoseconds, and verified scalability down to sub-50 nm regime, therefore holding great promise in realizing deep-learning accelerators when heterogeneously integrated with silicon-based peripheral circuits. In this review, we first examine challenges in constructing in-memory-computing accelerators and unique advantages of ECRAMs. We then critically assess the various ionic species, channel materials, and solid-state electrolytes employed in ECRAMs that influence device programming characteristics and performance metrics with their different memristive modulation and ionic transport mechanisms. Furthermore, ECRAM device engineering and integration schemes are discussed, within the context of their implementation in high-density pseudo-crossbar array microarchitectures for performing DNN inference and training with high parallelism. 
Finally, we offer our insights regarding major remaining obstacles and emerging opportunities of harnessing ECRAMs to realize deep-learning accelerators through material-device-circuit-architecture-algorithm co-design.</p></div>","PeriodicalId":295,"journal":{"name":"Current Opinion in Solid State & Materials Science","volume":"32 ","pages":"Article 101187"},"PeriodicalIF":12.2000,"publicationDate":"2024-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S1359028624000536/pdfft?md5=33fbb3f4ff0a27b0a69d3fcaa7f064ea&pid=1-s2.0-S1359028624000536-main.pdf","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Current Opinion in Solid State & Materials Science","FirstCategoryId":"88","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1359028624000536","RegionNum":2,"RegionCategory":"材料科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATERIALS SCIENCE, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0
Abstract
The ever-expanding capabilities of machine learning are powered by the exponentially growing complexity of deep neural network (DNN) models, which requires more energy- and chip-area-efficient hardware to carry out increasingly computationally expensive model-inference and training tasks. Electrochemical random-access memories (ECRAMs) have been developed specifically to implement efficient analog in-memory computing for these data-intensive workloads, and they show critical advantages over competing memory technologies that were mostly developed for digital electronics. ECRAMs possess the distinctive capability to switch among a very large number of memristive states with a high level of symmetry, small cycle-to-cycle variability, and low energy consumption; they simultaneously exhibit good endurance, long data retention, switching speeds as fast as nanoseconds, and verified scalability down to the sub-50 nm regime, and therefore hold great promise for realizing deep-learning accelerators when heterogeneously integrated with silicon-based peripheral circuits. In this review, we first examine the challenges in constructing in-memory-computing accelerators and the unique advantages of ECRAMs. We then critically assess the various ionic species, channel materials, and solid-state electrolytes employed in ECRAMs, whose different memristive-modulation and ionic-transport mechanisms influence device programming characteristics and performance metrics. Furthermore, ECRAM device engineering and integration schemes are discussed in the context of their implementation in high-density pseudo-crossbar array microarchitectures for performing DNN inference and training with high parallelism. Finally, we offer our insights into the major remaining obstacles to, and the emerging opportunities for, harnessing ECRAMs to realize deep-learning accelerators through material-device-circuit-architecture-algorithm co-design.
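The in-memory-computing principle the abstract refers to can be sketched numerically. The following is an illustrative simulation, not code from the review: each DNN weight is assumed to be stored as one of a finite number of ECRAM conductance states, and a crossbar column sums the currents produced by the row voltages (Ohm's and Kirchhoff's laws), so a full matrix-vector multiply completes in a single analog step. The level count and conductance range are hypothetical placeholders.

```python
import numpy as np

# Illustrative sketch (assumed parameters, not from the review): an ECRAM
# pseudo-crossbar stores each weight W[i, j] as a device conductance
# G[i, j]. Applying read voltages V on the rows yields column currents
# I = G^T @ V in one analog step.

rng = np.random.default_rng(0)

n_rows, n_cols = 4, 3
levels = 256                   # assume ~256 distinguishable memristive states
g_min, g_max = 1e-6, 1e-4      # assumed conductance range, in siemens

# Target weights in [0, 1], quantized onto the available conductance levels.
weights = rng.random((n_rows, n_cols))
codes = np.round(weights * (levels - 1))
conductances = g_min + (g_max - g_min) * codes / (levels - 1)

voltages = rng.random(n_rows) * 0.2        # small read voltages (volts)
currents = conductances.T @ voltages       # analog MVM: one "step" in hardware

# Ideal (unquantized) result, for comparison with the analog readout.
ideal = (g_min + (g_max - g_min) * weights).T @ voltages
error = np.max(np.abs(currents - ideal))
print(error)
```

Because every column current accumulates in parallel, the O(n_rows * n_cols) multiply-accumulate work happens in constant time in hardware; the residual `error` here reflects only the finite number of programmable conductance states.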
Journal Overview: Current Opinion in Solid State & Materials Science
Aims to provide a snapshot of the latest research and advances in materials science
Publishes six issues per year, each containing reviews covering exciting and developing areas of materials science
Each issue comprises 2-3 sections of reviews written by invited international researchers who are experts in their fields
Provides materials scientists with the opportunity to stay informed about current developments in their own and related areas of research
Promotes cross-fertilization of ideas across an increasingly interdisciplinary field