{"title":"高效的大型语言模型与模拟内存计算。","authors":"Anand Subramoney","doi":"10.1038/s43588-024-00760-y","DOIUrl":null,"url":null,"abstract":"A recent study demonstrates through numerical simulations that implementing large language models based on sparse mixture-of-experts architectures on 3D in-memory computing technologies can substantially reduce energy consumption.","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":"5 1","pages":"5-6"},"PeriodicalIF":12.0000,"publicationDate":"2025-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Efficient large language model with analog in-memory computing\",\"authors\":\"Anand Subramoney\",\"doi\":\"10.1038/s43588-024-00760-y\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A recent study demonstrates through numerical simulations that implementing large language models based on sparse mixture-of-experts architectures on 3D in-memory computing technologies can substantially reduce energy consumption.\",\"PeriodicalId\":74246,\"journal\":{\"name\":\"Nature computational science\",\"volume\":\"5 1\",\"pages\":\"5-6\"},\"PeriodicalIF\":12.0000,\"publicationDate\":\"2025-01-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Nature computational science\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.nature.com/articles/s43588-024-00760-y\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Nature computational science","FirstCategoryId":"1085","ListUrlMain":"https://www.nature.com/articles/s43588-024-00760-y","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Efficient large language model with analog in-memory computing
A recent study demonstrates through numerical simulations that implementing large language models based on sparse mixture-of-experts architectures on 3D in-memory computing technologies can substantially reduce energy consumption.
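To illustrate the sparse mixture-of-experts idea the study builds on, below is a minimal sketch of top-k expert routing for a single token, using only NumPy and hypothetical toy dimensions (d_model, d_ff, n_experts, top_k). It is not the authors' implementation and says nothing about the in-memory hardware mapping; it only shows why such layers are sparse: each token activates just a few experts rather than the whole feed-forward block.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions, chosen for illustration only.
d_model, d_ff, n_experts, top_k = 8, 16, 4, 2

# Each expert is a small two-layer feed-forward network.
experts = [
    (rng.standard_normal((d_model, d_ff)) * 0.1,
     rng.standard_normal((d_ff, d_model)) * 0.1)
    for _ in range(n_experts)
]
# The router projects a token onto one logit per expert.
router = rng.standard_normal((d_model, n_experts)) * 0.1

def moe_layer(x):
    """Route a single token vector x to its top-k experts and combine
    their outputs, weighted by softmax scores over the selected experts."""
    logits = x @ router
    top = np.argsort(logits)[-top_k:]              # indices of the k best-scoring experts
    weights = np.exp(logits[top] - logits[top].max())
    weights /= weights.sum()                        # softmax restricted to the chosen experts
    out = np.zeros_like(x)
    for w, i in zip(weights, top):
        w1, w2 = experts[i]
        out += w * (np.maximum(x @ w1, 0.0) @ w2)   # ReLU feed-forward expert
    return out

token = rng.standard_normal(d_model)
print(moe_layer(token))
```

Because only top_k of the n_experts weight matrices are touched per token, most expert weights stay idle on any given forward pass, which is the property that pairs naturally with the non-volatile, stationary-weight storage of analog in-memory computing discussed in the study.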