David Patterson, Joseph Gonzalez, Urs Hölzle, Quoc Le, Chen Liang, Lluis-Miquel Munguia, Daniel Rothchild, David So, Maud Texier, Jeff Dean
arXiv:2204.05149 · arXiv - CS - General Literature · Published 2022-04-11
The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink
Machine Learning (ML) workloads have rapidly grown in importance but have raised concerns about their carbon footprint. Four best practices can reduce ML training energy by up to 100x and CO2 emissions by up to 1000x. By following best practices, overall ML energy use (across research, development, and production) held steady at <15% of Google's total energy use for the past three years. If the whole ML field were to adopt best practices, total carbon emissions from training would shrink. Hence, we recommend that ML papers report emissions explicitly, to foster competition on more than just model quality. Estimates of emissions in papers that omitted them have been off by 100x to 100,000x, so publishing emissions has the added benefit of ensuring accurate accounting. Given the importance of climate change, we must get the numbers right to make certain that we work on its biggest challenges.
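The headline reductions compound multiplicatively: training emissions are roughly energy used times grid carbon intensity, where energy itself is the product of run time, processor count, average power, and datacenter overhead (PUE). A minimal sketch of that accounting follows; all numeric values here are illustrative assumptions, not figures from the paper.

```python
def training_co2e_kg(hours, n_processors, avg_power_w, pue, grid_kg_per_kwh):
    """Estimate operational CO2e (kg) for one training run.

    Energy (kWh) = hours * processors * average power (kW) * PUE;
    emissions   = energy * grid carbon intensity (kg CO2e per kWh).
    """
    energy_kwh = hours * n_processors * (avg_power_w / 1000.0) * pue
    return energy_kwh * grid_kg_per_kwh


# Hypothetical baseline run on power-hungry hardware in a
# carbon-intensive region with an inefficient datacenter.
baseline = training_co2e_kg(hours=240, n_processors=512,
                            avg_power_w=300, pue=1.6,
                            grid_kg_per_kwh=0.45)

# Same workload after applying the kinds of levers the paper
# describes: a more efficient model (shorter run), better perf/W
# hardware, a low-overhead datacenter, and a low-carbon region.
improved = training_co2e_kg(hours=24, n_processors=512,
                            avg_power_w=150, pue=1.1,
                            grid_kg_per_kwh=0.05)

print(f"baseline: {baseline:.0f} kg CO2e")
print(f"improved: {improved:.0f} kg CO2e")
print(f"reduction: {baseline / improved:.0f}x")
```

Because each factor enters the product independently, modest per-factor gains (10x, 2x, 1.5x, 9x here) multiply into a combined reduction of a few hundredfold, which is why the paper's best practices can stack to 1000x for CO2.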