{"title":"Energy transparency from hardware to software","authors":"K. Eder","doi":"10.1109/E3S.2013.6705855","DOIUrl":null,"url":null,"abstract":"From mobile devices to data centres, energy usage in computing continues to rise and is now a significant part of global energy consumption. Increasing the energy efficiency of computation is a major concern in electronic system engineering and high on the research agenda worldwide. While hardware can be designed to save a modest amount of energy, the potential for savings are far greater at the higher levels of abstraction in the system stack. The greatest savings are expected from energy consumption-aware software. This is because, although energy is consumed by the hardware executing computations, the control over the computation ultimately lies within the software, algorithms and data, i.e. the applications running on the hardware. Experts from Intel [1] expect software that takes full control of the energy-saving features provided by hardware can save three to five times of what conventional software is achieving. Moreover, algorithm selection is critically important - not only does the algorithm need to be the most suitable for solving the problem; it also needs to be a good fit to the hardware [2]. The challenge of energy-efficient computing, therefore, requires understanding the entire system stack, from algorithms and data, down to the computational hardware. Over the last decades, however, software engineering has been moved away from the operation of the hardware through the introduction of several layers of abstraction. While these have many benefits, including portability, increased programmer productivity, and software reuse across hardware platforms, the clear drawback is that many software engineers are now \"blissfully unaware\" of how algorithms and data, and their respective encoding, influence the energy consumption of a computation when executed on hardware.","PeriodicalId":231837,"journal":{"name":"2013 Third Berkeley Symposium on Energy Efficient Electronic Systems (E3S)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 Third Berkeley Symposium on Energy Efficient Electronic Systems (E3S)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/E3S.2013.6705855","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
From mobile devices to data centres, energy usage in computing continues to rise and now accounts for a significant share of global energy consumption. Increasing the energy efficiency of computation is a major concern in electronic system engineering and is high on the research agenda worldwide. While hardware can be designed to save a modest amount of energy, the potential for savings is far greater at the higher levels of abstraction in the system stack. The greatest savings are expected from energy consumption-aware software. This is because, although energy is consumed by the hardware executing computations, control over the computation ultimately lies in the software, algorithms and data, i.e. the applications running on the hardware. Experts from Intel [1] expect that software which takes full control of the energy-saving features provided by hardware can save three to five times as much energy as conventional software achieves. Moreover, algorithm selection is critically important: not only does the algorithm need to be the most suitable for solving the problem; it also needs to be a good fit for the hardware [2]. The challenge of energy-efficient computing therefore requires understanding the entire system stack, from algorithms and data down to the computational hardware. Over the last decades, however, software engineering has moved away from the operation of the hardware through the introduction of several layers of abstraction. While these layers have many benefits, including portability, increased programmer productivity, and software reuse across hardware platforms, the clear drawback is that many software engineers are now "blissfully unaware" of how algorithms and data, and their respective encodings, influence the energy consumed when a computation is executed on hardware.
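To make the idea of energy transparency for software concrete, the sketch below shows one way a developer could observe the energy cost of two functionally equivalent algorithm variants on commodity hardware. It is not from the paper: it assumes a Linux machine exposing Intel RAPL package-energy counters through the powercap sysfs interface, and the two example functions are purely illustrative.

```python
# Minimal sketch (assumption: Linux with the intel_rapl powercap driver and
# read permission on the counter). Reads cumulative package energy before and
# after running a function, so the difference approximates the energy used.
import time

RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package-level counter (assumed path)

def read_energy_uj() -> int:
    """Read the cumulative package energy counter in microjoules."""
    with open(RAPL_ENERGY) as f:
        return int(f.read())

def measure(fn, *args):
    """Return (result, joules, seconds) for one call of fn.

    Note: the counter is system-wide and eventually wraps around, so this is
    only a rough, whole-package estimate for short, otherwise idle runs.
    """
    e0, t0 = read_energy_uj(), time.perf_counter()
    result = fn(*args)
    e1, t1 = read_energy_uj(), time.perf_counter()
    return result, (e1 - e0) / 1e6, t1 - t0

# Two equivalent ways to sum squares: one materialises an intermediate list,
# the other streams values, so their time and energy footprints can differ.
def sum_squares_list(n):
    return sum([i * i for i in range(n)])

def sum_squares_gen(n):
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    for fn in (sum_squares_list, sum_squares_gen):
        _, joules, secs = measure(fn, 10_000_000)
        print(f"{fn.__name__}: {joules:.2f} J in {secs:.2f} s")
```

Even such a crude measurement loop illustrates the abstract's point: the choice of algorithm and data representation, not just the hardware, determines how much energy a computation consumes.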