Supersonic: Learning to Generate Source Code Optimizations in C/C++
Zimin Chen; Sen Fang; Martin Monperrus
IEEE Transactions on Software Engineering, vol. 50, no. 11, pp. 2849-2864
Published: 2024-07-22. DOI: 10.1109/TSE.2024.3423769
https://ieeexplore.ieee.org/document/10606318/
Abstract: Software optimization refines programs for resource efficiency while preserving functionality. Traditionally, it is a process done by developers and compilers. This paper introduces a third option: automated optimization at the source code level. We present Supersonic, a neural approach targeting minor source code modifications for optimization. Using a seq2seq model, Supersonic is trained on C/C++ program pairs ($x_{t}$, $x_{t+1}$), where $x_{t+1}$ is an optimized version of $x_{t}$, and outputs a diff. Supersonic's performance is benchmarked against OpenAI's GPT-3.5-Turbo and GPT-4 on competitive programming tasks. The experiments show that Supersonic not only outperforms both models on the code optimization task but also minimizes the extent of the change, with a model more than 600x smaller than GPT-3.5-Turbo and 3700x smaller than GPT-4.
About the journal:
IEEE Transactions on Software Engineering seeks contributions comprising well-defined theoretical results and empirical studies with potential impacts on software construction, analysis, or management. The scope of this Transactions extends from fundamental mechanisms to the development of principles and their application in specific environments. Specific topic areas include:
a) Development and maintenance methods and models: Techniques and principles for specifying, designing, and implementing software systems, encompassing notations and process models.
b) Assessment methods: Software tests, validation, reliability models, test and diagnosis procedures, software redundancy, design for error control, and measurements and evaluation of process and product aspects.
c) Software project management: Productivity factors, cost models, schedule and organizational issues, and standards.
d) Tools and environments: Specific tools, integrated tool environments, associated architectures, databases, and parallel and distributed processing issues.
e) System issues: Hardware-software trade-offs.
f) State-of-the-art surveys: Syntheses and comprehensive reviews of the historical development within specific areas of interest.