Hadi Ghaemi, Zakieh Alizadehsani, Amin Shahraki, Juan M. Corchado
{"title":"Transformers in source code generation: A comprehensive survey","authors":"Hadi Ghaemi , Zakieh Alizadehsani , Amin Shahraki , Juan M. Corchado","doi":"10.1016/j.sysarc.2024.103193","DOIUrl":null,"url":null,"abstract":"<div><p>Transformers have revolutionized natural language processing (NLP) and have had a huge impact on automating tasks. Recently, transformers have led to the development of powerful large language models (LLMs), which have advanced automatic code generation. This study provides a review of code generation concepts and transformer applications in this field. First, the fundamental concepts of the attention mechanism embedded into transformers are explored. Then, predominant automated code generation approaches are briefly reviewed, including non-learning code generation (e.g., rule-based), shallow learning (e.g., heuristic rules, grammar-based), and deep learning models. Afterward, this survey reviews pre-training and fine-tuning techniques for code generation, focusing on the application of efficient transformer methods such as parameter-efficient tuning, instruction tuning, and prompt tuning. Additionally, this work briefly outlines resources for code generation (e.g., datasets, benchmarks, packages) and evaluation metrics utilized in code generation processes. 
Finally, the challenges and potential research directions (e.g., multimodal learning) are investigated in depth.</p></div>","PeriodicalId":50027,"journal":{"name":"Journal of Systems Architecture","volume":"153 ","pages":"Article 103193"},"PeriodicalIF":3.7000,"publicationDate":"2024-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Systems Architecture","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1383762124001309","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, HARDWARE & ARCHITECTURE","Score":null,"Total":0}
Citation count: 0
Abstract
Transformers have revolutionized natural language processing (NLP) and have had a major impact on task automation. Recently, transformers have led to the development of powerful large language models (LLMs), which have advanced automatic code generation. This study provides a review of code generation concepts and transformer applications in this field. First, the fundamental concepts of the attention mechanism embedded in transformers are explored. Then, predominant automated code generation approaches are briefly reviewed, including non-learning code generation (e.g., rule-based), shallow learning (e.g., heuristic rules, grammar-based), and deep learning models. Afterward, this survey reviews pre-training and fine-tuning techniques for code generation, focusing on the application of efficient transformer methods such as parameter-efficient tuning, instruction tuning, and prompt tuning. Additionally, this work briefly outlines resources for code generation (e.g., datasets, benchmarks, packages) and evaluation metrics utilized in code generation processes. Finally, the challenges and potential research directions (e.g., multimodal learning) are investigated in depth.
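The attention mechanism the survey takes as its starting point can be illustrated with a minimal sketch. The snippet below is not from the paper; it is a plain-Python rendering of standard scaled dot-product attention, softmax(QK^T / sqrt(d_k))·V, using lists of floats so that no external libraries are assumed.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention over row-vector lists Q, K, V."""
    d_k = len(K[0])  # key dimensionality, used for the 1/sqrt(d_k) scaling
    outputs = []
    for q in Q:
        # similarity of this query against every key, scaled by sqrt(d_k)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # output row = attention-weighted combination of the value rows
        outputs.append([sum(w * v[j] for w, v in zip(weights, V))
                        for j in range(len(V[0]))])
    return outputs

# With identical keys, every value row is weighted equally,
# so the output is the mean of the value rows.
out = attention(Q=[[1.0, 0.0]],
                K=[[0.0, 0.0], [0.0, 0.0]],
                V=[[1.0, 0.0], [3.0, 0.0]])
print(out)  # [[2.0, 0.0]]
```

In a real transformer this operation runs over learned linear projections of token embeddings, in parallel across multiple heads; the sketch keeps only the core weighting step.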
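Among the evaluation metrics the survey covers, a widely used one for functional correctness is pass@k: the probability that at least one of k sampled programs passes all unit tests. A short sketch of the standard unbiased estimator (n samples generated, c of them correct) is shown below; the function name is mine, but the formula 1 − C(n−c, k)/C(n, k) is the conventional one.

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator: n generated samples, c correct, k drawn."""
    if n - c < k:
        # fewer incorrect samples than k draws: some draw must be correct
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)

# 2 samples, 1 correct: a single draw succeeds half the time
print(pass_at_k(n=2, c=1, k=1))   # 0.5
# no correct samples at all
print(pass_at_k(n=10, c=0, k=5))  # 0.0
```

Computing the ratio of binomial coefficients directly, rather than averaging over repeated random draws, keeps the estimate exact and deterministic.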
Journal overview:
The Journal of Systems Architecture: Embedded Software Design (JSA) is a journal covering all design and architectural aspects related to embedded systems and software. It ranges from the microarchitecture level via the system software level up to the application-specific architecture level. Aspects such as real-time systems, operating systems, FPGA programming, programming languages, communications (limited to analysis and the software stack), mobile systems, parallel and distributed architectures as well as additional subjects in the computer and system architecture area will fall within the scope of this journal. Technology will not be a main focus, but its use and relevance to particular designs will be. Case studies are welcome but must contribute more than just a design for a particular piece of software.
Design automation of such systems, including methodologies, techniques, and tools for their design, as well as novel designs of software components, falls within the scope of this journal. Novel applications that use embedded systems are also central to this journal. While hardware is not a part of this journal, hardware/software co-design methods that consider the interplay between software and hardware components, with an emphasis on software, are also relevant here.