{"title":"Combining Neural Architecture Search and Automatic Code Optimization: A Survey","authors":"Inas Bachiri, Hadjer Benmeziane, Smail Niar, Riyadh Baghdadi, Hamza Ouarnoughi, Abdelkrime Aries","doi":"arxiv-2408.04116","DOIUrl":null,"url":null,"abstract":"Deep Learning models have experienced exponential growth in complexity and\nresource demands in recent years. Accelerating these models for efficient\nexecution on resource-constrained devices has become more crucial than ever.\nTwo notable techniques employed to achieve this goal are Hardware-aware Neural\nArchitecture Search (HW-NAS) and Automatic Code Optimization (ACO). HW-NAS\nautomatically designs accurate yet hardware-friendly neural networks, while ACO\ninvolves searching for the best compiler optimizations to apply on neural\nnetworks for efficient mapping and inference on the target hardware. This\nsurvey explores recent works that combine these two techniques within a single\nframework. We present the fundamental principles of both domains and\ndemonstrate their sub-optimality when performed independently. We then\ninvestigate their integration into a joint optimization process that we call\nHardware Aware-Neural Architecture and Compiler Optimizations co-Search\n(NACOS).","PeriodicalId":501197,"journal":{"name":"arXiv - CS - Programming Languages","volume":"11 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-08-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Programming Languages","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2408.04116","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Deep Learning models have experienced exponential growth in complexity and resource demands in recent years. Accelerating these models for efficient execution on resource-constrained devices has become more crucial than ever. Two notable techniques employed to achieve this goal are Hardware-aware Neural Architecture Search (HW-NAS) and Automatic Code Optimization (ACO). HW-NAS automatically designs accurate yet hardware-friendly neural networks, while ACO involves searching for the best compiler optimizations to apply to neural networks for efficient mapping and inference on the target hardware. This survey explores recent works that combine these two techniques within a single framework. We present the fundamental principles of both domains and demonstrate their sub-optimality when performed independently. We then investigate their integration into a joint optimization process that we call Hardware-Aware Neural Architecture and Compiler Optimizations co-Search (NACOS).
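
To make the idea of joint co-search concrete, below is a minimal, purely illustrative Python sketch of the optimization loop the abstract describes. All names and cost models here (`ARCHITECTURES`, `COMPILER_SCHEDULES`, `estimate_accuracy`, `measure_latency`) are hypothetical stand-ins invented for this example; actual NACOS frameworks surveyed in the paper train candidate networks and compile and time them on real hardware. The point the sketch captures is that each architecture is scored under its best compiler schedule, rather than the two searches being run independently.

```python
import itertools
import random

# Hypothetical search spaces (illustrative only; real frameworks derive
# these from the target hardware and the compiler's schedule space).
ARCHITECTURES = [
    {"depth": d, "width": w} for d in (8, 12, 16) for w in (32, 64)
]
COMPILER_SCHEDULES = [
    {"tile": t, "unroll": u} for t in (8, 16, 32) for u in (1, 2, 4)
]

def estimate_accuracy(arch):
    """Stand-in for training/evaluating a candidate network (HW-NAS side)."""
    return 0.7 + 0.01 * arch["depth"] + 0.001 * arch["width"] \
        + random.uniform(-0.01, 0.01)

def measure_latency(arch, sched):
    """Stand-in for compiling with a schedule and timing it (ACO side)."""
    work = arch["depth"] * arch["width"]          # toy proxy for FLOPs
    speedup = sched["tile"] * sched["unroll"] ** 0.5
    return work / speedup + random.uniform(0.0, 0.5)

def co_search(latency_budget_ms=50.0):
    """Joint search: every (architecture, schedule) pair is evaluated
    together, so the latency constraint reflects the compiled network,
    not an architecture under some fixed default compilation."""
    best = None
    for arch, sched in itertools.product(ARCHITECTURES, COMPILER_SCHEDULES):
        lat = measure_latency(arch, sched)
        if lat > latency_budget_ms:
            continue  # infeasible on the target device
        acc = estimate_accuracy(arch)
        if best is None or acc > best[0]:
            best = (acc, lat, arch, sched)
    return best

if __name__ == "__main__":
    random.seed(0)
    acc, lat, arch, sched = co_search()
    print(f"best: acc={acc:.3f} lat={lat:.1f}ms arch={arch} sched={sched}")
```

The exhaustive product over the two spaces is only tractable in this toy setting; the surveyed approaches replace it with search strategies such as evolutionary or reinforcement-learning-based exploration, but the joint evaluation of architecture and compiler optimization is the common thread.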