HyperTuneFaaS: A serverless framework for hyperparameter tuning in image processing models
Jiantao Zhang, Bojun Ren, Yicheng Fu, Rongbo Ma, Zinuo Cai, Weishan Zhang, Ruhui Ma, Jinshan Sun
Displays, Volume 87, Article 102990 (published 2025-02-08). DOI: 10.1016/j.displa.2025.102990
Abstract
Deep learning has achieved remarkable success across various fields, especially in image processing tasks like denoising, sharpening, and contrast enhancement. However, the performance of these models heavily relies on the careful selection of hyperparameters, which can be a computationally intensive and time-consuming task. Cloud-based hyperparameter search methods have gained popularity due to their ability to address the inefficiencies of single-machine training and the underutilization of computing resources. Nevertheless, these methods still encounter substantial challenges, including high computational demands, parallelism requirements, and prolonged search times.
In this study, we propose HyperTuneFaaS, a Function as a Service (FaaS)-based hyperparameter search framework that leverages distributed computing and asynchronous processing to tackle the issues encountered in hyperparameter search. By fully exploiting the parallelism offered by serverless computing, HyperTuneFaaS minimizes the overhead typically associated with model training on serverless platforms. Additionally, we enhance the traditional genetic algorithm, a powerful metaheuristic method, and integrate it with the framework to make hyperparameter tuning more efficient. Experimental results demonstrate significant gains in efficiency and cost savings from combining the FaaS-based hyperparameter tuning framework with the optimized genetic algorithm, making HyperTuneFaaS a powerful tool for optimizing image processing models and achieving superior image quality.
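The abstract describes the overall approach but not its implementation, so the sketch below only illustrates the general idea under stated assumptions: a genetic algorithm evolves hyperparameter candidates while each candidate's evaluation is fanned out in parallel, the way a FaaS platform would execute independent function invocations. All names here (Hyperparams, evaluate_remote, genetic_search) are hypothetical, the "remote" evaluation is a synthetic objective, and a thread pool merely stands in for serverless invocations; this is not the HyperTuneFaaS API.

```python
# Minimal sketch: genetic hyperparameter search with parallel, FaaS-style evaluation fan-out.
# Hypothetical names and a synthetic objective; a thread pool stands in for serverless calls.
import random
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass

@dataclass(frozen=True)
class Hyperparams:
    learning_rate: float
    batch_size: int

def evaluate_remote(hp: Hyperparams) -> float:
    """Stand-in for invoking a serverless function that trains the model and
    returns a validation score; here a synthetic objective is used instead."""
    return -(hp.learning_rate - 0.01) ** 2 - 0.0001 * abs(hp.batch_size - 64)

def crossover(a: Hyperparams, b: Hyperparams) -> Hyperparams:
    # Child inherits each hyperparameter from one of the two parents.
    return Hyperparams(
        learning_rate=random.choice([a.learning_rate, b.learning_rate]),
        batch_size=random.choice([a.batch_size, b.batch_size]),
    )

def mutate(hp: Hyperparams) -> Hyperparams:
    # Random perturbation keeps the search exploring the space.
    return Hyperparams(
        learning_rate=max(1e-5, hp.learning_rate * random.uniform(0.5, 1.5)),
        batch_size=max(8, int(hp.batch_size * random.uniform(0.5, 1.5))),
    )

def genetic_search(pop_size: int = 8, generations: int = 5) -> Hyperparams:
    population = [
        Hyperparams(random.uniform(1e-4, 1e-1), random.choice([16, 32, 64, 128]))
        for _ in range(pop_size)
    ]
    # max_workers=pop_size mimics launching one function invocation per candidate.
    with ThreadPoolExecutor(max_workers=pop_size) as pool:
        for _ in range(generations):
            scores = list(pool.map(evaluate_remote, population))  # parallel fan-out
            ranked = [hp for _, hp in sorted(zip(scores, population),
                                             key=lambda t: t[0], reverse=True)]
            parents = ranked[: pop_size // 2]                          # selection
            children = [mutate(crossover(*random.sample(parents, 2)))  # crossover + mutation
                        for _ in range(pop_size - len(parents))]
            population = parents + children
    return max(population, key=evaluate_remote)

if __name__ == "__main__":
    print("best hyperparameters:", genetic_search())
```

In a real serverless deployment, evaluate_remote would instead package the hyperparameters and training task into a function invocation, and the asynchronous fan-out would let the platform scale the number of concurrent evaluations with the population size.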
About the Journal
Displays is the international journal covering the research and development of display technology, its effective presentation and perception of information, and applications and systems including display-human interface.
Technical papers on practical developments in display technology provide an effective channel to promote greater understanding and cross-fertilization across the diverse disciplines of the Displays community. Original research papers solving ergonomics issues at the display-human interface advance the effective presentation of information. Tutorial papers covering fundamentals, intended for display technology and human-factors engineers new to the field, will also occasionally be featured.