{"title":"Machine Learning Hyperparameter Fine Tuning Service on Dynamic Cloud Resource Allocation System - taking Heart Sounds as an Example","authors":"Yu-Hsiang Peng, Chia-Chuan Chuang, Zhou-Jin Wu, Chia-Wei Chou, Hui-Shan Chen, Ting-Chia Chang, Yi-Lun Pan, Hsin-Tien Cheng, Chih-Chi Chung, Ken-Yu Lin","doi":"10.1145/3305275.3305280","DOIUrl":null,"url":null,"abstract":"The hyperparameters tuning of machine learning has always been a difficult and time-consuming task in deep learning area. In many practical applications, the hyperparameter tuning directly affects the accuracy. Therefore, the tuning optimization of hyperparameters is an important topic. At present, hyperparameters can only be set manually based on experience, and use Violent Enumeration, Random Search or through Grid Search to try and error, lack of effective automatic search parameters. In this study, we proposed a machine learning hyperparameter fine tuning service on dynamic cloud resource allocation system, which leverages several mainstream hyperparameter tuning methods such as Hyperopt and Optunity. In the meanwhile, various tuning methods are measured and compared by example application in this work. Finally, we dedicated actual case - Heart Sounds, and then tested it. In order to verify that the system service can not only automate the task of tuning, but also break through the limitation of the number of adjustable parameters. Furthermore the proposed hyperparameter fine tune system makes optimization process more efficient.","PeriodicalId":370976,"journal":{"name":"Proceedings of the International Symposium on Big Data and Artificial Intelligence","volume":"8 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-12-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the International Symposium on Big Data and Artificial Intelligence","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3305275.3305280","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
Hyperparameter tuning has always been a difficult and time-consuming task in machine learning and deep learning. In many practical applications, hyperparameter tuning directly affects model accuracy, so optimizing the tuning process is an important topic. At present, hyperparameters are usually set manually based on experience, and search relies on brute-force enumeration, random search, or grid search by trial and error; an effective automatic parameter search is lacking. In this study, we propose a machine learning hyperparameter fine-tuning service on a dynamic cloud resource allocation system, which leverages several mainstream hyperparameter tuning methods such as Hyperopt and Optunity. The various tuning methods are also measured and compared through an example application in this work. Finally, we apply the system to an actual case, heart sounds, and test it in order to verify that the service can not only automate the tuning task but also overcome the limit on the number of adjustable parameters. Furthermore, the proposed hyperparameter fine-tuning system makes the optimization process more efficient.
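To illustrate the kind of automated search the service builds on, the following is a minimal sketch (not the authors' code) of model-based hyperparameter optimization with Hyperopt's Tree-structured Parzen Estimator, in contrast to manual grid or random search. The objective function and search space here are hypothetical placeholders, not the paper's heart-sound classifier.

```python
# Minimal sketch of automated hyperparameter search with Hyperopt (TPE).
# The objective is an analytic stand-in; in the paper's setting it would
# train the heart-sound model and return, e.g., 1 - validation accuracy.
from hyperopt import fmin, tpe, hp, Trials

def objective(params):
    # Placeholder loss: penalize distance from an assumed "good" setting.
    lr = params["lr"]
    batch_size = params["batch_size"]
    return (lr - 0.01) ** 2 + (batch_size - 64) ** 2 * 1e-5

search_space = {
    "lr": hp.loguniform("lr", -7, 0),              # learning rate, roughly 1e-3 to 1
    "batch_size": hp.choice("batch_size", [16, 32, 64, 128]),
}

trials = Trials()
best = fmin(
    fn=objective,
    space=search_space,
    algo=tpe.suggest,     # model-based search instead of grid/random enumeration
    max_evals=50,
    trials=trials,
)
print("Best hyperparameters found:", best)
```

The advantage over grid search is that each new trial is proposed from a model of past results, so the number of tunable parameters can grow without the exhaustive combinatorial cost that the abstract identifies as a limitation of manual approaches.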