LSBO-NAS: Latent Space Bayesian Optimization for Neural Architecture Search

Xuan Rao, Songyi Xiao, Jiaxin Li, Qiuye Wu, Bo Zhao, Derong Liu
{"title":"LSBO-NAS:神经结构搜索的潜在空间贝叶斯优化","authors":"Xuan Rao, Songyi Xiao, Jiaxin Li, Qiuye Wu, Bo Zhao, Derong Liu","doi":"10.1109/ICCR55715.2022.10053904","DOIUrl":null,"url":null,"abstract":"From the perspective of data stream, neural architecture search (NAS) can be formulated as a graph optimization problem. However, many state-of-the-art black-box optimization algorithms, such as Bayesian optimization and simulated annealing, operate in continuous space primarily, which does not match the NAS optimization due to the discreteness of graph structures. To tackle this problem, the latent space Bayesian optimization NAS (LSBO-NAS) algorithm is developed in this paper. In LSBO-NAS, the neural architectures are represented as sequences, and a variational auto-encoder (VAE) is trained to convert the discrete search space of NAS into a continuous latent space by learning the continuous representation of neural architectures. Hereafter, a Bayesian optimization (BO) algorithm, i.e., the tree-structure parzen estimator (TPE) algorithm, is developed to obtain admirable neural architectures. The optimization loop of LSBO-NAS consists of two stages. In the first stage, the BO algorithm generates a preferable architecture representation according to its search strategy. In the second stage, the decoder of VAE decodes the representation into a discrete neural architecture, whose performance evaluation is regarded as the feedback signal for the BO algorithm. The effectiveness of the developed LSBO-NAS is demonstrated on the NAS-Bench-301 benchmark, where the LSBO-NAS achieves a better performance than several NAS baselines.","PeriodicalId":441511,"journal":{"name":"2022 4th International Conference on Control and Robotics (ICCR)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"LSBO-NAS: Latent Space Bayesian Optimization for Neural Architecture Search\",\"authors\":\"Xuan Rao, Songyi Xiao, Jiaxin Li, Qiuye Wu, Bo Zhao, Derong Liu\",\"doi\":\"10.1109/ICCR55715.2022.10053904\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"From the perspective of data stream, neural architecture search (NAS) can be formulated as a graph optimization problem. However, many state-of-the-art black-box optimization algorithms, such as Bayesian optimization and simulated annealing, operate in continuous space primarily, which does not match the NAS optimization due to the discreteness of graph structures. To tackle this problem, the latent space Bayesian optimization NAS (LSBO-NAS) algorithm is developed in this paper. In LSBO-NAS, the neural architectures are represented as sequences, and a variational auto-encoder (VAE) is trained to convert the discrete search space of NAS into a continuous latent space by learning the continuous representation of neural architectures. Hereafter, a Bayesian optimization (BO) algorithm, i.e., the tree-structure parzen estimator (TPE) algorithm, is developed to obtain admirable neural architectures. The optimization loop of LSBO-NAS consists of two stages. In the first stage, the BO algorithm generates a preferable architecture representation according to its search strategy. In the second stage, the decoder of VAE decodes the representation into a discrete neural architecture, whose performance evaluation is regarded as the feedback signal for the BO algorithm. 
The effectiveness of the developed LSBO-NAS is demonstrated on the NAS-Bench-301 benchmark, where the LSBO-NAS achieves a better performance than several NAS baselines.\",\"PeriodicalId\":441511,\"journal\":{\"name\":\"2022 4th International Conference on Control and Robotics (ICCR)\",\"volume\":\"15 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-12-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 4th International Conference on Control and Robotics (ICCR)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICCR55715.2022.10053904\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 4th International Conference on Control and Robotics (ICCR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCR55715.2022.10053904","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

From a data-flow perspective, neural architecture search (NAS) can be formulated as a graph optimization problem. However, many state-of-the-art black-box optimization algorithms, such as Bayesian optimization and simulated annealing, operate primarily in continuous spaces, which is a poor match for NAS because graph structures are discrete. To tackle this problem, this paper develops the latent space Bayesian optimization NAS (LSBO-NAS) algorithm. In LSBO-NAS, neural architectures are represented as sequences, and a variational auto-encoder (VAE) is trained to convert the discrete NAS search space into a continuous latent space by learning continuous representations of the architectures. A Bayesian optimization (BO) algorithm, the tree-structured Parzen estimator (TPE), is then applied to this latent space to find high-performing architectures. The optimization loop of LSBO-NAS consists of two stages. In the first stage, the BO algorithm proposes a promising architecture representation according to its search strategy. In the second stage, the VAE decoder decodes that representation into a discrete neural architecture, whose evaluated performance serves as the feedback signal for the BO algorithm. The effectiveness of LSBO-NAS is demonstrated on the NAS-Bench-301 benchmark, where it outperforms several NAS baselines.
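
To make the encode/decode step concrete, below is a minimal sketch of a sequence VAE of the kind the abstract describes. The abstract does not specify the paper's architecture encoding or network design, so the token vocabulary, GRU encoder/decoder, and all dimensions here are illustrative assumptions, not the authors' actual model.

```python
# Minimal sequence-VAE sketch. Assumptions (not from the paper): architectures
# are token sequences over a small vocabulary; GRU encoder/decoder; toy sizes.
import torch
import torch.nn as nn


class ArchVAE(nn.Module):
    def __init__(self, vocab_size=16, emb_dim=32, hidden_dim=64,
                 latent_dim=8, seq_len=20):
        super().__init__()
        self.seq_len = seq_len
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.to_mu = nn.Linear(hidden_dim, latent_dim)
        self.to_logvar = nn.Linear(hidden_dim, latent_dim)
        self.latent_to_hidden = nn.Linear(latent_dim, hidden_dim)
        self.decoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def encode(self, tokens):
        # tokens: (batch, seq_len) integer architecture sequences
        _, h = self.encoder(self.embed(tokens))
        h = h.squeeze(0)
        return self.to_mu(h), self.to_logvar(h)

    def reparameterize(self, mu, logvar):
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def decode(self, z, tokens):
        # Teacher-forced decoding during training.
        h0 = self.latent_to_hidden(z).unsqueeze(0)
        out, _ = self.decoder(self.embed(tokens), h0)
        return self.out(out)  # (batch, seq_len, vocab_size) logits

    def forward(self, tokens):
        # Standard VAE objective: reconstruction loss plus KL regularizer,
        # which is what pulls similar architectures close in latent space.
        mu, logvar = self.encode(tokens)
        z = self.reparameterize(mu, logvar)
        logits = self.decode(z, tokens)
        recon = nn.functional.cross_entropy(
            logits.reshape(-1, logits.size(-1)), tokens.reshape(-1))
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return recon + kl
```

Once trained, the encoder gives every discrete architecture a continuous coordinate, which is the property the two-stage loop relies on.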
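The two-stage search loop itself can then be sketched with an off-the-shelf TPE implementation. The snippet below uses hyperopt's TPE as a stand-in for the paper's BO component and the `ArchVAE` sketch above as the decoder; the latent bounds, evaluation budget, greedy start-token decoding, and the `evaluate_architecture` placeholder are all assumptions, and in the paper the feedback signal would come from a NAS-Bench-301 performance query rather than the random stub here.

```python
# Two-stage LSBO loop sketch: TPE proposes a latent vector (stage 1); the VAE
# decoder turns it into a discrete architecture that gets evaluated (stage 2).
# hyperopt's TPE stands in for the paper's BO; bounds, budget, start-token id,
# and evaluate_architecture are illustrative assumptions.
import torch
from hyperopt import fmin, tpe, hp, Trials

LATENT_DIM = 8
vae = ArchVAE(latent_dim=LATENT_DIM)  # assumed trained offline on sequences
vae.eval()


def decode_architecture(z):
    # Greedy step-by-step decoding of a latent vector into a token sequence
    # (assumed start-token id 0).
    with torch.no_grad():
        h = vae.latent_to_hidden(z.unsqueeze(0)).unsqueeze(0)
        token = torch.zeros(1, 1, dtype=torch.long)
        sequence = []
        for _ in range(vae.seq_len):
            out, h = vae.decoder(vae.embed(token), h)
            token = vae.out(out).argmax(dim=-1)
            sequence.append(token.item())
    return sequence


def evaluate_architecture(sequence):
    # Placeholder score so the sketch runs end to end; in the paper this is
    # a NAS-Bench-301 performance query on the decoded architecture.
    return float(torch.rand(1))


def objective(z_values):
    z = torch.tensor(z_values, dtype=torch.float32)
    sequence = decode_architecture(z)           # stage 2: latent -> architecture
    accuracy = evaluate_architecture(sequence)  # feedback signal for the BO
    return -accuracy                            # hyperopt minimizes


# Stage 1: TPE searches the continuous latent space directly.
space = [hp.uniform(f"z{i}", -3.0, 3.0) for i in range(LATENT_DIM)]
trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=200, trials=trials)
```

The design point the abstract emphasizes is visible here: TPE never touches graph structures; it only sees a continuous vector, and all discreteness is confined to the decoder.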