EDNAS: An Efficient Neural Architecture Design based on Distribution Estimation
Zhenyao Zhao, Guangbin Zhang, Min Jiang, Liang Feng, K. Tan
2020 2nd International Conference on Industrial Artificial Intelligence (IAI), October 23, 2020. DOI: 10.1109/IAI50351.2020.9262190
Neural architecture search (NAS) is the process of automatically searching for the best-performing neural model on a given task. Designing a neural model by hand costs experts a great deal of time; NAS's automated process effectively addresses this problem and makes neural networks easier to adopt. Although NAS has achieved excellent performance, its search process is still very time-consuming. In this paper, we propose EDNAS, a neural architecture design method based on distribution estimation that offers a fast and economical way to design neural architectures automatically. In EDNAS, we assume that the best-performing architectures obey a certain probability distribution over the search space, so NAS can be transformed into the problem of learning this distribution. We construct a probability model on the search space and search for the target distribution by iteratively updating this model. Finally, an architecture that maximizes performance on a validation set is generated from the learned distribution. Experiments show the efficiency of our method: on the CIFAR-10 dataset, EDNAS discovers a novel architecture in just 4 hours with 2.89% test error, demonstrating efficient and strong performance.
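The abstract describes the search procedure only at a high level. As a rough illustration of the estimation-of-distribution idea it outlines (a per-decision probability model over the search space, iteratively re-fit to the best sampled architectures, with the final architecture drawn from the learned distribution), here is a minimal toy sketch. Everything in it is an assumption made for illustration: the categorical encoding of architectures, the hyperparameters, and the stand-in evaluate function are hypothetical and are not taken from the paper.

```python
import numpy as np

# Hypothetical toy search space: an architecture is a sequence of
# NUM_LAYERS choices, each drawn from NUM_OPS candidate operations.
NUM_LAYERS, NUM_OPS = 8, 5
POP_SIZE, ELITE, ITERS = 50, 10, 30

rng = np.random.default_rng(0)

def evaluate(arch):
    """Stand-in for validation accuracy. The real method would train
    (or cheaply estimate) each sampled architecture; this toy scorer
    just rewards matching an arbitrary fixed target pattern."""
    target = np.arange(NUM_LAYERS) % NUM_OPS
    return float((arch == target).mean())

# Probability model: one categorical distribution per layer position,
# initialized uniform over the candidate operations.
probs = np.full((NUM_LAYERS, NUM_OPS), 1.0 / NUM_OPS)

for _ in range(ITERS):
    # Sample a population of architectures from the current model.
    pop = np.stack([
        [rng.choice(NUM_OPS, p=probs[l]) for l in range(NUM_LAYERS)]
        for _ in range(POP_SIZE)
    ])
    scores = np.array([evaluate(a) for a in pop])
    # Re-estimate the model from the best-scoring architectures,
    # with smoothing so no operation's probability collapses to zero.
    elite = pop[np.argsort(scores)[-ELITE:]]
    for l in range(NUM_LAYERS):
        counts = np.bincount(elite[:, l], minlength=NUM_OPS)
        probs[l] = 0.9 * counts / ELITE + 0.1 / NUM_OPS

# Final architecture: the mode of the learned distribution.
best_arch = probs.argmax(axis=1)
print(best_arch, evaluate(best_arch))
```

In an actual NAS setting, the evaluate call would involve training or cheaply estimating each candidate's validation accuracy, which is what would dominate a search budget like the 4 GPU-hours reported in the abstract.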