Optimisation of deep neural network model using Reptile meta learning approach

Uday Kulkarni, Meena S M, Raghavendra A Hallyal, Prasanna H Sulibhavi, Sunil V. G, Shankru Guggari, Akshay R. Shanbhag

Cognitive Computation and Systems (JCR Q4, Computer Science, Artificial Intelligence; IF 1.2), published 2023-12-15. DOI: 10.1049/ccs2.12096
Citation count: 0
Abstract
Artificial intelligence (AI) has developed rapidly over the last decade and can now simulate human thinking in a variety of situations. Deep neural networks (DNNs) can deliver excellent results when trained on huge datasets with ample computational resources, but their learning process is complicated and time-consuming. In data-scarce circumstances, such algorithms cannot learn tasks quickly or perform close to human intelligence. Advances in deep meta-learning across several research studies have addressed this problem. Meta-learning has a broad range of applications in which meta-data (data about data) of previously trained tasks, datasets, or models can be employed to optimise learning. To give an insight into the existing meta-learning approaches for DNN model optimisation, the authors conducted a survey introducing different meta-learning techniques as well as current optimisation-based approaches, their merits, and open challenges. For the experiment, the Reptile meta-learning algorithm was chosen: because Reptile uses only first-order derivatives during the optimisation process, it is feasible to apply to practical optimisation problems. The authors achieved a 5% increase in accuracy with the proposed version of the Reptile meta-learning algorithm.
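To make the first-order property concrete, the following is a minimal sketch of the standard Reptile outer loop (sample a task, adapt the weights with plain SGD, then move the shared initialisation toward the adapted weights). The toy task family (sine-wave regression), the random-feature model, and all step sizes are illustrative assumptions, not details taken from the paper or its proposed variant.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny fixed random-feature model (illustrative): y_hat = tanh(x*F + S) @ w
F = rng.normal(size=16)
S = rng.normal(size=16)

def features(x):
    return np.tanh(np.outer(x, F) + S)            # shape (n, 16)

def mse(w, x, y):
    return 0.5 * np.mean((features(x) @ w - y) ** 2)

def grad(w, x, y):
    # First-order gradient of the MSE -- the only derivative Reptile needs.
    feats = features(x)
    return feats.T @ (feats @ w - y) / len(x)

def sample_task():
    # A task = regressing sin(x + phase); the phase varies across tasks.
    phase = rng.uniform(0, np.pi)
    x = rng.uniform(-3, 3, size=20)
    return x, np.sin(x + phase)

def inner_sgd(w, x, y, lr=0.05, steps=10):
    # Task-specific adaptation: a few ordinary SGD steps.
    for _ in range(steps):
        w = w - lr * grad(w, x, y)
    return w

# Reptile outer loop: no second-order terms are ever computed;
# the meta-update just interpolates toward the adapted weights.
w_meta = np.zeros(16)
meta_lr = 0.1
for _ in range(200):
    x, y = sample_task()
    w_task = inner_sgd(w_meta.copy(), x, y)
    w_meta += meta_lr * (w_task - w_meta)
```

After meta-training, `w_meta` serves as an initialisation from which a few inner SGD steps adapt to a freshly sampled task, which is the behaviour the survey's optimisation-based approaches aim for.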