Meta-learning with Hopfield Neural Network
Sambhavi Tiwari, Manas Gogoi, S. Verma, Krishna Pratap Singh
2022 IEEE 9th Uttar Pradesh Section International Conference on Electrical, Electronics and Computer Engineering (UPCON), published 2022-12-02
DOI: 10.1109/UPCON56432.2022.9986399
Citations: 2
Abstract
In this paper, we propose a novel meta-learning method that leverages the advantages of both meta-learning and external storage. In meta-learning, the neural network learns parameters distributed across multiple tasks, enabling quick adaptation to unseen meta-testing tasks. In model-based meta-learning methods, an external memory module retains important parameters from one task to the next, which is what makes meta-learning possible. The model proposed in this work consists of a long short-term memory (LSTM) neural network paired with an external memory network known as a Hopfield neural network. A Hopfield neural network is a single-layer, non-linear, auto-associative model that serves as the external memory. Unlike previous methods, our proposed model, $LSTM_{HAM}$ (long short-term memory with Hopfield associative memory), focuses on storing knowledge: it uses an additional memory network to store and retrieve patterns through different location-based access mechanisms. Our model extends the capabilities of the LSTM and performs meta-learning best on the 5-way 10-shot task setting, with an average accuracy of approximately 60 percent.
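To make the associative-memory component concrete, below is a minimal sketch of a classical Hopfield network used as content-addressable storage: patterns are written with a Hebbian outer-product rule and recalled by iterating sign updates from a corrupted probe. This illustrates the general store/retrieve mechanism the abstract refers to; it is not the authors' $LSTM_{HAM}$ implementation, and the function names and sizes here are illustrative assumptions.

```python
import numpy as np

def hopfield_store(patterns):
    """Build a Hopfield weight matrix from bipolar (+1/-1) patterns
    using the Hebbian outer-product rule (illustrative, not the paper's code)."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)       # Hebbian accumulation
    np.fill_diagonal(W, 0)        # no self-connections
    return W / len(patterns)

def hopfield_retrieve(W, probe, max_steps=10):
    """Recall a stored pattern from a (possibly corrupted) probe by
    repeated synchronous sign updates until a fixed point is reached."""
    s = probe.copy()
    for _ in range(max_steps):
        new = np.sign(W @ s)
        new[new == 0] = 1         # break ties toward +1
        if np.array_equal(new, s):
            break                 # converged to an attractor
        s = new
    return s

# Usage: store one 8-bit pattern, flip one bit, and recover the original.
patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1]])
W = hopfield_store(patterns)
probe = patterns[0].copy()
probe[0] = -probe[0]              # corrupt one component
recalled = hopfield_retrieve(W, probe)
```

In the paper's setting, such a memory sits alongside the LSTM so that task-relevant patterns can be written during meta-training and recalled at meta-test time, rather than being held only in the recurrent state.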