{"title":"Weight Initialization based Partial Training Algorithm for Fast Learning in Neural Network","authors":"Jung-Jae Kim, Min-Woo Ryu, S. Cha, Kuk-Hyun Cho","doi":"10.14257/ijdta.2017.10.8.03","DOIUrl":null,"url":null,"abstract":"The classification problem is one of most important problems in Artificial Intelligence (AI) Research. Classification is used in various fields such as speech recognition, image classification, word prediction in text. Deep Neural Network (DNN) is the most commonly used for the classification. However, DNN requires a lot of learning time because of its deep network structure and lots of data. At this time, if a new feature or a new category class (new data) is added, the existing data on which learning has been completed is also re-learned. And the same learning time (very long time) as the previous learning time is needed. Therefore, in this paper, we proposes Weight Initialization-based Partial Training (WIPT) algorithm, that decompose the existing weight matrix through Singular Value Decomposition (SVD) and generate a latent matrix with information learned by the existing model. In order to increase the learning efficiency, we use a strategy of learning new features or classes by initializing newly added weights to appropriate values. Finally we verify the efficiency of the proposed algorithm by comparing it with the existing whole learning.","PeriodicalId":13926,"journal":{"name":"International journal of database theory and application","volume":"75 1","pages":"21-30"},"PeriodicalIF":0.0000,"publicationDate":"2017-08-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International journal of database theory and application","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.14257/ijdta.2017.10.8.03","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
The classification problem is one of the most important problems in Artificial Intelligence (AI) research. Classification is used in various fields such as speech recognition, image classification, and word prediction in text. Deep Neural Networks (DNNs) are the models most commonly used for classification. However, a DNN requires a long training time because of its deep network structure and the large amount of data involved. Moreover, when a new feature or a new category class (new data) is added, the existing data on which training has already been completed must be re-learned, which takes roughly as long as the original training. Therefore, in this paper we propose the Weight Initialization-based Partial Training (WIPT) algorithm, which decomposes the existing weight matrix through Singular Value Decomposition (SVD) and generates a latent matrix containing the information learned by the existing model. To increase learning efficiency, we adopt a strategy of learning new features or classes by initializing the newly added weights to appropriate values derived from this latent matrix. Finally, we verify the efficiency of the proposed algorithm by comparing it with conventional whole-network retraining.
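The abstract does not spell out how the SVD factors are turned into initial values for the newly added weights, so the following is only a minimal sketch of the general idea in NumPy. The function name wipt_init, the choice to expand the output dimension of a single weight matrix, and the rule of drawing new columns as small random combinations of the retained singular vectors are all assumptions made for illustration, not the authors' exact procedure.

```python
# Sketch of SVD-based weight initialization for adding new output units
# (e.g., new classes) to an already trained layer, then partially training
# only the added weights. The initialization rule below is an assumption.
import numpy as np

def wipt_init(W_old, n_new_out, rank=None, scale=1e-2, seed=0):
    """Expand a trained weight matrix W_old (n_in x n_out_old) with n_new_out
    extra output columns, initialized from the latent space obtained by SVD
    of the existing weights."""
    rng = np.random.default_rng(seed)
    # W_old = U @ diag(S) @ Vt; U spans the learned input-side latent space.
    U, S, Vt = np.linalg.svd(W_old, full_matrices=False)
    if rank is not None:                     # optionally keep only the top-r factors
        U, S = U[:, :rank], S[:rank]
    # New columns: small random combinations of the retained singular vectors,
    # scaled by the singular values so their magnitude matches the old weights.
    coeffs = rng.standard_normal((U.shape[1], n_new_out)) * scale
    W_new_cols = U @ (S[:, None] * coeffs)
    return np.concatenate([W_old, W_new_cols], axis=1)

# Usage: extend a 128x10 classifier head with 2 new classes; in partial
# training, only the appended columns would then be updated while the
# original weights stay frozen or are fine-tuned briefly.
W_old = np.random.randn(128, 10) * 0.1       # stand-in for a trained weight matrix
W = wipt_init(W_old, n_new_out=2, rank=8)
print(W.shape)                               # (128, 12)
```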