{"title":"Using transfer adaptation method for dynamic features expansion in multi-label deep neural network for recommender systems","authors":"F. Abdullayeva, Suleyman Suleymanzade","doi":"10.19139/soic-2310-5070-1836","DOIUrl":null,"url":null,"abstract":"In this paper, we propose to use a convertible deep neural network (DNN) model with a transfer adaptation mechanism to deal with varying input and output numbers of neurons. The flexible DNN model serves as a multi-label classifier for the recommender system as part of the retrieval systems’ push mechanism, which learns the combination of tabular features and proposes the number of discrete offers (targets). Our retrieval system uses the transfer adaptation, mechanism, when the number of features changes, it replaces the input layer of the neural network then freezes all gradients on the following layers, trains only replaced layer, and unfreezes the entire model. The experiments show that using the transfer adaptation technique impacts stable loss decreasing and learning speed during the training process. \n \n","PeriodicalId":131002,"journal":{"name":"Statistics, Optimization & Information Computing","volume":"143 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-02-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Statistics, Optimization & Information Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.19139/soic-2310-5070-1836","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
In this paper, we propose a convertible deep neural network (DNN) model with a transfer adaptation mechanism to handle varying numbers of input and output neurons. The flexible DNN model serves as a multi-label classifier for a recommender system, acting as part of the retrieval system's push mechanism: it learns from combinations of tabular features and proposes a set of discrete offers (targets). Our retrieval system applies the transfer adaptation mechanism as follows: when the number of features changes, it replaces the input layer of the neural network, freezes the gradients of all subsequent layers, trains only the replaced layer, and then unfreezes the entire model. The experiments show that the transfer adaptation technique yields a stable decrease in loss and improves learning speed during training.
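To make the replace-freeze-train-unfreeze procedure concrete, the sketch below shows one way it could be implemented for a feed-forward multi-label classifier in PyTorch. This is a minimal illustration under stated assumptions, not the authors' implementation: the class `TabularNet` and the helpers `adapt_input_layer` and `unfreeze` are hypothetical names, and the hidden sizes, learning rates, and loss are placeholders.

```python
# Hedged sketch of the transfer adaptation step described in the abstract,
# assuming a PyTorch feed-forward multi-label classifier. All names
# (TabularNet, adapt_input_layer, unfreeze) are illustrative.
import torch
import torch.nn as nn


class TabularNet(nn.Module):
    """Simple multi-label classifier over tabular features."""

    def __init__(self, n_features: int, n_targets: int, hidden: int = 128):
        super().__init__()
        self.input_layer = nn.Linear(n_features, hidden)
        self.body = nn.Sequential(
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_targets),  # one logit per discrete offer
        )

    def forward(self, x):
        return self.body(self.input_layer(x))


def adapt_input_layer(model: TabularNet, new_n_features: int) -> None:
    """Replace the input layer when the feature count changes and freeze
    the remaining layers so only the new layer receives gradients."""
    hidden = model.input_layer.out_features
    model.input_layer = nn.Linear(new_n_features, hidden)
    for p in model.body.parameters():
        p.requires_grad_(False)


def unfreeze(model: TabularNet) -> None:
    """After the new input layer has been fitted, unfreeze the full model."""
    for p in model.parameters():
        p.requires_grad_(True)


# Usage: adapt to the new feature count, briefly train only the new layer,
# then unfreeze and fine-tune the whole network at a lower learning rate.
model = TabularNet(n_features=20, n_targets=5)
adapt_input_layer(model, new_n_features=27)
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
# ... train a few epochs with `optimizer` and a multi-label BCE loss ...
unfreeze(model)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # full fine-tuning
```

In this reading, only the freshly initialized input layer is updated at first, which keeps the previously learned representation intact while the network adapts to the new feature dimensionality; the subsequent full unfreeze then fine-tunes all layers jointly.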