Item based recommendation using matrix-factorization-like embeddings from deep networks
Vaidyanath Areyur Shanthakumar, Clark Barnett, Keith Warnick, P. A. Sudyanti, Vitalii Gerbuz, Tathagata Mukherjee
Proceedings of the 2021 ACM Southeast Conference, published 2021-04-15
DOI: 10.1145/3409334.3452041
Citations: 1
Abstract
In this paper we describe a method for computing item-based recommendations using matrix-factorization-like embeddings of the items computed with a neural network. Matrix factorization (MF) computes near-optimal item embeddings by minimizing a loss that measures the discrepancy between the predicted and known values of a sparse user-item rating matrix. Though useful for recommendation tasks, MF is computationally intensive and hard to compute for large sets of users and items. Hence there is a need to compute MF-like embeddings using less computationally intensive methods, so that they can be substituted for the actual ones. In this work we explore the possibility of doing so with a deep neural network (DNN). Our network is trained to learn matrix-factorization-like embeddings from easy-to-compute natural language processing (NLP) based semantic embeddings. The resulting MF-like embeddings are used to compute recommendations on an anonymized user product engagement dataset from the online retail company Overstock.com. We present the results of using our embeddings for computing recommendations with the Overstock.com production dataset, consisting of ~3.5 million items and ~6 million users. Recommendations from Overstock.com's own recommendation system are compared against those obtained using our MF-like embeddings, by comparing the results from both against the ground truth, which in our case is actual user co-click data. Our results show that it is possible to use DNNs to efficiently compute MF-like embeddings, which can then be used in conjunction with the NLP-based embeddings to improve the recommendations obtained from the NLP-based embeddings alone.
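The two-stage idea in the abstract can be illustrated with a minimal NumPy sketch: first factorize a toy sparse rating matrix by gradient descent on the observed entries (the expensive MF step), then fit a cheap map from item feature vectors to the MF item embeddings. The random feature vectors here stand in for the paper's NLP-based semantic embeddings, and the linear least-squares map stands in for the paper's deep network; all names and sizes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sparse user-item rating matrix; 0 marks an unknown rating.
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)
mask = R > 0        # only observed entries contribute to the loss
k = 2               # embedding dimension

# --- Stage 1: matrix factorization ---------------------------------
# Minimize the squared error on known ratings (plus L2 regularization)
# by gradient descent on user factors U and item factors V.
U = 0.1 * rng.standard_normal((R.shape[0], k))
V = 0.1 * rng.standard_normal((R.shape[1], k))
lr, reg = 0.01, 0.02
for _ in range(2000):
    E = mask * (R - U @ V.T)          # residual on observed entries only
    U += lr * (E @ V - reg * U)
    V += lr * (E.T @ U - reg * V)

mf_item_emb = V                        # the "expensive" MF item embeddings

# --- Stage 2: learn MF-like embeddings from cheap features ---------
# Hypothetical stand-in features playing the role of NLP semantic
# embeddings; a least-squares map replaces the DNN for brevity.
nlp_emb = rng.standard_normal((R.shape[1], 8))
W, *_ = np.linalg.lstsq(nlp_emb, mf_item_emb, rcond=None)
mf_like_emb = nlp_emb @ W              # MF-like embeddings from features

# Recommendations would then use item-item similarity of mf_like_emb
# instead of recomputing the factorization for every catalog update.
sim = mf_like_emb @ mf_like_emb.T
```

In this toy setting the least-squares map recovers the MF embeddings almost exactly; with millions of items, as in the Overstock.com dataset, the paper's DNN plays the same role at scale, avoiding a full refactorization.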