Bi-directional Encoder Representation of Transformer model for Sequential Music Recommender System

Naina Yadav, Anil Kumar Singh
{"title":"Bi-directional Encoder Representation of Transformer model for Sequential Music Recommender System","authors":"Naina Yadav, Anil Kumar Singh","doi":"10.1145/3441501.3441503","DOIUrl":null,"url":null,"abstract":"A recommendation system is a set of programs that utilize different methodologies for relevant item selection for the user. In recent years deep neural networks have been used heavily for improving recommendation quality in every domain. We describe a model for music recommendation system that uses the BERT (Bidirectional Encoder Representations from Transformers) model. In the past, other deep neural networks have been used for music recommendation, which capture the the unidirectional sequential nature of a user’s data. Unlike other sequential techniques of recommendation, BERT uses bidirectional training of a user’s sequence for better recommendation. BERT uses the encoder part of the Transformer model, which uses an attention mechanism to learn contextual relations between a user’s past interactions. The proposed model relies on a user’s previous interaction to determine the bidirectional encoding for the model, which considers both the left and the right contexts. We evaluated our model with a baseline deep sequential model using two different datasets, and comparative results show that the model outperforms other sequential models.","PeriodicalId":415985,"journal":{"name":"Proceedings of the 12th Annual Meeting of the Forum for Information Retrieval Evaluation","volume":"23 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 12th Annual Meeting of the Forum for Information Retrieval Evaluation","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3441501.3441503","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 6

Abstract

A recommendation system is a set of programs that uses different methodologies to select relevant items for a user. In recent years, deep neural networks have been used heavily to improve recommendation quality across domains. We describe a music recommendation model that uses BERT (Bidirectional Encoder Representations from Transformers). Previously, other deep neural networks applied to music recommendation captured only the unidirectional sequential nature of a user's data. Unlike these sequential recommendation techniques, BERT trains bidirectionally over a user's interaction sequence for better recommendation. BERT uses the encoder part of the Transformer model, which relies on an attention mechanism to learn contextual relations among a user's past interactions. The proposed model uses a user's previous interactions to determine a bidirectional encoding that considers both the left and the right contexts. We evaluated our model against a baseline deep sequential model on two different datasets, and comparative results show that our model outperforms the other sequential models.
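To make the architecture described above concrete, the following is a minimal sketch (not the authors' implementation) of a BERT-style sequential recommender: a bidirectional Transformer encoder over a user's interaction sequence, trained with masked-item prediction so that each position attends to both its left and right context. All names, layer sizes, and the [PAD]/[MASK] id conventions are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of a BERT-style sequential recommender (PyTorch).
# Item ids: 0 = [PAD], 1 = [MASK], 2..num_items+1 = real items (assumed convention).
import torch
import torch.nn as nn

class BertStyleRecommender(nn.Module):
    def __init__(self, num_items, max_len=50, d_model=64, n_heads=2, n_layers=2):
        super().__init__()
        self.item_emb = nn.Embedding(num_items + 2, d_model, padding_idx=0)
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=4 * d_model,
            batch_first=True)
        # Bidirectional self-attention: no causal mask, so every position
        # sees both earlier and later interactions in the sequence.
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.out = nn.Linear(d_model, num_items + 2)  # score every item id

    def forward(self, seq):
        # seq: (batch, max_len) of item ids
        positions = torch.arange(seq.size(1), device=seq.device)
        x = self.item_emb(seq) + self.pos_emb(positions)
        x = self.encoder(x, src_key_padding_mask=(seq == 0))
        return self.out(x)  # (batch, max_len, num_items + 2)

# Toy usage: mask the last item of each listening history and predict it.
model = BertStyleRecommender(num_items=1000)
history = torch.randint(2, 1002, (4, 50))   # 4 users, 50 tracks each
masked = history.clone()
masked[:, -1] = 1                           # replace last item with [MASK]
logits = model(masked)
loss = nn.functional.cross_entropy(logits[:, -1, :], history[:, -1])
loss.backward()
```

At inference time, the same masking trick can be used to recommend the next track: append a [MASK] token to the user's history and rank items by the logits at that position.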