DeepSet: Deep Learning-based Recommendation with Setwise Preference

Lin Li, Weike Pan, Guanliang Chen, Zhong Ming
{"title":"DeepSet: Deep Learning-based Recommendation with Setwise Preference","authors":"Lin Li, Weike Pan, Guanliang Chen, Zhong Ming","doi":"10.1109/IJCNN55064.2022.9892627","DOIUrl":null,"url":null,"abstract":"Recommendation methods based on deep learning frameworks have drastically increased over recent years, covering virtually all the sub-topics in recommender systems. Among these topics, one-class collaborative filtering (OCCF) as a fundamental problem has been studied most extensively. However, most of existing deep learning-based OCCF methods are essentially focused on either defining new prediction rules by replacing conventional shallow and linear inner products with a variety of neural architectures, or learning more expressive user and item factors with neural networks, which may still suffer from the inferior recommendation performance due to the underlying preference assumptions typically defined on single items. In this paper, we propose to address the limitation and justify the capacity of deep learning-based recommendation methods by adapting the setwise preference to the underlying assumption during the model learning process. Specifically, we propose a new setwise preference assumption under the neural recommendation frameworks and devise a general solution named DeepSet, which aims to enhance the learning abilities of neural collaborative filtering methods by activating the setwise preference at different neural layers, namely 1) the feature input layer, 2) the feature output layer, and 3) the prediction layer. 
Extensive experiments on four commonly used datasets show that our solution can effectively boost the performance of existing deep learning based methods without introducing any new model parameters.","PeriodicalId":106974,"journal":{"name":"2022 International Joint Conference on Neural Networks (IJCNN)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2022-07-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 International Joint Conference on Neural Networks (IJCNN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN55064.2022.9892627","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Recommendation methods based on deep learning frameworks have increased drastically in recent years, covering virtually all the sub-topics in recommender systems. Among these topics, one-class collaborative filtering (OCCF), as a fundamental problem, has been studied most extensively. However, most existing deep learning-based OCCF methods focus on either defining new prediction rules by replacing the conventional shallow and linear inner product with a variety of neural architectures, or learning more expressive user and item factors with neural networks. Such methods may still suffer from inferior recommendation performance because their underlying preference assumptions are typically defined on single items. In this paper, we address this limitation and justify the capacity of deep learning-based recommendation methods by adopting setwise preference as the underlying assumption during model learning. Specifically, we propose a new setwise preference assumption for neural recommendation frameworks and devise a general solution named DeepSet, which enhances the learning abilities of neural collaborative filtering methods by activating setwise preference at different neural layers, namely 1) the feature input layer, 2) the feature output layer, and 3) the prediction layer. Extensive experiments on four commonly used datasets show that our solution can effectively boost the performance of existing deep learning-based methods without introducing any new model parameters.
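The abstract describes activating setwise preference at the feature input, feature output, and prediction layers without adding model parameters. The paper itself gives the full formulation; as a rough, hypothetical illustration only (not the authors' actual method), the sketch below shows what a parameter-free setwise signal can look like: a set of interacted-item embeddings is mean-pooled into a single pseudo-item at the input layer, and a BPR-style setwise loss prefers the pooled set over a sampled negative item. The embedding dimension, mean pooling, and all variable names are assumptions made for this illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # illustrative embedding dimension

user = rng.normal(size=d)            # user embedding
pos_set = rng.normal(size=(3, d))    # embeddings of a SET of items the user interacted with
neg_item = rng.normal(size=d)        # embedding of one sampled negative item

# Setwise preference at the feature input layer: aggregate the item set into
# one pseudo-item embedding. Mean pooling adds no new model parameters.
set_emb = pos_set.mean(axis=0)

# Prediction layer: score the user against the pooled set and the negative item.
pos_score = user @ set_emb
neg_score = user @ neg_item

# Setwise BPR-style loss: the set as a whole should outscore the negative item.
loss = -np.log(1.0 / (1.0 + np.exp(-(pos_score - neg_score))))
print(float(loss))
```

Because the pooling is parameter-free, this kind of setwise signal can be attached to an existing neural CF model without enlarging it, which is consistent with the abstract's claim of introducing no new model parameters.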