Challenges in Translating Research to Practice for Evaluating Fairness and Bias in Recommendation Systems

Lex Beattie, D. Taber, H. Cramer
DOI: 10.1145/3523227.3547403
Published: 2022-09-18, Proceedings of the 16th ACM Conference on Recommender Systems
Citations: 4

Abstract

Calls to action to implement evaluation of fairness and bias in industry systems are increasing at a rapid rate. The research community has attempted to meet these demands by producing ethical principles and guidelines for AI, but few of these documents provide guidance on how to implement these principles in real-world settings. Without readily available, standardized, and practice-tested approaches for evaluating fairness in recommendation systems, industry practitioners, who are often not experts, may easily run into challenges or implement metrics that are poorly suited to their specific applications. When evaluating recommendations, practitioners are well aware that they should evaluate their systems for unintended algorithmic harms, but the most important and unanswered question is: how? In this talk, we will present practical challenges we encountered in addressing algorithmic responsibility in recommendation systems, which also present research opportunities for the RecSys community. This talk will focus on the steps that need to happen before bias mitigation can even begin.
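The talk itself does not prescribe a metric, and the abstract's point is precisely that choosing one is hard. Purely as an illustration of the kind of evaluation a practitioner might reach for, here is a minimal sketch of one common family of fairness measures for ranked recommendations: position-weighted exposure per provider group, compared against a target distribution. All names here (`exposure_by_group`, `max_disparity`, the DCG-style discount, the 50/50 target) are illustrative assumptions, not something proposed in the talk.

```python
import math
from collections import defaultdict

def exposure_by_group(ranked_items, group_of):
    """Share of position-weighted exposure received by each provider group.

    Uses the DCG-style discount 1/log2(rank + 1), so higher-ranked items
    contribute more exposure. `group_of` maps item -> provider group.
    """
    exposure = defaultdict(float)
    for rank, item in enumerate(ranked_items, start=1):
        exposure[group_of[item]] += 1.0 / math.log2(rank + 1)
    total = sum(exposure.values())
    return {g: e / total for g, e in exposure.items()}

def max_disparity(shares, target):
    """Largest absolute gap between observed and target exposure shares."""
    groups = set(shares) | set(target)
    return max(abs(shares.get(g, 0.0) - target.get(g, 0.0)) for g in groups)

# Illustrative usage: two groups, group X holds both top slots.
ranked = ["a", "b", "c", "d"]
groups = {"a": "X", "b": "X", "c": "Y", "d": "Y"}
shares = exposure_by_group(ranked, groups)
gap = max_disparity(shares, {"X": 0.5, "Y": 0.5})
```

Even this tiny sketch surfaces the kinds of upstream decisions the talk highlights: which groups to compare, what exposure model to assume, and what target distribution counts as "fair" are all application-specific choices that have to be settled before any mitigation can begin.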