Balancing the Tradeoff between Profit and Fairness in Rideshare Platforms during High-Demand Hours

Vedant Nanda, Pan Xu, Karthik Abinav Sankararaman, John P. Dickerson, A. Srinivasan
{"title":"Balancing the Tradeoff between Profit and Fairness in Rideshare Platforms during High-Demand Hours","authors":"Vedant Nanda, Pan Xu, Karthik Abinav Sankararaman, John P. Dickerson, A. Srinivasan","doi":"10.1145/3375627.3375818","DOIUrl":null,"url":null,"abstract":"Rideshare platforms, when assigning requests to drivers, tend to maximize profit for the system and/or minimize waiting time for riders. Such platforms can exacerbate biases that drivers may have over certain types of requests. We consider the case of peak hours when the demand for rides is more than the supply of drivers. Drivers are well aware of their advantage during the peak hours and can choose to be selective about which rides to accept. Moreover, if in such a scenario, the assignment of requests to drivers (by the platform) is made only to maximize profit and/or minimize wait time for riders, requests of a certain type (e.g., from a non-popular pickup location, or to a non-popular drop-off location) might never be assigned to a driver. Such a system can be highly unfair to riders. However, increasing fairness might come at a cost of the overall profit made by the rideshare platform. To balance these conflicting goals, we present a flexible, non-adaptive algorithm, NAdap, that allows the platform designer to control the profit and fairness of the system via parameters α and β respectively.We model the matching problem as an online bipartite matching where the set of drivers is offline and requests arrive online. Upon the arrival of a request, we use NAdap to assign it to a driver (the driver might then choose to accept or reject it) or reject the request. We formalize the measures of profit and fairness in our setting and show that by using NAdap, the competitive ratios for profit and fairness measures would be no worse than α/e and β/e respectively. Extensive experimental results on both real-world and synthetic datasets confirm the validity of our theoretical lower bounds. 
Additionally, they show that NAdap under some choice of (α, β) can beat two natural heuristics, Greedy and Uniform, on both fairness and profit. Code is available at: https://github.com/nvedant07/rideshare-fairness-peak/. Full paper can be found in the proceedings of AAAI 2020 and on ArXiv: http://arxiv.org/abs/1912.08388).","PeriodicalId":93612,"journal":{"name":"Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society","volume":"24 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2019-12-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"43","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3375627.3375818","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 43

Abstract

Rideshare platforms, when assigning requests to drivers, tend to maximize profit for the system and/or minimize waiting time for riders. Such platforms can exacerbate biases that drivers may have over certain types of requests. We consider the case of peak hours, when the demand for rides exceeds the supply of drivers. Drivers are well aware of their advantage during peak hours and can be selective about which rides they accept. Moreover, if in such a scenario the platform assigns requests to drivers only to maximize profit and/or minimize rider wait time, requests of a certain type (e.g., from an unpopular pickup location, or to an unpopular drop-off location) might never be assigned to a driver. Such a system can be highly unfair to riders. However, increasing fairness might come at the cost of the overall profit made by the rideshare platform. To balance these conflicting goals, we present a flexible, non-adaptive algorithm, NAdap, that allows the platform designer to control the profit and fairness of the system via parameters α and β, respectively. We model the matching problem as an online bipartite matching problem in which the set of drivers is offline and requests arrive online. Upon the arrival of a request, we use NAdap to assign it to a driver (who may then accept or reject it) or to reject the request. We formalize measures of profit and fairness in our setting and show that, using NAdap, the competitive ratios for the profit and fairness measures are no worse than α/e and β/e, respectively. Extensive experimental results on both real-world and synthetic datasets confirm the validity of our theoretical lower bounds. Additionally, they show that NAdap, under some choice of (α, β), can beat two natural heuristics, Greedy and Uniform, on both fairness and profit. Code is available at: https://github.com/nvedant07/rideshare-fairness-peak/.

The full paper can be found in the proceedings of AAAI 2020 and on arXiv: http://arxiv.org/abs/1912.08388.
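To make the online bipartite matching setting concrete, below is a minimal sketch of the two baseline heuristics the abstract names, Greedy and Uniform, under one plausible reading (the abstract does not define them precisely, and this is not the paper's NAdap algorithm): drivers are offline, each request arrives online and is either matched to a free compatible driver or rejected, and each driver serves at most one request. Greedy picks the free driver with the highest edge profit; Uniform picks uniformly at random among free compatible drivers. All names and the profit-dictionary encoding are illustrative assumptions.

```python
import random


def greedy_match(drivers, requests, profit):
    """Assign each arriving request to the free compatible driver with the
    highest edge profit; reject the request if no such driver exists.

    `profit` maps (driver, request) pairs to positive edge profits; absent
    pairs are treated as incompatible. This is an illustrative baseline,
    not the paper's NAdap algorithm.
    """
    free = set(drivers)
    matching, total = {}, 0.0
    for r in requests:  # requests arrive one by one (online)
        candidates = [d for d in free if profit.get((d, r), 0) > 0]
        if not candidates:
            continue  # request rejected: no free compatible driver
        d = max(candidates, key=lambda d: profit[(d, r)])
        free.remove(d)  # each driver serves at most one request
        matching[r] = d
        total += profit[(d, r)]
    return matching, total


def uniform_match(drivers, requests, profit, rng=None):
    """Assign each arriving request uniformly at random among the free
    compatible drivers; reject if none exists."""
    rng = rng or random.Random(0)
    free = set(drivers)
    matching, total = {}, 0.0
    for r in requests:
        candidates = sorted(d for d in free if profit.get((d, r), 0) > 0)
        if not candidates:
            continue
        d = rng.choice(candidates)
        free.remove(d)
        matching[r] = d
        total += profit[(d, r)]
    return matching, total
```

On a toy instance with two drivers and three requests, Greedy matches the first request to the most profitable driver, serves the second with the remaining driver, and rejects the third once no drivers are free; such examples also show why a purely profit-driven rule can systematically starve low-profit request types, which is the unfairness the paper's (α, β) trade-off addresses.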