Reporting Non-Consensual Intimate Media: An Audit Study of Deepfakes

Li Qiwei, Shihui Zhang, Andrew Timothy Kasper, Joshua Ashkinaze, Asia A. Eaton, Sarita Schoenebeck, Eric Gilbert

arXiv - CS - Computers and Society · arXiv:2409.12138 · September 18, 2024
Abstract
Non-consensual intimate media (NCIM) inflicts significant harm. Currently, victim-survivors can use two mechanisms to report NCIM: as a non-consensual nudity violation or as copyright infringement. We conducted an audit study of the takedown speed of NCIM reported to X (formerly Twitter) under both mechanisms. We uploaded 50 AI-generated nude images and reported half under X's "non-consensual nudity" reporting mechanism and half under its "copyright infringement" mechanism. The copyright condition resulted in successful image removal within 25 hours for all images (100% removal rate), while non-consensual nudity reports resulted in no image removal for over three weeks (0% removal rate). We stress the need for targeted legislation to regulate NCIM removal online. We also discuss ethical considerations for auditing NCIM on social platforms.
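
The paper does not publish its measurement code, but the core of such an audit is straightforward: post content, report it under one of two conditions, then poll each item until it disappears or an observation window closes. The sketch below is a hypothetical illustration of that polling loop, not the authors' implementation; the URLs, the hourly interval, and the HTTP-404-means-removed heuristic are all assumptions for the example.

```python
# Minimal sketch of time-to-removal tracking for a takedown audit.
# All URLs and parameters are illustrative, not from the study.
import time
from datetime import datetime, timezone

import requests

# Hypothetical study setup: each posted image URL mapped to the
# reporting mechanism it was flagged under.
POSTS = {
    "https://example.com/img/001.jpg": "copyright",
    "https://example.com/img/002.jpg": "non_consensual_nudity",
}
POLL_INTERVAL_S = 3600                    # check hourly
OBSERVATION_WINDOW_S = 21 * 24 * 3600     # stop after ~three weeks

def is_removed(url: str) -> bool:
    """Treat a 404/410 response as evidence the image was taken down."""
    try:
        resp = requests.head(url, timeout=10, allow_redirects=True)
        return resp.status_code in (404, 410)
    except requests.RequestException:
        return False  # a network error is not evidence of removal

def monitor(posts: dict[str, str]) -> dict[str, datetime]:
    """Poll every URL until it is removed or the window closes;
    return removal timestamps keyed by URL."""
    removed: dict[str, datetime] = {}
    deadline = time.monotonic() + OBSERVATION_WINDOW_S
    while len(removed) < len(posts) and time.monotonic() < deadline:
        for url, condition in posts.items():
            if url not in removed and is_removed(url):
                removed[url] = datetime.now(timezone.utc)
                print(f"{condition}: {url} removed at {removed[url]}")
        time.sleep(POLL_INTERVAL_S)
    return removed  # URLs absent here were never removed (0% condition)

if __name__ == "__main__":
    monitor(POSTS)
```

Comparing removal timestamps against each item's report time, grouped by condition, yields the per-mechanism takedown rates and latencies the abstract reports.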