Title: How to deal with an AI near-miss: Look to the skies
Author: Kris Shrishak
Journal: Bulletin of the Atomic Scientists, 79(1), pp. 166–169
Published: 2023-05-04 (Journal Article)
DOI: 10.1080/00963402.2023.2199580 (https://doi.org/10.1080/00963402.2023.2199580)
Citations: 1
Abstract
AI systems are harming people. Harms such as discrimination and manipulation are reported in the media, which remains the primary source of information on AI incidents. Reporting AI near misses, and learning how serious incidents were averted, would help prevent future ones. The problem is that ongoing efforts to catalog AI incidents rely on media reports, an approach that documents harms after the fact rather than preventing them. Developers, designers, and deployers of AI systems should be incentivized to report and share information on near misses. Such an AI near-miss reporting system does not have to be designed from scratch; the aviation industry's voluntary, confidential, and non-punitive approach to such reporting can serve as a guide. AI incidents are accumulating, and the sooner a near-miss reporting system is established, the better.