{"title":"健康不公平和数字时代歧视的人工智能循环:通过欧盟医疗设备监管框架减少偏见","authors":"Hannah van Kolfschooten","doi":"10.1093/jlb/lsad031","DOIUrl":null,"url":null,"abstract":"Abstract The use of Artificial Intelligence (AI) medical devices is rapidly growing. Although AI may benefit the quality and safety of healthcare for older adults, it simultaneously introduces new ethical and legal issues. Many AI medical devices exhibit age-related biases. The first part of this paper explains how ‘digital ageism’ is produced throughout the entire lifecycle of medical AI and may lead to health inequity for older people: systemic, avoidable differences in the health status of different population groups. This paper takes digital ageism as a use case to show the potential inequitable effects of AI, conceptualized as the ‘AI cycle of health inequity’. The second part of this paper explores how the European Union (EU) regulatory framework addresses the issue of digital ageism. It argues that the negative effects of age-related bias in AI medical devices are insufficiently recognized within the regulatory framework of the EU Medical Devices Regulation and the new AI Act. It concludes that while the EU framework does address some of the key issues related to technical biases in AI medical devices by stipulating rules for performance and data quality, it does not account for contextual biases, therefore neglecting part of the AI cycle of health inequity.","PeriodicalId":2,"journal":{"name":"ACS Applied Bio Materials","volume":"40 15","pages":""},"PeriodicalIF":4.6000,"publicationDate":"2023-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"The AI cycle of health inequity and digital ageism: mitigating biases through the EU regulatory framework on medical devices\",\"authors\":\"Hannah van Kolfschooten\",\"doi\":\"10.1093/jlb/lsad031\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Abstract The use of Artificial Intelligence (AI) medical devices is rapidly growing. Although AI may benefit the quality and safety of healthcare for older adults, it simultaneously introduces new ethical and legal issues. Many AI medical devices exhibit age-related biases. The first part of this paper explains how ‘digital ageism’ is produced throughout the entire lifecycle of medical AI and may lead to health inequity for older people: systemic, avoidable differences in the health status of different population groups. This paper takes digital ageism as a use case to show the potential inequitable effects of AI, conceptualized as the ‘AI cycle of health inequity’. The second part of this paper explores how the European Union (EU) regulatory framework addresses the issue of digital ageism. It argues that the negative effects of age-related bias in AI medical devices are insufficiently recognized within the regulatory framework of the EU Medical Devices Regulation and the new AI Act. 
It concludes that while the EU framework does address some of the key issues related to technical biases in AI medical devices by stipulating rules for performance and data quality, it does not account for contextual biases, therefore neglecting part of the AI cycle of health inequity.\",\"PeriodicalId\":2,\"journal\":{\"name\":\"ACS Applied Bio Materials\",\"volume\":\"40 15\",\"pages\":\"\"},\"PeriodicalIF\":4.6000,\"publicationDate\":\"2023-12-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ACS Applied Bio Materials\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://doi.org/10.1093/jlb/lsad031\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"MATERIALS SCIENCE, BIOMATERIALS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACS Applied Bio Materials","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1093/jlb/lsad031","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATERIALS SCIENCE, BIOMATERIALS","Score":null,"Total":0}
The AI cycle of health inequity and digital ageism: mitigating biases through the EU regulatory framework on medical devices
Abstract The use of Artificial Intelligence (AI) medical devices is growing rapidly. Although AI may improve the quality and safety of healthcare for older adults, it also introduces new ethical and legal issues. Many AI medical devices exhibit age-related biases. The first part of this paper explains how ‘digital ageism’ is produced throughout the entire lifecycle of medical AI and how it may lead to health inequity for older people, that is, systemic, avoidable differences in the health status of different population groups. The paper takes digital ageism as a use case to show the potentially inequitable effects of AI, conceptualized as the ‘AI cycle of health inequity’. The second part explores how the European Union (EU) regulatory framework addresses digital ageism. It argues that the negative effects of age-related bias in AI medical devices are insufficiently recognized within the regulatory framework of the EU Medical Devices Regulation and the new AI Act. It concludes that while the EU framework addresses some of the key issues related to technical biases in AI medical devices by stipulating rules on performance and data quality, it does not account for contextual biases and therefore neglects part of the AI cycle of health inequity.