{"title":"信息自由制度下的算法政府:公共教育部门ADM系统的个案分析","authors":"María Estrella Gutiérrez David","doi":"10.2979/gls.2023.a886165","DOIUrl":null,"url":null,"abstract":"Abstract: What the Houston Court qualified as \"mysterious 'black box' impervious to challenge\" was in practice a sophisticated software of many layers of calculations, which rated teachers' effectiveness to make employment decisions. In the European Union, a system as such would fall under the Proposal for AI Regulation of 2021, which qualifies AI models in education and vocational training as \"high-risk\" systems. Automated decision-making systems (ADM systems), AI-driven or not, are being increasingly used by governments in public education for different purposes, such as handling applications for undergraduate admission or profiling students and teachers to assess their performance. Across cases and jurisdictions, there is growing evidence of how the use of ADM systems in the education sector is becoming quite problematic: arbitrary assignment of teaching posts in mobility procedures, undue barriers to access undergraduate studies, and frequent lack of transparency in their implementation and decisions. This Article discusses how Freedom of Information Act (FOIA) regimes may contribute to rendering governments' ADM systems (AI-driven or not) accountable. The analysis of the FOIA cases (Parcoursoup saga in France, MIUR in Italy, and Ofqual in the United Kingdom) shows to what extent decisions granting access to the source code, functional and technical specifications, or third-party audits allow public scrutiny of ADM systems, detection of their pathologies, and better understanding of their adverse impacts on rights and freedoms, individual or collective. 
This Article also addresses the constitutional value of the right of access to public records (Parcoursup), and the importance of proactive and mandatory public dissemination to ensure traceability, transparency, and accountability of the ADM systems for FOIA purposes. In this sense, some legal initiatives across jurisdictions (Canada, France, Spain, United States, European Union) enhancing transparency and accountability of algorithmic systems will be examined.","PeriodicalId":39188,"journal":{"name":"Indiana Journal of Global Legal Studies","volume":"33 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Government by Algorithms at the Light of Freedom of Information Regimes: A Case-by-Case Approach on ADM Systems within Public Education Sector\",\"authors\":\"María Estrella Gutiérrez David\",\"doi\":\"10.2979/gls.2023.a886165\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Abstract: What the Houston Court qualified as \\\"mysterious 'black box' impervious to challenge\\\" was in practice a sophisticated software of many layers of calculations, which rated teachers' effectiveness to make employment decisions. In the European Union, a system as such would fall under the Proposal for AI Regulation of 2021, which qualifies AI models in education and vocational training as \\\"high-risk\\\" systems. Automated decision-making systems (ADM systems), AI-driven or not, are being increasingly used by governments in public education for different purposes, such as handling applications for undergraduate admission or profiling students and teachers to assess their performance. 
Across cases and jurisdictions, there is growing evidence of how the use of ADM systems in the education sector is becoming quite problematic: arbitrary assignment of teaching posts in mobility procedures, undue barriers to access undergraduate studies, and frequent lack of transparency in their implementation and decisions. This Article discusses how Freedom of Information Act (FOIA) regimes may contribute to rendering governments' ADM systems (AI-driven or not) accountable. The analysis of the FOIA cases (Parcoursoup saga in France, MIUR in Italy, and Ofqual in the United Kingdom) shows to what extent decisions granting access to the source code, functional and technical specifications, or third-party audits allow public scrutiny of ADM systems, detection of their pathologies, and better understanding of their adverse impacts on rights and freedoms, individual or collective. This Article also addresses the constitutional value of the right of access to public records (Parcoursup), and the importance of proactive and mandatory public dissemination to ensure traceability, transparency, and accountability of the ADM systems for FOIA purposes. 
In this sense, some legal initiatives across jurisdictions (Canada, France, Spain, United States, European Union) enhancing transparency and accountability of algorithmic systems will be examined.\",\"PeriodicalId\":39188,\"journal\":{\"name\":\"Indiana Journal of Global Legal Studies\",\"volume\":\"33 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Indiana Journal of Global Legal Studies\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.2979/gls.2023.a886165\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"Social Sciences\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Indiana Journal of Global Legal Studies","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2979/gls.2023.a886165","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"Social Sciences","Score":null,"Total":0}