Legal AI from a Privacy Point of View: Data Protection and Transparency in Focus

Cecilia Magnusson Sjöberg
{"title":"Legal AI from a Privacy Point of View: Data Protection and Transparency in Focus","authors":"Cecilia Magnusson Sjöberg","doi":"10.16993/BBK.H","DOIUrl":null,"url":null,"abstract":"A major starting point is that transparency is a condition for privacy in the context of personal data processing, especially when based on artificial intelligence (AI) methods. A major keyword here is openness, which however is not equivalent to transparency. This is explained by the fact that an organization may very well be governed by principles of openness but still not provide transparency due to insufficient access rights and lacking implementation of those rights. Given these hypotheses, the chapter investigates and illuminates ways forward in recognition of algorithms, machine learning, and big data as critical success factors of personal data processing based on AI—that is, if privacy is to be preserved. In these circumstances, autonomy of technology calls for attention and needs to be challenged from a variety of perspectives. Not least, a legal approach to digital human sciences appears to be a resource to examine further. This applies, for instance, when data subjects in the public as well as in the private sphere are exposed to AI for better or for worse. Providing what may be referred to as a legal shield between user and application might be one remedy to shortcomings in this context.","PeriodicalId":332163,"journal":{"name":"Digital Human Sciences: New Objects – New Approaches","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-06-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Digital Human Sciences: New Objects – New Approaches","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.16993/BBK.H","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

A major starting point is that transparency is a condition for privacy in the context of personal data processing, especially when that processing is based on artificial intelligence (AI) methods. A major keyword here is openness, which, however, is not equivalent to transparency: an organization may very well be governed by principles of openness and still fail to provide transparency, owing to insufficient access rights and a lack of implementation of those rights. Given these hypotheses, the chapter investigates and illuminates ways forward, recognizing algorithms, machine learning, and big data as critical success factors of AI-based personal data processing, that is, if privacy is to be preserved. In these circumstances, the autonomy of technology calls for attention and needs to be challenged from a variety of perspectives. Not least, a legal approach to the digital human sciences appears to be a resource worth examining further. This applies, for instance, when data subjects in the public as well as the private sphere are exposed to AI, for better or for worse. Providing what may be referred to as a legal shield between user and application might be one remedy for shortcomings in this context.