
Information Privacy Law eJournal: Latest Publications

Policy Responses to Cross-border Central Bank Digital Currencies – Assessing the Transborder Effects of Digital Yuan
Pub Date : 2021-08-08 DOI: 10.2139/ssrn.3891208
Cheng-Yun Tsang, Ping-Kuei Chen
Current literature on central bank digital currencies (CBDCs) generally focuses on regulatory issues in the domestic context. This paper discusses the challenges when a CBDC circulates across national borders. It addresses three cross-border spillover effects of the CBDC: the crowding out effect on local currency; challenges to capital control for regulators; and infringement of user privacy. The paper posits the Digital Yuan as the sample on which spillover effects can be assessed as it is circulated beyond its borders. It is estimated that the major fund receivers of the Belt and Road Initiative (BRI), and China’s neighbors, are most likely to be affected by the Digital Yuan. These countries will benefit from convenient, efficient, and secure transactions as the Digital Yuan circulates. But they may face problems when the Digital Yuan becomes widely used in local markets. They will find it difficult to control or monitor the flow of the Digital Yuan, and will have to take measures to protect the privacy of their domestic users. The authors therefore propose unilateral, bilateral, and multilateral strategies to cope with the corresponding spillover effects. The paper’s analysis suggests that the adverse effects of the cross-border uses of CBDCs can be addressed and mitigated by adequate institutional design, and by multilateral coordination efforts.
Citations: 2
Artificial Intelligence in the Internet of Health Things: Is the Solution to AI Privacy More AI?
Pub Date : 2021-05-01 DOI: 10.2139/ssrn.3838571
Liane Colonna
The emerging power of Artificial Intelligence (AI), driven by the exponential growth in computer processing and the digitization of things, has the capacity to bring unfathomable benefits to society. In particular, AI promises to reinvent modern healthcare through devices that can predict, comprehend, learn, and act in astonishing and novel ways. While AI has an enormous potential to produce societal benefits, it will not be a sustainable technology without developing solutions to safeguard privacy while processing ever-growing sets of sensitive data.

This paper considers the tension that exists between privacy and AI and examines how AI and privacy can coexist, enjoying the advantages that each can bring. Rejecting the idea that AI means the end of privacy, and taking a technoprogressive stance, the paper seeks to explore how AI can be actively used to protect individual privacy. It contributes to the literature by reconfiguring AI not as a source of threats and challenges, but rather as a phenomenon that has the potential to empower individuals to protect their privacy.

The first part of the paper sets forward a brief taxonomy of AI and clarifies its role in the Internet of Health Things (IoHT). It then addresses privacy concerns that arise in this context. Next, the paper shifts towards a discussion of Data Protection by Design, exploring how AI can be utilized to meet this standard and in turn preserve individual privacy and data protection rights in the IoHT. Finally, the paper presents a case study of how some are actively using AI to preserve privacy in the IoHT.
Citations: 1
Comments on GDPR Enforcement EDPB Decision 01/020
Pub Date : 2021-01-10 DOI: 10.2139/ssrn.3765602
C. Hodges
The European Data Protection Board issued its first Binding Decision on 9 November 2020 in a case in which the Irish Data Commissioner (DPA) was the lead enforcement authority. In the judgment of the Irish DPA, a fine of up to EUR 275,000 was appropriate, taking into account all relevant circumstances, including aggravating and mitigating factors. Several other national DPAs raised objections, including the German DPA, which thought that a fine of up to EUR 22 million was relevant, on the basis that it should be 'dissuasive' and therefore 'must be high enough to make data processing uneconomic and objectively inefficient'. Under the GDPR, the EDPB considered all objections, and rejected a surprising number as not satisfying the 'relevant and reasoned' standard. The EDPB issued a binding decision that a sanction must be 'deterrent' and required the Irish DPA to revise its fine. The Irish DPA issued a fine of EUR 450,000. This paper highlights the major rift in theory and practice between different approaches to the effects, if any, of financial sanctions. The case raises fundamental issues over the consistency and coherence of EU enforcement policy, and the level of confidence that may be placed in it. It identifies a conflict between traditional concepts of deterrence, effective, proportionate and dissuasive sanctions, and outcome-focused achievement of compliance. It also raises an underlying conflict between pure economic theory on the effectiveness of penalties and the findings of behavioral science on how to affect future behavior.
Citations: 0
Privacy Rights and Data Security: GDPR and Personal Data Driven Markets
Pub Date : 2020-07-05 DOI: 10.2139/ssrn.3643979
T. Ke, K. Sudhir
The paper investigates how the two key features of GDPR (EU’s data protection regulation)— privacy rights and data security—impact personal data driven markets. First, GDPR recognizes that individuals own and control their data in perpetuity, leading to three critical privacy rights: (i) the right to explicit consent (data opt-in), (ii) the right to be forgotten (data erasure), and (iii) the right to portability (switching data to a competitor). Second, GDPR imposes data security mandates that protect against privacy breaches through unauthorized access. The right to explicit opt-in allows goods exchange without data exchange. Erasure and portability rights discipline firms to provide ongoing value and reduce consumers’ holdup using their own data. Overall, privacy rights restrict legal collection and use, while data security protects against illegal access and use. We develop a two-period model of forward-looking firms and consumers in which consumers exercise data privacy rights by balancing the costs (privacy breach, price discrimination) and benefits (product personalization, price subsidies) of sharing data with firms. We find that by reducing expected privacy breach costs, data security mandates increase opt-in, consumer surplus and firm profit. Privacy rights reduce opt-in and mostly increase consumer surplus at the expense of firm profits; interestingly, they hurt firms more in competitive than in monopolistic markets. While privacy rights can reduce surplus for both firms and consumers, these conditions are unlikely to be realized when breach risk is endogenized. Further, by unbundling data exchange from goods exchange, privacy rights facilitate trade in goods that may otherwise fail to occur due to privacy breach risk.
Citations: 15
Big Boss is Watching You! The Right to Privacy of Employees in the Context of Workplace Surveillance
Pub Date : 2020-05-21 DOI: 10.2139/ssrn.3740078
Fidan Abdurrahimli
Workplace surveillance is a necessity prompted by the development of information and communication technologies, which offer employers huge opportunities to monitor their employees at work and even outside of work, raising serious concerns about employee privacy. The present thesis examines such concerns of employees arising out of workplace surveillance. The legal protection of privacy within certain systems, particularly within the European Convention on Human Rights, is considered. After examining the substantive matters of the right to respect for private life under the Convention, four cases of the European Court of Human Rights concerning employee privacy at work are studied thoroughly, and an analysis of each case is provided. Through this examination, the scope of protection of the right to privacy of employees in the context of workplace surveillance is expounded. Furthermore, certain specific problems regarding the protection of privacy are highlighted and, where relevant, possible solutions are presented.
Citations: 1
The New Californian Data Protection Law – In the Light of the EU General Data Protection Regulation
Pub Date : 2020-03-20 DOI: 10.2139/ssrn.3557964
T. Hoeren, Stefan Pinelli
On 1 January 2020, the data protection law of the US state of California will change fundamentally. At that time, the California Consumer Privacy Act of 2018 (CCPA) will enter into force, with far-reaching obligations for companies handling personal information. This article aims to give an overview of the new regulation. In addition, the new legal status in California will be outlined and compared with the model of the European Union General Data Protection Regulation (GDPR) before concrete guidelines for global corporations and their data protection policies are developed.
Citations: 2
Labour in the Age of AI: Why Regulation is Needed to Protect Workers
Pub Date : 2020-02-19 DOI: 10.2139/ssrn.3541002
Aida Ponce
Superpowers, states and companies around the world are all pushing hard to win the AI race. Artificial intelligence (AI) is of strategic importance for the EU, with the European Commission recently stating that ‘artificial intelligence with a purpose can make Europe a world leader’. For this to happen, though, the EU needs to put in place the right ethical and legal framework. This Foresight Brief argues that such a framework must be solidly founded on regulation – which can be achieved by updating existing legislation – and that it must pay specific attention to the protection of workers. Workers are in a subordinate position in relation to their employers, and in the EU’s eagerness to win the AI race, their rights may be overlooked. This is why a protective and enforceable legal framework must be developed, with the participation of social partners.
Citations: 8
The Myth of the Privacy Paradox
Pub Date : 2020-02-11 DOI: 10.2139/ssrn.3536265
Daniel J. Solove
In this article, Professor Daniel Solove deconstructs and critiques the privacy paradox and the arguments made about it. The “privacy paradox” is the phenomenon where people say that they value privacy highly, yet in their behavior relinquish their personal data for very little in exchange or fail to use measures to protect their privacy.

Commentators typically make one of two types of arguments about the privacy paradox. On one side, the “behavior valuation argument” contends behavior is the best metric to evaluate how people actually value privacy. Behavior reveals that people ascribe a low value to privacy or readily trade it away for goods or services. The argument often goes on to contend that privacy regulation should be reduced.

On the other side, the “behavior distortion argument” argues that people’s behavior isn’t an accurate metric of preferences because behavior is distorted by biases and heuristics, manipulation and skewing, and other factors.

In contrast to both of these camps, Professor Solove argues that the privacy paradox is a myth created by faulty logic. The behavior involved in privacy paradox studies involves people making decisions about risk in very specific contexts. In contrast, people’s attitudes about their privacy concerns or how much they value privacy are much more general in nature. It is a leap in logic to generalize from people’s risk decisions involving specific personal data in specific contexts to reach broader conclusions about how people value their own privacy.

The behavior in the privacy paradox studies doesn’t lead to a conclusion for less regulation. On the other hand, minimizing behavioral distortion will not cure people’s failure to protect their own privacy. It is perfectly rational for people — even without any undue influences on behavior — to fail to make good assessments of privacy risks and to fail to manage their privacy effectively. Managing one’s privacy is a vast, complex, and never-ending project that does not scale; it becomes virtually impossible to do comprehensively. Privacy regulation often seeks to give people more privacy self-management, such as the recent California Consumer Privacy Act. Professor Solove argues that giving individuals more tasks for managing their privacy will not provide effective privacy protection. Instead, regulation should employ a different strategy — focus on regulating the architecture that structures the way information is used, maintained, and transferred.
Citations: 107
Personal Data and Consumer Welfare in the Digital Economy
Pub Date : 2020-02-05 DOI: 10.2139/ssrn.3545497
Smriti Parsheera, Sarang Moharir
Safeguarding user rights and maximising consumer welfare in the digital economy, particularly in the context of personal data, requires an integrated approach that cuts across the fields of competition, consumer protection, and data protection. While the legal interventions in each of these fields are geared towards securing better outcomes for individuals, often in their capacity as consumers, there are significant differences in the available tools and remedies. Current and proposed regulatory frameworks in India, however, continue with a silo-based approach offering limited scope for cross-sectional analysis of consumer welfare issues in digital markets. We argue for the need to create appropriate legal and institutional mechanisms to facilitate interactions across the fields of competition, consumer protection, and data protection policies as well as sectoral policies.
Citations: 0
Data Protection and Artificial Intelligence Law: Europe Australia Singapore - An Actual or Perceived Dichotomy?
Pub Date : 2019-12-13 DOI: 10.2139/ssrn.3503392
Robert Walters, Matthew Coghlan
Artificial Intelligence (AI) is moving so rapidly that policy makers, regulators, governments and the legal profession are struggling to keep up. However, AI is not new; it has been used for more than two decades. Coupled with personal data and cyber security law, the challenges AI poses to the current legal frameworks are nothing short of immense. They are, in part, at odds with each other, and are doing very different things. This paper explores some of the challenges emerging in Australia, Europe and Singapore. The challenge of the interrelationship between personal data and AI arguably begins with who has manufactured the AI, and secondly with who owns the AI. Another challenge that has emerged is defining AI. Most people are able to understand what AI is and how it is beginning to impact the economy and our daily lives. However, there is no clear legal definition of AI, because AI is so nebulous. This burgeoning area of law is going to challenge society, privacy experts, regulators and innovators of technology, as there continues to be a collision between the two. Furthermore, the collection of personal data by AI challenges the notion of where responsibility lies. That is, AI may collect, use and disclose personal data at different points along the technology chain. It will be highlighted how current data protection laws, rather than promoting AI projects, largely inhibit their development. This paper identifies some of the tensions between data protection law and AI, and argues that there is a need for an urgent and detailed understanding of the opportunities and the legal and ethical issues associated with data protection and AI. Doing so will ensure an ongoing balance between the economic, social and human rights issues that are attached to the two areas of the law.
Citations: 3