Pub Date: 2018-03-15. DOI: 10.11575/sppp.v11i0.43356
Goran Pešić
Cyber-crime is growing exponentially and Canadian governments at all levels have not kept pace quickly enough to protect both themselves and private enterprise. Evolving technology allows for ever-more sophisticated cyber-threats to intellectual property, but some businesses and governments have neither changed their pre-internet thinking nor established adequate safeguards. Protection should start with educational campaigns about the scope and varieties of risk that permeate the private sector, e-commerce and smart cities using the internet of things. Thirty years ago, just 32 per cent of the market value of Standard & Poor’s 500 companies was based on intangible assets, mainly intellectual property. Today, that figure stands at 80 per cent and protecting those assets from cyber-crime is of vital importance. While cyber-criminals look to make money from phishing scams, their interests have also extended to infiltrating proprietary industrial designs, resource management and information affecting acquisitions. The fact that some countries see this type of crime as a normal way to gain access to foreign business information is often poorly understood by Canadian businesses accustomed to functioning under much higher ethical standards. The e-commerce realm faces its own cyber-threats, including those affecting privacy, data sovereignty, location of data centres, data security and legislation. E-commerce merchants must protect themselves by ensuring the security of their clients’ computers, communication channels, web servers and data encryption. It sounds daunting, but it shouldn’t be. Merchants can take steps such as doing risk assessments, developing security policies, establishing a single point of security oversight, instituting authentication processes using biometrics, auditing security and maintaining an emergency reporting system.
Government can assist with cyber-security in Canada’s private sector through awareness campaigns, rewarding businesses for best practices, providing tax credits to offset the cost of security measures, and offering preferential lending and insurance deals from government institutions. The federal government’s 2015 Digital Privacy Act was a good first step, but there is much territory left to be covered. The act offers little assistance in making the leap from a pre-internet governmental model of doing business with the private sector. Nor does it acknowledge the full costs organizations must face when contemplating improving their cyber-security. The growth of smart cities, connected to the internet of things, creates new susceptibilities to cyber-crime. By 2021, there will be approximately 28 billion internet-connected devices globally and 16 billion of those will be related to the internet of things. However, smart cities appear to be low on the list of cyber-security priorities at all levels of government. There is a lack of local guidance and commitment, an absence of funding programs and tax incentives for […]
“Surviving and Thriving in the Digital Economy” by Goran Pešić. Information Privacy Law eJournal, 2018-03-15. DOI: 10.11575/sppp.v11i0.43356
The field of consumer Internet of Things (IoT) has exploded as businesses and researchers have sought not only to develop Internet-connected products but also to define the common structure in which IoT devices will operate, including technological standards and responsive architectures. Yet consumer IoT continues to present a host of potential risks to consumers, cascading from the multidimensional nature of IoT devices: IoT combines well-known consumer products with cutting-edge infrastructures including big data solutions, distributed data storage or “cloud,” and artificial intelligence (AI) utilities. The consumer device is no longer only the product: it is the product, the data, the algorithms, and the infrastructure. Consumer products have shifted from analog to connected technologies, introducing new risks for consumers related to personal privacy, safety issues, and the potential for discriminatory data. Broad, ubiquitous data collection, internet connectivity, predictive algorithms, and the overall opacity of device functionality threaten to undermine IoT market benefits by causing potential consumer injury: broad unfairness and disparate impact, data breaches, physical safety issues, and property damage. Existing regulatory regimes have not anticipated these damages effectively enough to avoid injury, and it is as yet unknown how existing products liability, common-law civil recovery under contract or tort schemes, and due process procedures will apply to these products and the data they process. This Article explores the technology and market of IoT, potential consumer impacts resulting from the lack of a consistent and complete legal framework, whether IoT regulation is appropriate, and how the United States can balance market needs for innovation with consistent oversight for IoT manufacturers and distributors.
“Regulating the IoT: Discrimination, Privacy, and Cybersecurity in the Artificial Intelligence Age” by Charlotte Tschider. Information Privacy Law eJournal, 2018-02-24. DOI: 10.2139/ssrn.3129557
Informational privacy is the ability to govern when others may collect your information and how they may use it. Norms can, and do, provide such governance, as Helen Nissenbaum’s seminal work shows. The relevant norms are informational norms, social norms that govern the collection, use, and distribution of information. With noteworthy exceptions (Woodrow Hartzog and Neil Richards, for example), contemporary discussions of privacy rarely mention informational norms, or at best assign them a peripheral role. We claim they should play a central role. Our argument is that ensuring adequate informational privacy is (at least in part) a collective action problem. Norms can, and often do, solve collective action problems. Further, informational norms currently do solve a wide range of important collective action problems centered around privacy. Shouldn’t informational norms take center stage in proposals about informational privacy? We argue they should by answering three objections to giving them that role. (1) Lack of norms: Rapid advances in technology have created a wide variety of situations that are not governed by relevant norms. (2) Disagreement about norms: Even if relevant norms exist, lack of agreement about their content makes them a poor foundation on which to build public policy. (3) Lack of an adequate theory: Even if norms exist and their content is uncontroversial, norms are a poor tool for public policy because there is no adequate theory that allows one to make accurate predictions about the causes and effects of norms. The first two objections have relatively easy answers. The third is fundamental. We outline a theory that treats norm-created informational privacy as a commons—a special kind of commons, a common pool resource. We thereby link norm-created privacy to a rich body of empirical and theoretical work.
We hope the resulting theory of norm-created governance of information flows contributes to the understanding of privacy that Neil Richards and Jonathan King call for in Big Data Ethics: “privacy in the age of big data should be . . . understood as the need to expand the rules we use to govern the flows of personal information.”
“Why Are Norms Ignored? Collective Action and the Privacy Commons” by R. Sloan and Richard Warner. Information Privacy Law eJournal, 2018-02-18. DOI: 10.2139/SSRN.3125832
The right of access occupies a central role in EU data protection law's arsenal of data subject empowerment measures. It can be seen as a necessary enabler for most other data subject rights, as well as playing an important role in monitoring operations and (en)forcing compliance. Despite some high-profile revelations regarding unsavoury data processing practices over the past few years, access rights still appear to be underused and not properly accommodated. It is especially this last hypothesis that we tried to investigate and substantiate through a legal empirical study. During the first half of 2017, around sixty information society service providers were contacted with data subject access requests. Eventually, the study confirmed the general suspicion that access rights are by and large not adequately accommodated. The systematic approach did allow for a more granular identification of key issues and broader problematic trends. Notably, it uncovered an often-flagrant lack of awareness, organisation, motivation, and harmonisation. Despite the poor results of the empirical study, we still believe there to be an important role for data subject empowerment tools in a hyper-complex, automated and ubiquitous data-processing ecosystem. Even if only used marginally, they provide a checks-and-balances infrastructure overseeing controllers' processing operations, both on an individual basis as well as collectively. The empirical findings also allow identifying concrete suggestions aimed at controllers, such as relatively easy fixes in privacy policies and access rights templates.
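The request-and-audit workflow behind such a study could be sketched roughly as follows; the provider names, template wording, and response categories here are illustrative assumptions, not details taken from the paper:

```python
from dataclasses import dataclass
from string import Template

# Hypothetical access-request letter; the wording is an illustrative
# assumption, not the template the authors actually sent.
REQUEST_TEMPLATE = Template(
    "Dear $provider,\n"
    "Under the right of access in EU data protection law, I request a copy "
    "of all personal data you process about me, the purposes of the "
    "processing, and the recipients to whom the data are disclosed.\n"
)

@dataclass
class AccessRequest:
    provider: str
    response: str  # e.g. "complete", "partial", "ignored" (illustrative labels)

def draft_request(provider: str) -> str:
    """Fill the letter template for one service provider."""
    return REQUEST_TEMPLATE.substitute(provider=provider)

def tally(requests: list[AccessRequest]) -> dict[str, int]:
    """Count how many requests fell into each response category."""
    counts: dict[str, int] = {}
    for r in requests:
        counts[r.response] = counts.get(r.response, 0) + 1
    return counts

# Fictitious providers and outcomes, standing in for the ~60 real requests.
requests = [
    AccessRequest("ExampleMail", "partial"),
    AccessRequest("ExampleShop", "ignored"),
    AccessRequest("ExampleMaps", "complete"),
    AccessRequest("ExampleChat", "ignored"),
]
print(tally(requests))  # counts per response category
```

Tallying categorized responses in this way is what lets a study like this move from anecdote to the "granular identification of key issues and broader problematic trends" the abstract describes.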
“Shattering One-Way Mirrors. Data Subject Access Rights in Practice” by J. Ausloos and Pierre Dewitte. Information Privacy Law eJournal, 2018-01-20. DOI: 10.1093/IDPL/IPY001
This article analyses, defines, and refines the concepts of ownership and personal data to explore their compatibility in the context of EU law. It critically examines the traditional dividing line between personal and non-personal data and argues for a strict conceptual separation of personal data from personal information. The article also considers whether, and to what extent, the concept of ownership can be applied to personal data in the context of the Internet of Things (IoT). This consideration is framed around the two main approaches shaping all ownership theories: a bottom-up and a top-down approach. Via these dual lenses, the article reviews existing debates relating to four elements supporting the introduction of ownership of personal data, namely the elements of control, protection, valuation, and allocation of personal data. It then explores the explanatory advantages and disadvantages of the two approaches in relation to each of these elements, as well as to ownership of personal data in IoT at large. Lastly, the article outlines a revised approach to ownership of personal data in IoT that may serve as a blueprint for future work in this area and inform regulatory and policy debates.
“Ownership of Personal Data in the Internet of Things” by V. Janeček. Information Privacy Law eJournal, 2017-12-01. DOI: 10.2139/ssrn.3111047
This Article discusses how governments, intellectual property owners, and technology companies use the law to disrupt access to the intermediaries used by financially motivated cybercriminals. Just like licit businesses, illicit firms rely on intermediaries to advertise, sell and deliver products, collect payments, and maintain a reputation. Recognizing these needs, law enforcers use the courts, administrative procedures, and self-regulatory frameworks to execute a deterrence-by-denial strategy: they seize the financial rewards and infrastructures necessary for the operation of illicit firms in order to deter their presence. Policing illicit actors through their intermediaries raises due process and fairness concerns, because service-providing companies may not be aware of the criminal activity, and because enforcement actions have consequences for consumers and for other, licit firms. Yet achieving direct deterrence by punishment suffers from jurisdictional and resource constraints, leaving enforcers with few other options for remedy. This Article integrates literature from the computer science and legal fields to explain enforcers' interventions, explore their efficacy, and evaluate the merits and demerits of enforcement efforts focused on the intermediaries used by financially motivated cybercriminals.
“Deterring Cybercrime: Focus on Intermediaries” by Aniket Kesari, C. Hoofnagle, and Damon McCoy. Information Privacy Law eJournal, 2017-12-01. DOI: 10.15779/Z387M04086
The construct of an information dichotomy has played a defining role in regulating privacy: information deemed private or sensitive typically earns high levels of protection, while lower levels of protection are accorded to information deemed public or non-sensitive. Challenging this dichotomy, the theory of contextual integrity associates privacy with complex typologies of information, each connected with respective social contexts. Moreover, it contends that information type is merely one among several variables that shape people’s privacy expectations and underpin privacy’s normative foundations. Other contextual variables include key actors - information subjects, senders, and recipients - as well as the principles under which information is transmitted, such as whether with subjects’ consent, as bought and sold, as required by law, and so forth. Prior work revealed the systematic impact of these other variables on privacy assessments, thereby debunking the defining effects of so-called private information. In this paper, we shine a light on the opposite effect, challenging conventional assumptions about public information. The paper reports on a series of studies, which probe attitudes and expectations regarding information that has been deemed public. Public records established through the historical practice of federal, state, and local agencies, as a case in point, are afforded little privacy protection, or possibly none at all. Motivated by progressive digitization and the creation of online portals through which these records have been made publicly accessible, our work underscores the need for more concentrated and nuanced privacy assessments, even more urgent in the face of vigorous open data initiatives, which call on federal, state, and local agencies to provide access to government records in both human- and machine-readable forms.
Within a stream of research suggesting possible guard rails for open data initiatives, our work, guided by the theory of contextual integrity, provides insight into the factors systematically shaping individuals’ expectations and normative judgments concerning appropriate uses of and terms of access to information. Using a factorial vignette survey, we asked respondents to rate the appropriateness of a series of scenarios in which contextual elements were systematically varied; these elements included the data recipient (e.g. bank, employer, friend), the data subject, and the source, or sender, of the information (e.g. individual, government, data broker). Because the object of this study was to highlight the complexity of people’s privacy expectations regarding so-called public information, information types were drawn from data fields frequently held in public government records (e.g. voter registration, marital status, criminal standing, and real property ownership). Our findings are noteworthy on both theoretical and practical grounds. In the first place, they reinforce key assertions of contextual integrity […]
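A factorial vignette design of this kind can be sketched as follows; the contextual values are drawn loosely from the examples above, and the scenario wording is an illustrative assumption, not the authors' actual survey instrument:

```python
from itertools import product

# Illustrative factorial vignette generator in the spirit of contextual
# integrity: every combination of contextual elements yields one scenario
# for respondents to rate. The value lists below are examples only.
recipients = ["bank", "employer", "friend"]
senders = ["individual", "government", "data broker"]
info_types = ["voter registration", "marital status",
              "criminal standing", "real property ownership"]

def make_vignettes() -> list[str]:
    """Cross every contextual element to get one scenario per combination."""
    return [
        f"Source: {sender}. Recipient: {recipient}. Information: {info}. "
        "How appropriate is this information flow?"
        for recipient, sender, info in product(recipients, senders, info_types)
    ]

vignettes = make_vignettes()
print(len(vignettes))  # 3 recipients x 3 senders x 4 info types = 36 scenarios
```

Systematically crossing the contextual elements, rather than varying one at a time, is what lets this design isolate the independent effect of each variable (recipient, sender, information type) on respondents' appropriateness ratings.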
{"title":"Privacy Interests In Public Records: An Empirical Investigation","authors":"Kirsten E. Martin, H. Nissenbaum","doi":"10.2139/ssrn.2875720","DOIUrl":"https://doi.org/10.2139/ssrn.2875720","url":null,"abstract":"The construct of an information dichotomy has played a defining role in regulating privacy: information deemed private or sensitive typically earns high levels of protection, while lower levels of protection are accorded to information deemed public or non-sensitive. Challenging this dichotomy, the theory of contextual integrity associates privacy with complex typologies of information, each connected with respective social contexts. Moreover, it contends that information type is merely one among several variables that shape people’s privacy expectations and underpin privacy’s normative foundations. Other contextual variables include key actors - information subjects, senders, and recipients - as well as the principles under which information is transmitted, such as whether with subjects’ consent, as bought and sold, as required by law, and so forth. Prior work revealed the systematic impact of these other variables on privacy assessments, thereby debunking the defining effects of so-called private information. In this paper, we shine a light on the opposite effect, challenging conventional assumptions about public information. The paper reports on a series of studies, which probe attitudes and expectations regarding information that has been deemed public. Public records established through the historical practice of federal, state, and local agencies, as a case in point, are afforded little privacy protection, or possibly none at all. 
Motivated by progressive digitization and creation of online portals through which these records have been made publicly accessible our work underscores the need for more concentrated and nuanced privacy assessments, even more urgent in the face of vigorous open data initiatives, which call on federal, state, and local agencies to provide access to government records in both human and machine readable forms. Within a stream of research suggesting possible guard rails for open data initiatives, our work, guided by the theory of contextual integrity, provides insight into the factors systematically shaping individuals’ expectations and normative judgments concerning appropriate uses of and terms of access to information. Using a factorial vignette survey, we asked respondents to rate the appropriateness of a series of scenarios in which contextual elements were systematically varied; these elements included the data recipient (e.g. bank, employer, friend,.), the data subject, and the source, or sender, of the information (e.g. individual, government, data broker). Because the object of this study was to highlight the complexity of people’s privacy expectations regarding so-called public information, information types were drawn from data fields frequently held in public government records (e.g. voter registration, marital status, criminal standing, and real property ownership). Our findings are noteworthy on both theoretical and practical grounds. 
In the first place, they reinforce key assertions of contextual integ","PeriodicalId":179517,"journal":{"name":"Information Privacy Law eJournal","volume":"92 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122541102","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
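The factorial vignette method described in this abstract can be sketched briefly: vignettes are generated as the Cartesian product of the contextual factors (recipient, sender, information type), and each respondent rates a sample of the resulting scenarios. The factor levels below are taken from the abstract's examples; the template wording and sample size are hypothetical illustrations, not the study's actual instrument.

```python
import itertools
import random

# Contextual factors; levels are the examples named in the abstract.
RECIPIENTS = ["a bank", "an employer", "a friend"]
SENDERS = ["the individual", "a government agency", "a data broker"]
INFO_TYPES = ["voter registration", "marital status",
              "criminal standing", "real property ownership"]

# Hypothetical vignette template; the actual survey wording differs.
TEMPLATE = ("{sender} shares a person's {info} record with {recipient}. "
            "How appropriate is this? (1 = not at all, 5 = completely)")

def build_vignettes():
    """Full factorial design: one vignette per factor combination."""
    return [TEMPLATE.format(sender=s, info=i, recipient=r)
            for r, s, i in itertools.product(RECIPIENTS, SENDERS, INFO_TYPES)]

vignettes = build_vignettes()
print(len(vignettes))  # 3 recipients x 3 senders x 4 info types = 36

# Each respondent would rate only a random subset of scenarios.
shown_to_respondent = random.sample(vignettes, 8)
```

Systematically varying every factor is what lets the analysis separate the effect of, say, the sender (government vs. data broker) from the effect of the information type itself.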
Relative Privacy Valuations Under Varying Disclosure Characteristics
Pub Date : 2017-05-18 DOI: 10.2139/ssrn.3033975
J. Buckman, J. Bockstedt, Matthew J. Hashim
We investigate changes to the value that individuals place on the online disclosure of their private information in the presence of multiple privacy factors. We capture individuals' willingness-to-...
Revisiting 'The Governance of Privacy': Contemporary Policy Instruments in Global Perspective
Pub Date : 2017-05-05 DOI: 10.2139/ssrn.2972086
Colin J. Bennett, C. Raab
In the early 2000s, we surveyed and analyzed the global repertoire of policy instruments deployed to protect personal data in "The Governance of Privacy: Policy Instruments in Global Perspective." In this article, we explore how those instruments have changed as a result of 15 years of fundamental transformations in information technologies, and the new digital economy that they have brought in their wake. We review the contemporary range of transnational, regulatory, self-regulatory and technical instruments according to the same framework, and conclude that the types of policy instrument have remained relatively stable, even though they are now deployed on a global scale, rather than in association with particular national legal and administrative traditions. While the labels remain the same, however, the conceptual foundations for their legitimation and justification are shifting as a greater emphasis on accountability, risk, ethics and the social/political value of privacy has gained purchase in the policy community. Our exercise in self-reflection demonstrates both continuity and change within the governance of privacy, and displays how we would have tackled the same research project today. As a broader case study of regulation, it also highlights the importance of going beyond the technical and instrumental labels. The change or stability of policy instruments does not take place in isolation from the wider conceptualizations that shape their meaning, purpose and effect.
Response to the Public Consultation on 'Building a European Data Economy'
Pub Date : 2017-04-25 DOI: 10.2139/SSRN.2958287
Inge Graef, Martin Husovec
With this submission, we would like to respond, in our personal capacity as researchers, to two issues raised by the European Commission's public consultation on 'Building a European Data Economy', namely: the development of a possible future EU framework for data access (section 1), and the issue of portability in the context of non-personal data (section 2). Regarding both issues, our submission aims to convey two main messages:
• A cautious, evidence-based approach should be taken in devising any possible future legislative or non-legislative measures for the European data economy; and
• Since a number of legal fields and policy initiatives overlap in this context, there is a strong need to ensure consistency of policies and coherence of law on the European as well as national level.