Privacy scholarship to date has failed to consider a new development in the commercial privacy landscape. Data brokers have begun to sell data products to individual consumers interested in tracking the activities of love interests, professional contacts, and other people of interest. This practice creates an avenue for a new type of privacy harm — “insider control” — which privacy scholarship has yet to recognize. U.S. privacy laws fail to protect consumers from the possibility of insider control. Apart from two noteworthy frameworks that might offer paths forward, none of the viable reforms offered by privacy scholars would meaningfully limit consumers’ vulnerability. This Note proposes changes to existing privacy doctrines in order to reduce consumers’ exposure to this new harm.
Theodore Rostow, "What Happens When an Acquaintance Buys Your Data?: A New Privacy Harm in the Age of Data Brokers," Information Privacy Law eJournal, DOI: 10.2139/SSRN.2870044.
Not all digital fine print exculpates liability: some exhorts users to perform before the consumer relationship has soured. We promise to choose strong passwords (and hold them private); to behave civilly on social networks; to refrain from streaming shows and sports; and to avoid reverse-engineering code (or, worse, deploying deadly bots). In short: consumers are apparently regulated by digital fine print, though it’s universally assumed that we don’t read it, and that even if we did, we’ll never be sued for failing to perform. On reflection, this ordinary phenomenon is perplexing. Why would firms persist in deploying uncommunicative behavioral spurs? The conventional answer is that fine print acts as an option, drafted by dull, monopolist lawyers. Through investigation of several sharing economy firms, and discussions with a variety of lawyers in this space, I show that this account is incomplete. Indeed, I identify and explore examples of innovative fine print that appears to really communicate with and manage users. These firms have cajoled using contracts by trading on their brands and identities, and by giving up certain exculpatory defenses common to digital agreements. I argue that the result is a new form of relational contracting, taking on attributes of both mass market adhesion contracts and more long-term deals.
David Hoffman, "Relational Contracts of Adhesion," Information Privacy Law eJournal, DOI: 10.2139/SSRN.3008687.
This article discusses the relationship between privacy and confidentiality; the different forms of privacy, including informational privacy; the contrast with confidentiality as the ownership of trade secrets; privacy compared with defamation; publicity rights and merchandising as the ownership of image; the function of a trade mark and merchandising through trade marks; and the ownership of intangibles.
Peter Jaffey, "Privacy, Confidentiality and Property," Information Privacy Law eJournal, DOI: 10.2139/ssrn.3801736.
This Case Digest summarizes the 2011 California Supreme Court case of Pineda v. Williams-Sonoma Stores, Inc. Plaintiff Pineda alleged, in part, that Williams-Sonoma violated the Song-Beverly Credit Card Act of 1971 by recording consumers' ZIP codes during credit card transactions. The Act places limitations upon the type and extent of information retailers — as well as other persons and businesses — may request of consumers using credit cards, and restricts the use of a purchaser's "personal identification information." The trial court held that a ZIP code did not constitute "personal identification information" as used in section 1747.08 of the California Civil Code. After the court of appeal affirmed, the Supreme Court of California granted review, reversed the trial court's holding, and remanded the case for further proceedings. The court held that "personal identification information," as the term is used in section 1747.08, includes a cardholder's ZIP code. As such, requesting and recording a cardholder's ZIP code, without more, violates the Credit Card Act.
Matthew Adam Susson, "Case Digest: Pineda v. Williams-Sonoma Stores, Inc.," Information Privacy Law eJournal, DOI: 10.2139/SSRN.2027703.
J. Francis and L. Francis, "Privacy, Employment, and Dignity," Information Privacy Law eJournal, DOI: 10.1007/978-3-319-74639-5_14. (No abstract available.)
Public bodies and agencies increasingly seek to use new forms of data analysis in order to provide 'better public services'. These reforms have consisted of digital service transformations generally aimed at 'improving the experience of the citizen', 'making government more efficient' and 'boosting business and the wider economy'. More recently, however, there has been a push to use administrative data to build algorithmic models, often using machine learning, to help make day-to-day operational decisions in the management and delivery of public services rather than providing general policy evidence. This chapter asks several questions relating to this. What are the drivers of these new approaches? Is public sector machine learning a smooth continuation of e-Government, or does it pose a fundamentally different challenge to practices of public administration? And how are public management decisions and practices at different levels enacted when machine learning solutions are implemented in the public sector? Focussing on different levels of government — the macro, the meso, and the 'street-level' — we map out and analyse the current efforts to frame and standardise machine learning in the public sector, noting that they raise several concerns around the skills, capacities, processes and practices governments currently employ. The forms these take are likely to have value-laden, political consequences worthy of significant scholarly attention.
Michael Veale and I. Brass, "Administration by Algorithm? Public Management Meets Public Sector Machine Learning," Information Privacy Law eJournal, DOI: 10.31235/osf.io/mwhnb.
This paper builds on original work undertaken by a team of researchers studying Privacy Impact Assessments (PIAs), defined as systematic risk assessment tools that can be usefully integrated into decision-making processes. The team was commissioned by the UK Information Commissioner’s Office (ICO) in June 2007 to develop a study of PIAs in overseas jurisdictions and a handbook to guide UK organisations through the PIA process. This research has subsequently attracted interest in the UK and overseas, and PIAs are now mandatory for all UK central government departments. This paper traces how the development of the project team’s PIA methodology, and subsequent user experiences, led to a key project output: the PIA handbook. The handbook has become a significant part of the privacy ‘toolkit’ and has had an impact on public policy. Some important lessons from PIAs conducted in the UK and overseas are identified. Finally, areas for further development are outlined.
A. Warren, R. Bayley, Colin J. Bennett, A. Charlesworth, R. Clarke, and C. Oppenheim, "Privacy Impact Assessments: The UK Experience," Information Privacy Law eJournal, DOI: 10.2139/SSRN.1606762.