Predictive analytics use a method known as data mining to identify trends, patterns, or relationships among data, which can then be used to develop a predictive model. Data mining itself relies upon big data, which is “big” not solely because of its size but also because its analytical potential is qualitatively different. “Big data” analysis allows organizations, including governments and businesses, to combine diverse digital datasets and then use statistics and other data mining techniques to extract from them both hidden information and surprising correlations. These data do not necessarily track transactional records of atomized behavior, such as customers’ purchasing histories, but rather communication dynamics and social interactions. Employers have long used various tools to monitor workers, whether to track productivity or to guard against improper behavior in the workplace. But as individuals communicate and socialize more and more online, a whole new array of data is becoming available to employers to evaluate job candidates and monitor workers through predictive analytics. Current U.S. privacy law provides almost no protection from the type of “profile” that can be generated through predictive analytics, no matter how personal. It treats any information that is potentially publicly available as not private, regardless of how that “public” information is collected and used. There is, however, one developing privacy theory that could potentially provide protection from predictive analytics: the “mosaic” theory, which recognizes that continuous monitoring of publicly available information can reveal an intimate picture of an individual’s life. Predictive analytics have existed for some time but have only recently “come of age” in employment settings. This article examines the use of predictive analytics in the workplace, the threats to worker privacy that they create, and whether the mosaic theory offers a viable and needed method of privacy protection. It concludes, however, that unless a new theory of privacy protection is adopted, and soon, everyone faces serious threats to their privacy.
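The abstract describes data mining only in general terms. As a purely illustrative sketch of the kind of predictive “profile” model it refers to, the snippet below trains a simple classifier on hypothetical signals mined from public online activity; the features, labels, and use of scikit-learn are assumptions made for illustration, not anything the article specifies.

```python
# Illustrative sketch only: hypothetical features and labels, not the article's method.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical "publicly available" signals an employer might combine per candidate:
# [posts_per_week, late_night_post_ratio, social_network_size_in_hundreds]
X = np.array([
    [25, 0.60, 3.0],
    [ 4, 0.05, 1.2],
    [30, 0.55, 4.5],
    [ 6, 0.10, 0.8],
    [18, 0.40, 2.5],
    [ 3, 0.02, 1.0],
])
# Hypothetical outcome the model is trained to predict (1 = flagged by a past screen).
y = np.array([1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# Scoring a new candidate from the same kind of mined, "public" data.
candidate = np.array([[20, 0.50, 2.0]])
print(model.predict_proba(candidate)[0, 1])  # estimated probability of being flagged
```

The point of the sketch is the one the abstract makes: individually innocuous public signals, once aggregated and mined, yield a predictive judgment about a person that no single data point would support.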
{"title":"Welcome to the Machine: Privacy and Workplace Implications of Predictive Analytics","authors":"R. Sprague","doi":"10.2139/SSRN.2454818","DOIUrl":"https://doi.org/10.2139/SSRN.2454818","url":null,"abstract":"Predictive analytics use a method known as data mining to identify trends, patterns, or relationships among data, which can then be used to develop a predictive model. Data mining itself relies upon big data, which is “big” not solely because of its size but also because its analytical potential is qualitatively different. “Big data” analysis allows organizations, including government and businesses, to combine diverse digital datasets and then use statistics and other data mining techniques to extract from them both hidden information and surprising correlations. These data are not necessarily tracking transactional records of atomized behavior, such as the purchasing history of customers, but keeping track of communication dynamics and social interactions.Employers have long used various tools to monitor workers, whether to track productivity or guard against improper behavior in the workplace. But as individuals communicate and socialize more and more online, a whole new array of data is becoming available to employers to evaluate job candidates and monitor workers through predictive analytics. Current U.S. privacy law provides almost no protection from the type of “profile” that can be generated through predictive analytics, no matter how personal. It considers any information that is potentially publicly available to not be private — regardless of how that “public” information is collected and used. There is, however, one developing privacy theory that could potentially provide privacy protection from predictive analytics: the “mosaic” theory recognizes that continuous monitoring of publicly available information can reveal an intimate picture of an individual’s life.Predictive analytics have existed for some time, but have only recently “come of age” in employment situations. This article examines the use of predictive analytics in the workplace, threats to worker privacy arising from predictive analytics, and whether the mosaic theory offers a viable and needed method of privacy protection. This article concludes, however, that unless a new theory of privacy protection is adopted — and soon — everyone faces serious threats to their privacy.","PeriodicalId":297424,"journal":{"name":"Richmond Journal of Law and Technology","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-04-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133092607","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
One of the most significant contemporary issues in privacy law concerns law enforcement’s new domestic surveillance tool: unmanned aerial vehicles, also known as drones. Law enforcement’s use of aerial surveillance as an investigatory tool is currently under attack. In the past, if law enforcement chose to follow a suspect throughout the day, whether on the ground or in the air, it did not need to seek a warrant or establish probable cause or reasonable suspicion to justify the surveillance. Aerial surveillance of criminal suspects has been considered outside the protections of the Fourth Amendment. In the 1980s, the Supreme Court determined in cases such as California v. Ciraolo and Florida v. Riley that, as long as the public had the same opportunities as law enforcement to view the ground below and the aircraft stayed within public navigable airspace, aerial surveillance did not constitute a Fourth Amendment “search.” Now, law enforcement aerial surveillance is being given a second look. Because of advances in technology, law enforcement need not devote manpower or spend significant agency funds on costly airplane or helicopter flights to satisfy its surveillance needs. Law enforcement is slowly turning toward this new domestic surveillance tool as it begins to explore drone capabilities. Several state, local, and federal law enforcement agencies have acknowledged using unmanned aircraft systems for domestic surveillance. Domestic drone use is bound to increase significantly in the next several years as drones prove cost-effective. Drones have already crept into our domestic, commercial lives. The Federal Aviation Administration is in the process of creating six test ranges and designating airspace for drone flights in order to develop better certification and air traffic standards. Public and private entities have already embraced drone technology, using drones for a variety of purposes, including crop dusting, traffic monitoring, land surveying, and relaying communication signals. To date, eight states have passed legislation that significantly limits law enforcement’s use of drones, and twenty-two states have legislation pending on the matter. Congress is currently considering the Preserving American Privacy Act of 2013, which would require law enforcement to obtain a warrant to conduct drone surveillance of individuals or their property, with specific exceptions for emergencies. The question becomes whether the use of drones as an aerial surveillance tool triggers Fourth Amendment protections. This paper argues that drone use by law enforcement does not constitute a “search” under the Fourth Amendment, that the current legislative proposals clash with the Supreme Court’s current view of domestic aerial surveillance, and that law enforcement should seek a court order similar to the pen register statute under 18 U.S.C. § 2703 to
{"title":"Grounding Drones: Big Brother's Tool Box Needs Regulation Not Elimination","authors":"M. Reid","doi":"10.2139/SSRN.2357657","DOIUrl":"https://doi.org/10.2139/SSRN.2357657","url":null,"abstract":"One of the most significant contemporary issues in privacy law relates to law enforcement’s new domestic surveillance tool: unmanned aerial vehicles, also known as, drones. Law enforcement’s use of aerial surveillance as an investigatory tool is currently under attack. In the past, if law enforcement chose to follow a suspect throughout the day, either on the ground or in the air, they need not worry about seeking a warrant or determining whether probable cause or reasonable suspicion exists to justify their surveillance. Aerial surveillance of criminal suspects has been considered outside the protections of Fourth Amendment law. In the 1980’s, the Supreme Court determined in cases such as California v. Ciraolo and Florida v. Royer that as long as the public had the same opportunities as law enforcement to view the ground below and the aircraft stayed within public navigable airspace, aerial surveillance was possible and not considered a Fourth Amendment “search.” Now, law enforcement aerial surveillance is being given a second look. Because of the advances in technology, law enforcement need not devote manpower or spend significant amounts of agency funds on costly airplane or helicopter rides to satisfy their surveillance needs. Law enforcement is slowly turning towards this new domestic surveillance tool as they begin to explore drone capabilities. Several state, local, and federal law enforcement agencies have admitted to utilizing unmanned aircraft systems as a domestic surveillance tool. The use of drones domestically is bound to increase significantly in the next several years as they prove to be cost-effective. Drones have already crept into our domestic, commercial lives. The Federal Aviation Authority is in the process of creating six test ranges and designating airspace to operate drone flights in order to develop better certification and air traffic standards. Public and private companies have already embraced drone technology as drones are being used for a variety of purposes, to include crop dusting, traffic monitoring, surveying land, and relaying communication signals. To date, eight states have passed legislation that significantly limits law enforcement’s use of drones, and twenty-two states have legislation pending on the matter. Congress is currently considering the Preserving American Privacy Act of 2013 which would require law enforcement to seek a warrant to use a drone to conduct surveillance on individuals or their property with specific exceptions given for emergencies. The question becomes whether the use of drones, as an aerial surveillance tool, triggers Fourth Amendment protections. This paper argues that drone use by law enforcement does not constitute a “search” under the Fourth Amendment, that the current legislative proposals clash with the Supreme Court’s current view on domestic aerial surveillance, and that law enforcement should seek a court order similar to the pen register statute under 18 U.S.C. 
§ 2703 to ","PeriodicalId":297424,"journal":{"name":"Richmond Journal of Law and Technology","volume":"60 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-11-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116519829","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This article describes how digital age technologies and heightened judicial scrutiny are negatively affecting the attorney-client privilege claims of in-house counsel. The article spotlights the factors that are causing technology to negatively impact in-house privilege claims, including a discussion of the historical basis for increased court scrutiny of such claims. The article also examines how the indiscriminate use of email by corporations has invited courts to view those claims with even greater skepticism. The article then explores how such skepticism bodes poorly for organizations as they attempt to protect their internal lawyers’ privilege claims from the transparency that social networks, cloud computing, and BYOD provide into their discussions. Finally, the article offers some practical suggestions that can help mitigate the impact of these technological challenges.
{"title":"Inviting Scrutiny: How Technologies are Eroding the Attorney-Client Privilege","authors":"Philip J. Favro","doi":"10.2139/ssrn.2255206","DOIUrl":"https://doi.org/10.2139/ssrn.2255206","url":null,"abstract":"This article describes how digital age technologies and heightened judicial scrutiny are negatively affecting the attorney-client privilege claims of in-house counsel. The article spotlight the factors that are causing technology to negatively impact in-house privilege claims. This includes a discussion regarding the historical basis for increased court scrutiny of such claims. The article also examines how the indiscriminate use of email by corporations has invited the courts to view those claims with even greater skepticism. The article then explores how such skepticism bodes poorly for organizations as they attempt to protect their internal lawyers’ claims from the transparency that social networks, cloud computing, and BYOD provide into their discussions. Finally, the article offers some practical suggestions that can help mitigate the impact of these technological challenges.","PeriodicalId":297424,"journal":{"name":"Richmond Journal of Law and Technology","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125371748","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This article explores the challenges of instilling a sense of academic integrity in a community of students who grew up in a cut-and-paste electronic environment. It advocates the adoption and use of a straightforward plagiarism definition without an intent element. Creating a clear understanding of what plagiarism is and how it can be avoided is necessary to prevent the harms that result from incidents of plagiarism, both to individuals and to the academic community. To achieve this goal, the article proposes teaching ten rules for avoiding plagiarism in order to nurture a community of academic trust.
{"title":"Plagiarism in Cyberspace: Learning the Rules of Recycling Content with a View Towards Nurturing Academic Trust in an Electronic World","authors":"D. Gerhardt","doi":"10.2139/SSRN.1932386","DOIUrl":"https://doi.org/10.2139/SSRN.1932386","url":null,"abstract":"This article explores the challenges of instilling a sense of academic integrity among a community of students who grew up in a cut and paste electronic environment. It advocates the adoption and use of a straight forward plagiarism definition without an intent element. Creating a clear understanding of what plagiarism is and how can it can be avoided is necessary to avoid the harms that result from incidents of plagiarism - both to individuals and to the academic community. To achieve this goal, the article proposes teaching ten rules for avoiding plagiarism in order to nurture a community of academic trust.","PeriodicalId":297424,"journal":{"name":"Richmond Journal of Law and Technology","volume":"52 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130525702","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
and is a licensed attorney in Georgia. She has served as the Atlanta bureau chief for The Internet Law Journal and has worked as a research assistant for the Center for Social and Legal Research, a non-profit organization focused on privacy issues.
{"title":"In Search of a Balance Between Police Power and Privacy in the Cybercrime Treaty","authors":"D. C. Kennedy","doi":"10.4324/9781315095493-9","DOIUrl":"https://doi.org/10.4324/9781315095493-9","url":null,"abstract":"and is a licensed attorney in Georgia. She has served as the Atlanta bureau chief for The Internet Law Journal and has worked as a research assistant for the Center for Social and Legal Research, a non-profit organization focused on privacy issues.","PeriodicalId":297424,"journal":{"name":"Richmond Journal of Law and Technology","volume":"75 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122830745","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}