{"title":"Partisan Gerrymandering and the Constitutionalization of Statistics","authors":"Jacob Eisler","doi":"10.2139/SSRN.3145191","DOIUrl":null,"url":null,"abstract":"Data analysis has transformed the legal academy and is now poised to do the same to constitutional law. In the latest round of partisan gerrymandering litigation, lower courts have used quantitative tests to define rights violations and strike down legislative districtings across the country. The Supreme Court’s most recent opinion on partisan gerrymandering, Gill v. Whitford, hinted that quantitative tests may yet define the constitutionality of partisan gerrymandering. Statistical thresholds thus could be enshrined as constitutional protections and courts recast as agents of discretionary policy.\r\n\r\nThis Article describes how excessive dependence on metrics transforms judicial decision-making and undermines rights enforcement. Courts enforce constitutional law to ensure governmental compliance with rights, not to advance alternative policy arrangements. Yet the core of rights is moral principle, not descriptive conditions in the world. If quantitative outcomes are used to define rights, the moral character of judicial rights enforcement is undermined, and courts act as quasi-regulatory entities that compete with democratically elected branches. Arguably the most condemned decision of the twentieth century, Lochner, reflected such a quasi-regulatory approach to rights enforcement; excessive reliance on statistics threatens to repeat that mistake.\r\n\r\nThe law of partisan gerrymandering needs a new principle, not new metrics. The best principle to identify partisan gerrymandering is the right to fair representation, which is violated when legislatures seize partisan advantage in democratic process. 
Quantitative analysis should have the sole function of proving that alleged partisan gerrymanders seek such advantage.\r\n\r\nThis Article thus identifies a novel and troubling trend in constitutional law and describes how it dominates a topic of immediate practical importance. It then offers a general framework for conceptualizing rights protection and applies it to this pressing doctrinal issue.","PeriodicalId":81162,"journal":{"name":"Emory law journal","volume":"68 1","pages":"979"},"PeriodicalIF":0.0000,"publicationDate":"2018-03-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Emory law journal","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2139/SSRN.3145191","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
Data analysis has transformed the legal academy and is now poised to do the same to constitutional law. In the latest round of partisan gerrymandering litigation, lower courts have used quantitative tests to define rights violations and strike down legislative districtings across the country. The Supreme Court’s most recent opinion on partisan gerrymandering, Gill v. Whitford, hinted that quantitative tests may yet define the constitutionality of partisan gerrymandering. Statistical thresholds thus could be enshrined as constitutional protections and courts recast as agents of discretionary policy.
This Article describes how excessive dependence on metrics transforms judicial decision-making and undermines rights enforcement. Courts enforce constitutional law to ensure governmental compliance with rights, not to advance alternative policy arrangements. Yet the core of rights is moral principle, not descriptive conditions in the world. If quantitative outcomes are used to define rights, the moral character of judicial rights enforcement is undermined, and courts act as quasi-regulatory entities that compete with democratically elected branches. Arguably the most condemned decision of the twentieth century, Lochner, reflected such a quasi-regulatory approach to rights enforcement; excessive reliance on statistics threatens to repeat that mistake.
The law of partisan gerrymandering needs a new principle, not new metrics. The best principle for identifying partisan gerrymandering is the right to fair representation, which is violated when legislatures seize partisan advantage in the democratic process. Quantitative analysis should have the sole function of proving that alleged partisan gerrymanders seek such advantage.
This Article thus identifies a novel and troubling trend in constitutional law and describes how it dominates a topic of immediate practical importance. It then offers a general framework for conceptualizing rights protection and applies it to this pressing doctrinal issue.