32 Fordham Intell. Prop. Media & Ent. L.J. 922 (2022).
Note by Luke Stark* & Jevan Hutson**
The reanimation of the pseudosciences of physiognomy and phrenology at scale through computer vision and machine learning is a matter of urgent concern. This Article—which contributes to critical data studies, consumer protection law, biometric privacy law, and antidiscrimination law—endeavors to conceptualize and problematize physiognomic artificial intelligence (“AI”) and offer policy recommendations for state and federal lawmakers to forestall its proliferation.
Physiognomic AI, as this Article contends, is the practice of using computer software and related systems to infer or create hierarchies of an individual’s body composition, protected class status, perceived character, capabilities, and future social outcomes based on their physical or behavioral characteristics. Physiognomic and phrenological logics are intrinsic to the technical mechanism of computer vision applied to humans. This Article observes how computer vision is a central vector for physiognomic AI technologies and unpacks how computer vision reanimates physiognomy in conception, form, and practice, as well as the dangers this trend presents for civil liberties.
This Article thus argues for legislative action to forestall and roll back the proliferation of physiognomic AI. To that end, it considers a menu of safeguards and limitations designed to significantly curtail the deployment of physiognomic AI systems, which can be used to strengthen local, state, and federal legislation. This Article foregrounds its policy discussion by proposing the abolition of physiognomic AI. From there, it posits regimes of U.S. consumer protection law, biometric privacy law, and civil rights law as vehicles for rejecting physiognomy’s digital renaissance in AI. First, it contends that physiognomic AI should be categorically rejected as oppressive and unjust. Second, it argues that lawmakers should declare physiognomic AI unfair and deceptive per se. Third, it proposes that lawmakers should enact or expand biometric privacy laws to prohibit physiognomic AI. Fourth, it recommends that lawmakers should prohibit physiognomic AI in places of public accommodation. It also observes the paucity of procedural and managerial regimes of fairness, accountability, and transparency in addressing physiognomic AI and attends to potential counterarguments in support of physiognomic AI.
*Assistant Professor, Faculty of Information and Media Studies, Western University; Affiliate, Center for Law, Innovation, and Creativity, Northeastern University School of Law.
**Independent scholar and technology policy advocate. University of Washington School of Law, J.D. 2020. Formerly Lead Policy Advocate for Facial Recognition & AI at the University of Washington School of Law’s Technology Law and Public Policy Clinic. We are grateful for invaluable input from participants of the workshop “The Return of Anthropometry: Digital Positivism and the Body Politic,” held virtually at The Centre for Space, Place and Society at Wageningen University, Wageningen, NL, August 27, 2020, and of the Northeast Privacy Scholars Workshop, held virtually at the Center for Law, Innovation, and Creativity (CLIC) at Northeastern University School of Law, November 13, 2020, with special thanks to Claudia Haupt for leading our session. We are also indebted to Ryan Calo, Leif Hancox-Li, Woodrow Hartzog, Morgan Klaus Scheuerman, Os Keyes, and Joseph Fridman for their support, insights, and guidance.