
Unarmed Faceoff: Law Enforcement’s Use of Facial Recognition Technology and the Legislators’ Response

Technology permeates nearly every aspect of our lives, and facial recognition technology (“FRT”) has quickly threaded itself into the fabric of society. FRT is a software application that identifies individuals in photos or videos by comparing them to a database of preexisting images of faces.[1] Driver’s license photos, government identification records, mugshots, and pictures posted to social media accounts make up the database of “known” photographs.[2]
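To make the comparison step concrete, the sketch below shows, in simplified Python, how a system of this kind might match a “probe” photo against a gallery of known images. The identity labels, embeddings, similarity threshold, and helper functions are hypothetical placeholders for illustration only; a real FRT system computes face embeddings with a trained model and searches far larger databases.

```python
# Illustrative only: a toy version of the matching step described above.
# A "probe" face embedding is compared against a gallery of embeddings
# derived from known photos (e.g., license or mugshot images), and the
# closest identity above a similarity threshold is returned.
# All names, dimensions, and thresholds below are hypothetical.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, ranging from -1 to 1."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def best_match(probe, gallery, threshold=0.8):
    """Return the gallery identity most similar to the probe,
    or None if no candidate clears the threshold."""
    name, score = max(
        ((n, cosine_similarity(probe, emb)) for n, emb in gallery.items()),
        key=lambda pair: pair[1],
    )
    return name if score >= threshold else None


# Hypothetical 128-dimensional embeddings standing in for enrolled photos.
rng = np.random.default_rng(0)
gallery = {
    "license_00123": rng.normal(size=128),
    "mugshot_00456": rng.normal(size=128),
}
# A noisy re-capture of one enrolled face plays the role of the probe photo.
probe = gallery["mugshot_00456"] + rng.normal(scale=0.05, size=128)
print(best_match(probe, gallery))  # -> "mugshot_00456"
```

Even this toy example makes plain that results hinge on the quality of the enrolled photos and the threshold chosen for declaring a match, the same factors that drive the error rates discussed below.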

FRT makes life more convenient because it verifies identities quickly, automatically, and without contact.[3] For example, FRT allows iPhone users to unlock their phones without typing a passcode, lets airlines board passengers with a facial scan instead of a boarding pass, and offers a quick check-out option at stores that have adopted the technology.[4]

FRT can also provide safety and security, and the technology has proven particularly useful to law enforcement.[5] Police have used it to identify criminals, find missing children, and close cases that had been open for years.[6] In addition, FRT’s ability to quickly identify individuals is especially helpful in fighting petty crime in cities with high crime rates, where police lack the resources to investigate every claim.[7]

However, FRT’s accuracy problems make it a troubling tool for criminal investigations.[8] Humans create the technology, and the software inevitably reflects human biases.[9] In addition, the data sets that FRT algorithms are trained on are less robust for people of color and women, and as a result, such individuals are often misidentified.[10] A study at MIT found that FRT produces an error rate of 0.8% for light-skinned men but 34.7% for dark-skinned women.[11] Some studies have shown that humans are more accurate at identifying individuals than FRT.[12] This is especially problematic given intuition bias: officers are more likely to believe that FRT accurately identified someone, even when their own intuition tells them otherwise.[13] The technology also struggles to accurately identify individuals whose appearance has changed or whose photo was not taken straight-on.[14] The use of such technology has already resulted in false arrests, and there is growing concern that it will continue to do so.[15]

There is also overwhelming concern about the lack of regulation governing how police use FRT. Without legislation, there is no policy or legal threshold that law enforcement must meet before performing a search with the technology.[16] Such unfettered searches raise Fourth Amendment concerns because, under the current state of the law, police can run FRT searches without restraint in hopes of finding grounds for an arrest.[17] In addition, scholars argue that police use of FRT implicates First Amendment issues related to the right to privacy and the freedom to engage in anonymous speech and association.[18]

While the federal government has yet to enact legislation on this issue, cities such as San Francisco, California, and Portland, Oregon, have taken an all-or-nothing approach, completely banning the use of FRT in criminal investigations.[19] Massachusetts, by contrast, became the first state to pass a law regulating facial recognition while still allowing law enforcement to access and use the software in a responsible manner.[20] The law goes into effect in July 2021 and attempts to strike a balance between curbing the harms of the technology and allowing law enforcement to harness the benefits of the tool.[21] Except in emergency situations, the Massachusetts law requires police to obtain a court order from a judge before running a facial recognition search and requires that someone from the state police, the FBI, or the Registry of Motor Vehicles perform the search.[22] Officers can no longer simply download a facial recognition app on their phones and run a search at their discretion.[23] In addition, the legislation establishes a commission to study facial recognition policies and make recommendations, including whether criminal defendants should be told if they were identified using FRT.[24]

There will always be tension between the need for information and the right to privacy when it comes to technology. It is up to the courts and legislators to be aware of the benefits and harms that advancing technology offers and to respond accordingly.


Laura Rann

Laura Rann is a second-year J.D. candidate at Fordham University and a staff member of the Intellectual Property, Media & Entertainment Law Journal. She is also a member of the Dispute Resolution Society, symposium coordinator of the Media & Entertainment Law Society, a 1L Student Advisor, and a Lexis Student Representative. She holds a B.A. in Music from the University of Georgia.