
…Ready For It? Artificial Intelligence as a Political Weapon

In August 2024, former President Donald J. Trump shared numerous images on his social media site, Truth Social, that appeared to show pop star Taylor Swift endorsing his campaign in the 2024 presidential election.[1] Among the photos is one depicting Swift as Uncle Sam, accompanied by text reading “Taylor wants you to vote for Donald Trump,” and others showing Swift wearing a “Swifties for Trump” T-shirt.[2] Trump captioned the photos “I accept!” signaling his acceptance of Swift’s purported endorsement.[3]

Online sleuths quickly uncovered that these photos were “deepfakes,” AI-generated images that had been convincingly altered and manipulated to misrepresent Swift’s political allegiance.[4] The images seemingly prompted Swift to publicly endorse Vice President Kamala Harris on Instagram: “Recently I was made aware that AI of ‘me’ falsely endorsing Donald Trump’s presidential run was posted to his site. It really conjured up my fears around AI, and the dangers of spreading misinformation,” Swift wrote. “It brought me to the conclusion that I need to be very transparent about my actual plans for this election as a voter. The simplest way to combat misinformation is with the truth.”[5]

Swift’s experience is only one example of the dangers posed by AI-generated misinformation and other deepfake images and videos. Concern about the use of false images and videos as political weapons began building among AI researchers as early as 2020, when an edited video of House Speaker Nancy Pelosi with slowed and slurred speech drew more than 3 million views.[6] AI only amplifies the risk of such attacks by allowing fake information to be created with “greater speed and sophistication” at lower cost.[7]

Increasingly sophisticated AI models have caused concern to rise in recent months, especially given the lack of legal protection candidates have against AI-generated misinformation. Neither federal election law nor its regulations directly address the use of deepfakes as political weapons, and libel suits over political speech are rare and seldom successful.[8] While a majority of states have considered or passed measures intended to reduce the risk of harm caused by AI-generated misinformation, states generally address the issue with disclaimer requirements rather than prohibitions.[9] Inadequate legislation shifts the burden onto election officials and voters themselves to combat the effects of misinformation on the 2024 election.[10]

State election officials in political battleground states say they are preparing for the potential threat posed by AI.[11] Arizona officials have spent the last six months running a series of tabletop exercises, confronting hypothetical scenarios in which AI creates or facilitates disruptions on Election Day.[12] The fictional cases included deepfake videos and voice-cloning technologies deployed across social media in attempts to dissuade people from voting or to confuse poll workers as they handled ballots.[13]

Lucas Hanson, co-founder of CivAI, a nonprofit group tracking the use of AI in politics, said the “primary targets” will be swing states and, in particular, swing voters.[14] “An even bigger [threat] potentially is trying to manipulate voter turnout, which in some ways is easier than trying to get people to actually change their mind,” he said. “Whether or not that shows up in this particular election it’s hard to know for sure, but the technology is there.”[15]


Sophie Ashley

Sophie Ashley is a second-year J.D. candidate at Fordham University School of Law and a staff member of the Intellectual Property, Media & Entertainment Law Journal. She holds a B.A. from Vanderbilt University, with a major in English Literature and a minor in Musicology.