New Regulations on AI
The Federal Trade Commission has recently adopted a new rule designed to combat misleading practices in online reviews and testimonials.[1] The rule covers six categories of conduct the FTC considers deceptive.[2] One provision targets commercial advertisers who exaggerate their influence by paying for bots to inflate follower counts.[3] This section addresses fake social media indicators, such as followers or views artificially generated through bots or hijacked accounts,[4] and prohibits businesses from inflating their social media influence to deceive consumers about their popularity or credibility for a marketing purpose.[5] Specifically, the rule bans the purchase or sale of such indicators when the purchaser knew or should have known that they were fake.[6]
Speaking on the new rule, FTC Chair Lina Khan said: “Fake reviews not only waste people’s time and money, but also pollute the marketplace and divert business away from honest competitors. By strengthening the FTC’s toolkit to fight deceptive advertising, the final rule will protect Americans from getting cheated, put businesses that unlawfully game the system on notice, and promote markets that are fair, honest, and competitive.”[7] The final rule was announced on August 14, 2024.[8]
Meanwhile, the Federal Communications Commission has been considering new rules on the use of AI in political advertising.[9] The agency recently advanced a proposal that would require political advertisers to disclose their use of artificial intelligence in broadcast television and radio ads.[10] The concern is that AI can be used to mislead voters: “There’s too much potential for AI to manipulate voices and images in political advertising to do nothing,” the agency’s chairwoman, Democrat Jessica Rosenworcel, said. “If a candidate or issue campaign used AI to create an ad, the public has a right to know.”[11] AI has already been used in political ads in the United States and throughout the world.[12] While some political parties and candidates have disclosed their use of these tools, others have used the technology to mislead voters.[13] For example, earlier this year, a political consultant used AI-generated robocalls to imitate President Joe Biden’s voice, falsely telling voters in New Hampshire ahead of the state’s primary that voting in the primary would preclude them from voting in November.[14] The consultant ultimately faced a $6 million fine and more than two dozen criminal charges.[15]
To further the goal of greater disclosure, the FCC proposal would require broadcasters to ask political advertisers whether their material was made using AI tools, such as text-to-image creators or voice-cloning software.[16] It would also require broadcasters to make an on-air announcement informing viewers that an ad contains AI-generated content and to include a notice disclosing the use of AI in their online political files.[17] However, the agency would have no authority over digital or streaming platforms, leaving ads on those platforms unregulated at the federal level.[18]
These proposed rules have, however, encountered opposition along partisan lines. Republican Sean Cooksey, the chairman of the Federal Election Commission, has argued that such regulation falls within the scope of the FEC’s work and would usurp its authority.[19] After the proposed rules were announced, he wrote to the Democratic Chairwoman of the FCC, “I am concerned that parts of your proposal would fall within the exclusive jurisdiction” of the FEC and would “directly conflict with existing law and regulations, and sow chaos among political campaigns for the upcoming election.”[20]
Perhaps owing to this partisan divide, Congress, where party control is currently split between the House and Senate, has not passed legislation directing agencies on how to regulate the use of AI in politics.[21] Still, more than one-third of states have passed their own regulations on the use of AI in campaigns and elections.[22]
Footnotes