ISIS and the “Free and Open” Internet

On October 3, 2022, the Supreme Court agreed to hear Gonzalez v. Google LLC.[1] The case represents the first real consideration by the Court of the issues presented by the liability exceptions in Section 230 of the Communications Decency Act and the recommendation systems of the world’s most popular social media platforms – particularly YouTube, Twitter, and Facebook.

Section 230 exempts certain internet companies from liability for third-party content posted to their websites. Specifically, Section 230 provides that no “provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”[2] Section 230 also contains a safe harbor provision allowing companies to engage in what might otherwise be considered moderation or publishing functions – for example, any action taken to restrict access to pornographic material.[3]

Section 230’s exceptions were designed to serve two functions: first, to preserve the dynamism of the internet by ensuring that internet content providers would not be litigated out of existence; and second, to encourage platforms to moderate themselves without the risk of their parent companies being held liable as creators or editors or losing their status as neutral publishers, all while securing those companies’ profit models.[4]

The 9th Circuit has developed caselaw on the meaning and application of Section 230. In a pivotal case, Barnes v. Yahoo!, Inc., the 9th Circuit Court of Appeals explained that for a defendant to be entitled to immunity under Section 230, a court must ask whether the cause of action alleged by the plaintiff requires the court to “treat the defendant as the ‘publisher or speaker’ of content provided by another.”[5] A court must “ask whether the duty that the plaintiff alleges the defendant violated derives from the defendant’s status or conduct as a ‘publisher or speaker’—if it does, Section 230(c)(1) precludes liability.”[6]

Thus, whether Section 230 precludes liability has turned on whether the conduct underlying the plaintiff’s cause of action is conduct that would normally be undertaken by, or associated with, a publisher. If it is, Section 230 precludes liability, because it prevents interactive service providers from being treated as “publishers” of any content provided by a third party.

Since the implementation of Section 230 and the proliferation of social media companies, some of the largest and most successful of these companies have been sued by family members of victims of terrorist attacks under the Anti-Terrorism Act (“ATA”) for facilitating and providing assistance to the organizations that carried out those attacks.[7] Those claims arose out of the presence of terrorist organizations such as the Islamic State on the social media platforms, and those organizations’ extensive use of social media for recruitment, propaganda, and publicity. In Force v. Facebook, the family of Taylor Force, an American killed in a terrorist attack in Israel, sued Facebook under the ATA, alleging that Facebook had provided assistance to Hamas in carrying out the attack.[8] The plaintiffs argued that their claims did not treat Facebook as the “publisher” of Hamas’s social media activity on the site – rather, their claims arose out of Facebook’s own conduct (what it did with Hamas’s content), activity which is not protected by Section 230. In their view, Facebook’s “targeted communication” and facilitation of “terror networking” make it more than a “mere publisher”: its system “analyzes [a user’s] post and uses its proprietary algorithms to link the [user] with other Facebook users who are interested in similar topics…”[9]

A very similar set of facts underlies Gonzalez v. Google.[10] The families of Nohemi Gonzalez, Nawras Alassaf, Sierra Clayborn, Tin Nguyen, and Nicholas Thalasinos sued Google, Facebook, and Twitter under the ATA, seeking to recover damages for injuries suffered “by reason of an act of international terrorism.”[11] The victims were killed in separate acts of terrorism committed by the Islamic State (“ISIS”).[12] Under a theory of liability similar to that in Force v. Facebook, “plaintiffs allege that Google, Twitter, and Facebook are directly and secondarily liable for the five murders at issue in these cases… The complaints allege that defendants’ social media platforms allowed ISIS to post videos and other content to communicate the [group’s] message, to radicalize new recruits, and to generally further its mission.”[13]

The 9th Circuit ultimately ruled in favor of the defendant-appellees, rejecting the appellants’ argument that YouTube’s targeted recommendation mechanisms – steering users toward certain kinds of content based on their viewing history – and other so-called “matchmaking” functions made Google, the owner of YouTube, a “material contributor” to ISIS’s content, and therefore not a publisher but, in part, a creator of that content.[14] Instead, the 9th Circuit pointed to its prior case law, emphasizing that a “website is not transformed into a content creator or developer by virtue of supplying ‘neutral tools’ that deliver content in response to user inputs.”[15]
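To make the disputed distinction concrete, the following is a deliberately simplified, hypothetical sketch of how a content-based recommendation function might work in principle. It is not the actual system of YouTube, Facebook, or Twitter, and every name and parameter in it is illustrative; the question the parties contest is whether a mechanism like this is a “neutral tool” responding to user inputs or a targeted recommendation that contributes to the reach of third-party content.

```python
# Hypothetical illustration only – not the recommendation system of any real platform.
# A toy "content-based" recommender: rank candidate items by how closely their
# topic tags overlap with the tags of items a user has already viewed.
from collections import Counter

def recommend(viewing_history: list[set[str]], candidates: dict[str, set[str]], top_n: int = 3) -> list[str]:
    # Build an interest profile by counting how often each tag appears in the user's history.
    profile = Counter(tag for item_tags in viewing_history for tag in item_tags)
    # Score each candidate item by the weight of its tags in that profile.
    scores = {item_id: sum(profile[tag] for tag in tags) for item_id, tags in candidates.items()}
    # Return the highest-scoring items first.
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# A user whose history clusters around one topic is steered toward more of the same,
# whatever that topic happens to be.
history = [{"topic_a", "topic_b"}, {"topic_a"}]
catalog = {"video1": {"topic_a"}, "video2": {"topic_c"}, "video3": {"topic_a", "topic_b"}}
print(recommend(history, catalog))  # ['video3', 'video1', 'video2']
```

Whether the law should treat that sorting step as the neutral delivery of another’s content or as the platform’s own conduct is precisely what divides the majority opinions from the dissents discussed below.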

In both Gonzalez and Force, spirited dissents accompanied the circuit decisions, with Circuit Judge Gould, in his dissent in Gonzalez, appending the entirety of Second Circuit Judge Katzmann’s dissent in Force to his own opinion.[16] Both dissents feature powerful arguments for reconsidering their respective circuits’ positions on Section 230 immunity for social media companies. In a particularly incisive comment, Judge Katzmann argues that “it strains the English language to say that in targeting and recommending these writers to users – and thereby forging connections, developing new social networks – Facebook is acting as ‘the publisher of…information provided by another information content provider.’”[17] It is precisely this conduct that so concerns Judge Gould in Gonzalez, and this conduct is the focus of the question presented to the Supreme Court: “Does section 230(c)(1) immunize interactive computer services when they make targeted recommendations of information provided by another information content provider, or only limit the liability of interactive computer services when they engage in traditional editorial functions…?”[18]

Once the Supreme Court grants certiorari in a given case, there is a very high likelihood that the lower court’s decision will be reversed – since 2007, 71 percent of Supreme Court cases have ended in the reversal of a lower court’s decision.[19] In the October 2021 term, the Court reversed 54 lower court decisions and affirmed only 12, a reversal rate of roughly 82 percent.[20] The Ninth Circuit fares especially poorly – in the October 2021 term, every case the Court accepted from that circuit was reversed.[21] Given these rates of reversal, it is more than likely that we will see some change in Section 230 doctrine – whether the decision results in a narrow or a broad disruption remains to be seen.

It should be noted that a narrowing of Section 230 would not necessarily make the existence of interactive service providers an impossibility, though it could substantially impact these companies’ current profit models. Section 230 would still, as it was meant to, protect the function of neutral publishers of internet content. What could change are the recommendation schemes – the content curation that has become a staple of Google, Facebook, Twitter, and YouTube – as Judge Gould emphasizes in his dissent in Gonzalez.[22] A narrowing could also impact the data collection schemes these companies use to make their targeted advertising more effective. There is no reason why the internet, as it looks today, must be this way – there are no immutable laws, whether of the market or otherwise, that make the current internet landscape the optimum. Indeed, this landscape was molded and shaped by political choices, and it can therefore be molded and shaped in a different direction – perhaps as soon as this Supreme Court term.

Aaron Bondar

Aaron Bondar is a second-year J.D. candidate at Fordham University School of Law. He graduated from Binghamton University with a B.A. in Economics. Aaron is a Staff Member on the Intellectual Property, Media, & Entertainment Law Journal.