Online Safety vs. Free Expression: The Legal Tensions Surrounding the Kids Online Safety Act

In July 2024, the Senate passed the Kids Online Safety Act (KOSA), introducing a federal duty of care for online platforms likely to be used by minors under 17.[1] The Act requires covered platforms to “exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate” a set of “harms to minors.”[2] These harms include mental health issues (such as anxiety and depression), addiction-like behaviors, physical violence, bullying, sexual exploitation, promotion of harmful substances or activities (such as drugs and gambling), and predatory or deceptive marketing practices.[3] The last federal legislation aimed at protecting children online was passed in 1998.[4] Supporters argue that KOSA will offer greater privacy protections and guard young people against the harmful mental health effects of various online platforms.[5] Some advocacy groups, however, warn that KOSA could face constitutional challenges, suggesting the duty of care may lead to censorship.[6]

Proponents of KOSA contend that there is an urgent need to protect minors from harmful design and business practices that worsen youth mental health issues.[7] Concerns about social media’s impact on young people are growing; for instance, in 2023, the U.S. Surgeon General issued a public health advisory on social media and youth mental health.[8] While he noted both positive and negative effects of social media use, he emphasized the lack of sufficient evidence on its safety for minors and urged policymakers and tech companies to make social media safer.[9] Similarly, a growing number of lawsuits allege that social media platforms use designs that steer young users toward harmful content.[10] Most recently, a coalition of fourteen state attorneys general sued TikTok, its parent company ByteDance, and affiliates in their respective state courts, alleging that TikTok created addictive features to maximize profit at the expense of its minor users’ mental health.[11]

These concerns set the stage for the introduction of KOSA. On his website, Senator Richard Blumenthal (D-Conn.), who introduced the bill with Senator Marsha Blackburn (R-Tenn.), states, “When your child is online, they are the product, and Big Tech is trying every method possible to keep them scrolling, clicking ads, and sharing every detail of their life.”[12] The website further explains that KOSA would require the covered platforms to engage in safer design processes and prioritize minors’ well-being by creating a “safe by default” online environment.[13]

While there is general agreement on the need to strengthen online protections for minors, some advocacy groups contend that the duty of care provision could face constitutional challenges.[14] Some warn that the duty could lead to censorship because it requires covered platforms to determine what content might be “harmful” to minors in order to mitigate harm through their design features.[15] Others warn that automated moderation tools cannot accurately identify content that may cause harm under KOSA, leading platforms to over-censor.[16] Censorship, in turn, may trigger First Amendment scrutiny.[17]

In Moody v. NetChoice, the Supreme Court held that content moderation is an editorial judgment protected by the First Amendment.[18] In NetChoice v. Bonta, the Ninth Circuit found that the Data Protection Impact Assessment (DPIA) requirement of California’s Age-Appropriate Design Code Act likely violates the First Amendment because it compels companies to submit reports to the state attorney general identifying design features that may harm children, along with plans to mitigate those harms, which may lead to censorship.[19] The DPIA report requires covered online businesses to identify any risk of “material detriment to children” arising from their data practices for each service likely to be accessed by minors under 18.[20]

Accordingly, critics argue that KOSA’s duty of care provision, like the DPIA report, requires companies to form opinions about what may be harmful to minors.[21] Because the duty of care provision affects content moderation, Moody suggests that it will likely face First Amendment scrutiny, requiring the government to prove a compelling interest and to show that the provision is narrowly tailored to advance that interest.[22] Critics warn that, given its similarities to the DPIA report, KOSA’s duty provision may not survive such scrutiny.[23]

Additionally, critics caution that platforms may be incentivized to censor content they believe the government views as harmful to minors in order to minimize legal risk.[24] This could lead to the suppression of content that is politically controversial but nonetheless important for minors’ safety and health, such as information on LGBTQ+ issues or reproductive rights.[25]

A censorship-based approach, particularly if platforms are vulnerable to political influence as some advocacy groups warn, could conflict with a core principle of the United Nations Convention on the Rights of the Child (“UNCRC”): that children are active rights holders, not passive subjects of the law.[26] Notably, the United States is the only U.N. member state that has not ratified the UNCRC. The UNCRC defines a child as an individual under the age of 18, unless majority is attained earlier under the laws of the applicable jurisdiction.[27] Article 12 of the Convention affirms children’s right to hold their own opinions and to have those opinions heard and taken seriously.[28] Article 13 further affirms children’s right to seek and receive information.[29] As Laura Lundy, a professor of children’s rights and law, notes, these rights are “not dependent upon their capacity to express a mature view.”[30] If minors’ access to content is restricted based on political pressures or companies’ desire to avoid legal risk, rather than on minors’ best interests, it could undermine KOSA’s goal of safeguarding young people online. Such a censorship-based approach risks shifting the focus from children’s right to safety to using that right as a proxy for political battles, potentially causing more harm.

In light of rising concerns over youth mental health and online safety, KOSA aims to hold platforms accountable for creating a safer digital environment for minors. However, critics argue that its duty of care provision may infringe on First Amendment rights and risk over-censorship, potentially limiting minors’ access to essential information on sensitive topics. As debates continue, balancing the protection of children’s rights to safety, access to information, and freedom of expression remains crucial to crafting effective, constitutionally sound legislation.


Gia Kim

Gia Kim is a second-year J.D. candidate at Fordham University School of Law and a staff member of the Intellectual Property, Media & Entertainment Law Journal. She holds a B.A. in Computer Science-Mathematics from Columbia University.