Our legal analysis of the Kids Online Safety Act

by Haley Hinkle, Fairplay Policy Counsel

Fairplay has been a vocal champion of the Kids Online Safety Act (KOSA) because of the many ways the bill will create a safer, less manipulative internet for kids and teens. Critics have highlighted two primary concerns with the bill: that it enables widespread censorship, and that it will lead to age verification and identification checks across the internet. Today, we want to share our analysis of these issues and explain why we believe both outcomes are highly unlikely.

Censorship

Critics of KOSA have argued that the bill’s Duty of Care will lead to widespread content censorship online, including censorship of content from and about the LGBTQ+ community. These critics contend that in order to comply with KOSA and avoid lawsuits from state attorneys general, companies will remove entire categories of content from their platforms. One critic has gone so far as to claim, “If KOSA passes, all the benefits of accurate health information, social networking, community, and access to help in a questioning or crisis situation will be eliminated for all LGBTQ youth.”1

Contrary to such hyperbolic claims, the text of the bill does not create a basis for content censorship, as we have previously outlined.2 The Duty of Care in Section 3 requires covered platforms to examine the impact of the design and operation of any product, service, or feature on specific, enumerated harms. Covered platforms are required to take reasonable measures to prevent and mitigate those harms. The listed harms are clearly defined and connected to the demonstrated harms children and teens face online every day. Section 3 also includes a limiting principle that says nothing in the Duty of Care shall be construed to prevent “any minor from deliberately and independently searching for, or specifically requesting, content.” The bill text is clear: a platform’s obligations under the Duty of Care are not about any individual piece of content’s existence or removal.

Critics have not provided a basis for their assertion that platforms will comply with KOSA’s Duty of Care requirements by engaging in widespread censorship. There are countless examples of design choices and product offerings that would be subject to scrutiny under KOSA. Products such as Snapchat’s and Meta’s AI chatbots, which target young users in order to maximize engagement,3 and the expansion of Meta’s Horizon Worlds VR to teenagers (and now preteens)4 would have to be assessed for their potential impact on kids and teens before launch, rather than after young users have already been harmed. Further, Big Tech would be required to assess the impact of design features such as endless scroll and autoplay, as well as content recommendation algorithms, and to mitigate the impact of those features on the enumerated harms.5

Should an attorney general ignore the text and purpose of the bill and attempt to censor categories of information using KOSA anyway, such a claim would face substantial opposition in court. Critics contend that a state attorney general could argue, for example, that LGBTQ+ content is associated with anxiety or depression in violation of the Duty of Care, and should therefore be removed from a covered platform entirely. First, the overwhelming weight of scientific research demonstrates a positive association between access to gender- and identity-affirming healthcare and LGBTQ+ minors’ mental health.6 Second, court precedent demonstrates that such a claim would not survive under the First Amendment and Section 230 of the Communications Decency Act. Federal courts consistently subject both child safety and technology laws and regulations to rigorous scrutiny under the First Amendment. See, e.g., Junior Sports Magazines, Inc. et al. v. Bonta, No. 22-56090 (9th Cir. Sept. 13, 2023); NetChoice, LLC v. Griffin, Case No. 5:23-cv-05105 (W.D. Ark. Aug. 31, 2023); Free Speech Coalition, Inc., et al. v. Colmenero, Case No. 1:23-cv-917 (W.D. Tex. Aug. 31, 2023). In addition, under the plain text of Section 230 of the Communications Decency Act, a company cannot be held liable as the publisher of user-generated content published on its platforms by a third party. 47 U.S.C. § 230(c).

Let us be clear: The First Amendment and Section 230 do not provide blanket protections for online platforms’ design choices, nor their collection, retention, and use of user data. We have argued – and will continue to argue – that there is a distinction between corporate liability for the existence of a piece of user-generated content, and liability for the data practices, design choices, and operations that affect young users’ online experiences. See Brief for Fairplay et al. as Amici Curiae in Support of the Defendant, NetChoice, LLC v. Bonta, 5:22-cv-08861 (2023); Brief for Fairplay as Amicus Curiae in Support of the Petitioner, Gonzalez v. Google, 598 US __ (2023). We strongly believe online platforms should be held responsible for the impact their business decisions have on young users, and KOSA is a reflection of that approach. But the First Amendment and Section 230 do provide critical free speech protections online, including protections against officials who may attempt to censor content or silence a community.

Age Verification

Critics have also argued that KOSA will result in widespread age-gating of the internet and will lead to new privacy-invasive verification measures, such as providing government IDs in order to access social media. In reality, KOSA does not impose any age verification requirements on covered platforms. The bill sets forth requirements for an age verification study and report in Section 9, but that section does not contain any requirements for implementation. In fact, KOSA includes an explicit Rule of Construction making clear that neither age gating nor the collection of additional information to verify age is required. Section 15 of KOSA provides:

    (c) Protections for Privacy.—Nothing in this Act shall be construed to require—
    (1) the affirmative collection of any personal data with respect to the age of users that a covered platform is not already collecting in the normal course of business; or
    (2) a covered platform to implement an age gating or age verification functionality.

Despite this explicit language, some KOSA critics claim that platforms will have to collect more data in order to determine which users are minors and are therefore entitled to additional protections. The Center for Democracy and Technology sums up the concern:

    Compared to the version of the bill debated last Congress, new language in Sec. 14(b)(2) of KOSA expressly notes that nothing in the bill should be interpreted as mandating age verification functionality. But this is essentially meaningless if the very nature of the bill requires online services to treat minors differently from adult users. Doing so would require online services to know the ages of their users, adults and children alike.7

It is true that providing additional protections for minors requires distinguishing those minors from adults, but KOSA does not require platforms to establish with certainty who is and is not a minor. The bill does not contain a strict liability approach or any other standard that would require platforms to seek users’ ages. Instead, KOSA holds platforms responsible for the users they know are minors, and it defines “knows” as “hav[ing] actual knowledge or knowledge fairly implied on the basis of objective circumstances.” This knowledge standard is substantially similar to the one proposed in 2022 in the American Data Privacy and Protection Act (ADPPA),8 a comprehensive privacy bill endorsed by Fairplay that treats adults and minors differently, and that many of KOSA’s critics (including the Center for Democracy and Technology and Fight for the Future) cite as their preferred policy approach. Both KOSA and ADPPA rightly recognize that social media platforms already have more than enough data about their users to make reasonable inferences about age. In fact, these platforms already make such inferences in order to target content to users.9

Further, ADPPA and KOSA have near-identical rules of construction regarding the age of users and data collection. Section 205 of ADPPA says that the bill’s knowledge requirements

    shall not be construed to require the affirmative collection or processing of any data with respect to the age of an individual or a proxy thereof, or to require that a covered entity implement an age gating regime. Rather, the determination of whether an individual is under 17 shall be based on the covered data collected directly from an individual or a proxy thereof that the covered entity would otherwise collect in the normal course of business.

In short, when evaluating bills they support, even KOSA’s critics acknowledge that it is possible to provide additional protections for kids without resorting to privacy-invasive age-gating. The Kids Online Safety Act will create critical new protections for young people without requiring the collection of any additional information in order to verify age.

Conclusion

The text of KOSA – and in particular, its limiting provisions – clearly demonstrates the purpose of the legislation: holding online platforms accountable for the business and design decisions they make every day to put profit over kids’ and teens’ wellbeing. That is why we stand by this text and why we continue to urge the Senate to vote to pass KOSA and usher in a safer, healthier digital world for young people.

  1. Fight for the Future, Letter: Civil rights groups reaffirm opposition to KOSA, emphasize continued threat to LGBTQ youth (June 29, 2023), https://www.fightforthefuture.org/news/2023-06-29-letter-civil-rights-groups-reaffirm-opposition-to-kosa-emphasize-continued-threat-to-lgbtq-youth.
  2. American Psychological Association, Center for Digital Democracy & Fairplay, et al., The Kids Online Safety Act: Protecting LGBTQ+ Children & Adolescents Online, https://fairplayforkids.org/wp-content/uploads/2023/08/LGBTQ-Youth-KOSA-Fact-Sheet.pdf.
  3. Salvador Rodriguez, Deepa Seetharaman & Aaron Tilley, Meta to Push for Younger Users With New AI Chatbot Characters, Wall Street Journal (Sept. 24, 2023), https://www.wsj.com/tech/ai/meta-ai-chatbot-younger-users-dab6cb32?mod=rss_Technology; Geoffrey A. Fowler, Snapchat tried to make a safe AI. It chats with me about booze and sex., Washington Post (Mar. 14, 2023), https://www.washingtonpost.com/technology/2023/03/14/snapchat-myai/.
  4. Meta, Introducing New Parent-Managed Meta Accounts for Families (June 16, 2023, updated Sept. 21, 2023), https://www.meta.com/blog/quest/meta-accounts-parent-managed-families/; Letter from Center for Digital Democracy, Center for Countering Digital Hate & Fairplay et al. to Mark Zuckerberg re: Horizon Worlds (Apr. 14, 2023), https://fairplayforkids.org/wp-content/uploads/2023/04/HorizonLetter.pdf.
  5. Julie Jargon, TikTok Feeds Teens a Diet of Darkness, Wall Street Journal (May 13, 2023), https://www.wsj.com/articles/tiktok-feeds-teens-a-diet-of-darkness-8f350507; Jeff Horwitz & Katherine Blunt, Instagram Connects Vast Pedophile Network, Wall Street Journal (June 7, 2023), https://www.wsj.com/articles/instagram-vast-pedophile-network-4ab7189.
  6. See, e.g., Diane Chen, Johnny Berona, Yee-Ming Chan et al., Psychosocial Functioning in Transgender Youth after 2 Years of Hormones, 388 New England Journal of Medicine 240 (Jan. 19, 2023), https://doi.org/10.1056/NEJMoa2206297; Amy E. Green, Jonah P. DeChants, Myeshia N. Price et al., Association of Gender-Affirming Hormone Therapy With Depression, Thoughts of Suicide, and Attempted Suicide Among Transgender and Nonbinary Youth, 70 Journal of Adolescent Health 643 (Apr. 2022), https://doi.org/10.1016/j.jadohealth.2021.10.036; Diana M. Tordoff, Jonathon W. Wanta, Arin Collin et al., Mental Health Outcomes in Transgender and Nonbinary Youths Receiving Gender-Affirming Care, 5 JAMA Network Open (Feb. 25, 2022), https://jamanetwork.com/journals/jamanetworkopen/fullarticle/2789423; Jack L. Turban, Dana King & Julia Kobe, Access to gender-affirming hormones during adolescence and mental health outcomes among transgender adults, 17 PLOS ONE (Jan. 12, 2022), https://doi.org/10.1371/journal.pone.0261039; see also Office of Population Affairs, Gender-Affirming Care and Young People, Dep’t of Health & Human Services, https://opa.hhs.gov/sites/default/files/2023-08/gender-affirming-care-young-people.pdf.
  7. Aliyah Bhatia, Senate Commerce Should Reject Bills Jeopardizing Online Safety for Kids and Adults, Center for Democracy & Technology (July 25, 2023), https://cdt.org/insights/senate-commerce-should-reject-bills-jeopardizing-online-safety-for-kids-and-adults.
  8. ADPPA creates a tiered knowledge standard system based on the size of the platform. Under the bill, the largest covered platforms have “knowledge” that a user is a minor if “the entity knew or should have known the individual was a covered minor.” Medium-sized covered platforms have “knowledge” that a user is a minor if the platform “knew or acted in willful disregard of the fact that the individual was a covered minor.” ADPPA holds all other covered platforms to an actual knowledge standard.
  9. Raymond Zhong & Sheera Frenkel, A Third of TikTok’s U.S. Users May Be 14 or Under, Raising Safety Questions, N.Y. Times (Aug. 14, 2020), https://www.nytimes.com/2020/08/14/technology/tiktok-underage-users-ftc.html.

Haley Hinkle is Fairplay’s Policy Counsel. In her role, Haley focuses on Fairplay’s work advocating for laws and regulations that protect children’s and teens’ autonomy and safety online. Before joining Fairplay, Haley clerked for the Hon. Robert L. Miller, Jr. in the U.S. District Court for the Northern District of Indiana. During law school, Haley worked on issues at the intersection of government surveillance technology and civil liberties. Haley studied law at Indiana University – Bloomington and journalism and political science at Northwestern University.