Contact:

David Monahan, Fairplay: [email protected]
Jeff Chester, Center for Digital Democracy: [email protected]

Advocates to FTC: Write rules to protect kids from harmful manipulative design online

At every turn, young people face tricks and traps to keep them online for hours and sharing sensitive data

BOSTON, MA and WASHINGTON, DC – November 17, 2022 – A coalition of leading health and privacy advocates filed a petition today asking the Federal Trade Commission to promulgate a rule prohibiting online platforms from using unfair design features to manipulate children and teens into spending excessive time online. Twenty-one groups, led by Fairplay and the Center for Digital Democracy, said in their petition: “When minors go online, they are bombarded by widespread design features that have been carefully crafted and refined for the purpose of maximizing the time users spend online and activities users engage in.” They urged the FTC to establish rules of the road clarifying when these practices cross the line into unlawful unfairness.

The advocates’ petition details how the vast majority of apps, games, and services popular among minors generate revenue primarily via advertising, and many employ sophisticated techniques to cultivate lucrative long-term relationships between minors and their brands. As a result, platforms use techniques like autoplay, endless scroll, and strategically timed advertisements to keep kids and teens online as much as possible, which is not in their best interests.

The petition also details how manipulative design features on platforms like TikTok, Twitter, YouTube, Facebook, Instagram, and Snapchat undermine young people’s wellbeing. Excessive time online displaces sleep and physical activity, harming minors’ physical and mental health, growth, and academic performance. Features designed to maximize engagement also expose minors to potential predators, online bullies, and age-inappropriate content, harm minors’ self-esteem, and aggravate risks of disordered eating and suicidality. The manipulative tactics also undermine children’s and teens’ privacy by encouraging the disclosure of massive amounts of sensitive user data.

The advocates’ petition comes just months after California passed its Age Appropriate Design Code, a law requiring digital platforms to act in the best interests of children, and as momentum grows in Congress for the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act.

The petition was drafted by the Communications and Technology Law Clinic at Georgetown University Law Center.  

Haley Hinkle, Policy Counsel, Fairplay:

“The manipulative tactics described in this Petition that are deployed by social media platforms and apps popular with kids and teens are not only harmful to young people’s development; they’re unlawful. The FTC should exercise its authority to prohibit these unfair practices and send Big Tech a message that manipulating minors into handing over their time and data is not acceptable.”

Katharina Kopp, Deputy Director, Center for Digital Democracy:

“The hyper-personalized, data-driven advertising business model has hijacked our children’s lives. The design features of social media and games have been purposefully engineered to keep young people online longer and satisfy advertisers. It’s time for the FTC to put an end to these unfair and harmful practices. They should adopt safeguards that ensure platforms and publishers design their online content so that it places the well-being of young people ahead of the interests of marketers.”

Jenny Radesky, MD, Associate Professor of Pediatrics, University of Michigan and Chair-elect, American Academy of Pediatrics Council on Communications and Media:

“As a pediatrician, helping parents and teens navigate the increasingly complex digital landscape in a healthy way has become a core aspect of my work. If the digital environment is designed in a way that supports children’s healthy relationships with media, then it will be much easier for families to create boundaries that support children’s sleep, friendships, and safe exploration. However, this petition highlights how many platforms and games are designed in ways that actually do the opposite: they encourage prolonged time on devices, more social comparisons, and more monetization of attention. Kids and teens are telling us that these types of designs actually make their experiences with platforms and apps worse, not better. So we are asking federal regulators to help put safeguards in place to protect against the manipulation of children’s behavior and to instead prioritize their developmental needs.”

Professor Laura Moy, Director, Communications & Technology Law Clinic at Georgetown Law, and counsel for Center for Digital Democracy and Fairplay:

“As any parent or guardian can attest, games and social media apps keep driving kids and teens to spend more and more time online, in a way that neither minors nor their guardians can reasonably prevent. This is neither accidental nor innocuous—it’s engineered and it’s deeply harmful. The FTC must step in and set some boundaries to protect kids and teens. The FTC should clarify that the most harmful and widespread design features that manipulate users into maximizing time online, such as those employed widely by social media services and popular games, are unlawful when used on minors.”

Groups signing on to the petition include: Center for Digital Democracy; Fairplay; Accountable Tech; American Academy of Pediatrics; Becca Schmill Foundation, Inc.; Berkeley Media Studies Group; C. Everett Koop Institute at Dartmouth; Center for Humane Technology; Children and Screens: Institute of Digital Media and Child Development; Eating Disorders Coalition; Electronic Privacy Information Center (EPIC); LookUp.live; Lynn’s Warriors; Network for Public Education; Parent Coalition for Student Privacy; ParentsTogether Action; Protect Young Eyes; Public Citizen; Together for Girls; U.S. Public Interest Research Group; and UConn Rudd Center for Food Policy and Health.


###