Children’s advocacy organizations Fairplay and Center for Digital Democracy respond to today’s FTC proposed prohibition preventing Meta from monetizing youth data
BOSTON, MA and WASHINGTON, DC — May 3, 2023 — Today, the Federal Trade Commission proposed changes to its standing 2020 privacy order with Meta, and alleged that the company failed to protect its users’ privacy and misled parents about privacy controls available on its Messenger Kids app.
Central to the proposed changes is a prohibition on Meta’s ability to monetize the data of users under 18, a move that has been lauded by privacy and children’s advocates Fairplay and Center for Digital Democracy (CDD). Advocates have pointed to Meta’s reckless treatment of young users as exemplifying the urgent need for federal action to protect children and teens online. In the past, Meta has touted its ability to target teens with specific ads when they’re most vulnerable, and a report from Fairplay illustrated that the company has made millions of dollars from promoting pro-eating disorder content to children as young as 9.
The FTC’s action comes after a 2018 letter to the Federal Trade Commission from Fairplay and CDD alleging that Meta (then called Facebook) was violating the Children’s Online Privacy Protection Act (COPPA) with its Messenger Kids app, given that it failed to obtain verifiable parental consent for children under 13 using the app and was not forthcoming about its data practices. Fairplay also led a coalition of experts and advocates who urged Meta to pull the plug on Messenger Kids because of privacy and child development concerns.
“The action taken by the Federal Trade Commission against Meta is long overdue,” said Josh Golin, Executive Director, Fairplay. “For years, Meta has flouted the law and exploited millions of children and teens in its efforts to maximize profits, with little care for the harms faced by young users on its platforms. The FTC has rightly recognized that Meta simply cannot be trusted with young people’s sensitive data and has proposed a remedy in line with Meta’s long history of abuse of children. We applaud the Commission for its efforts to hold Meta accountable. We also call on Congress to pass the Children and Teens’ Online Privacy Protection Act, because all companies should be prohibited from misusing young people’s sensitive data, not just those operating under a consent decree.”
“Today’s action by the Federal Trade Commission (FTC) is a long-overdue intervention into what has become a huge national crisis for young people. Meta and its platforms are at the center of a powerful commercialized social media system that has spiraled out of control, threatening the mental health and wellbeing of children and adolescents,” said Jeff Chester, Executive Director, Center for Digital Democracy. “The company has not done enough to address the problems caused by its unaccountable, data-driven commercial platforms. Amid a continuing rise in shocking incidents of suicide, self-harm and online abuse, as well as exposés from industry whistleblowers, Meta is unleashing even more powerful data-gathering and targeting tactics fueled by immersive content, virtual reality and artificial intelligence, while pushing youth further into the metaverse with no meaningful safeguards. Parents and children urgently need the government to institute protections for the ‘digital generation’ before it is too late. Today’s action by the FTC limiting how Meta can use the data it gathers will bring critical protections to both children and teens. It will require Meta/Facebook to engage in a proper due diligence process when launching new products targeting young people, rather than its current ‘release first, address problems later’ approach. The FTC deserves the thanks of U.S. parents and others concerned about the privacy and welfare of our digital generation.”
The advocates’ reports can be found below:
META HAS A LONG HISTORY OF FAILING TO PROTECT CHILDREN ONLINE