Contact: David Monahan, Fairplay [email protected]
New report shows Meta profits from pushing pro-eating disorder content to children on Instagram
BOSTON–Thursday, April 14, 2022– New research has revealed that Instagram’s algorithm is promoting and helping to grow a pro-eating disorder “bubble,” which currently reaches around 20 million users. Designing for Disorder, a new report from the advocacy organization Fairplay, documents how 33.75% of the accounts in this bubble are underage, including many users as young as nine years old.
Instagram parent company Meta (formerly Facebook) has been aware since at least 2019 that Instagram promotes pro-eating disorder content to minors, yet it continues to profit from this bubble. Designing for Disorder identifies 153 popular accounts that celebrate “thinspiration” or “bonespiration,” sharing positive imagery of extremely underweight people and other eating disorder memes. Researchers found that 1.6 million unique users followed these accounts, and 90,000 followed three or more of them. These 90,000 users frequently followed each other, forming a ‘pro-eating disorder bubble’ worth at least $1.8 million per year to Meta; the revenue generated from all users following the bubble’s accounts is $227.9 million per year.
“Meta is promoting and profiting from the disturbing content in this pro-eating disorder bubble and the young users who have become tangled in its orbit,” said report author and privacy expert Rys Farthing, PhD. “This data is proof positive that Meta is putting its profits before the wellbeing of children and teens.”
The report includes the first-hand account of Kelsey, a 17-year-old eating disorder survivor turned activist. “At the height of my eating disorder, I used social media as a fuel for my obsession with weight loss. The content they recommended to me of perfect toned bodies and tips for weight loss motivated me to continue when I was at my worst,” said Kelsey. “It was up to me to actively try and change my social media feeds – I had to do the hard work. This content was just always in my feed already, and somehow it was my responsibility to get it out. Even now, I have to take active steps to stop the algorithm recommending this content – Instagram pushes me towards this content, and I have to actively pull myself away from it.”
The report comes as the movement to protect children from Big Tech’s harmful business model continues to grow. Next week, the California Assembly Privacy and Consumer Protection Committee will hold a critical hearing on California’s Age Appropriate Design Code Bill, which seeks to replicate the UK’s groundbreaking approach to protecting children online. Meanwhile, the US Senate is considering important bipartisan legislation – the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act – that would usher in a new set of safeguards for young people.
“In California and at the federal level, we have important legislation that would require companies like Meta to put children’s best interests first,” said Fairplay’s Executive Director, Josh Golin. “We implore lawmakers to move these bills quickly to the finish line. Every day of inaction means more young people are harmed by Big Tech’s unregulated and predatory business model.”
“The offline consequences of putting powerful technologies into the hands of children must be carefully considered,” said Professor Hany Farid, Head of School at University of California, Berkeley’s School of Information. “There are reasonable safeguards that can be put in place to protect young people, such as requirements for products to be safe for children by design. Proposed legislation such as the California Age Appropriate Design Code Bill and the Kids Online Safety Act is a step in the right direction, and worthy of serious deliberation.”