Get Gaggle out of schools today

When protecting students goes wrong

Most schools are working diligently to protect students, especially after the explosion of school-issued devices during the pandemic and the simultaneous rise in mental health issues. But some computer safety companies are exploiting schools’ desire to protect their students and actually creating a whole new set of risks. A startling new exposé from The 74 reveals that one of those companies, Gaggle, “appear[s] to be more interested in business growth and profit than protecting kids.”

Gaggle is a “pioneer in helping K-12 districts manage student safety on school-provided technology” (Gaggle, 2022). Its software scans school-issued computers (including student emails and texts) and flags potential issues like cyberbullying, violence, and self-harm. More than 1,500 school districts use Gaggle. Now, the company has been exposed for sketchy business practices that put children in harm’s way. That’s 5 million children being monitored by the company and put at risk.

If you’re worried about your child’s school using Gaggle, scroll down to the bottom of this blog for information on what you can do about it!

How Gaggle works against kids’ safety

At Fairplay, we are deeply invested in protecting children in technological environments. We believe that tech companies should design online environments with children’s best interests at the forefront. We also believe that students should be able to make mistakes, live their lives, and have their mental health supported by real people in real time. While the ability to identify students in need of assistance sounds great in theory, in practice it subjects students to relentless surveillance.

Gaggle uses an algorithm to flag references to sex, drugs, and violence. Every email, text, and interaction on a school device or account is scanned for potentially troubling words. Once the algorithm flags one of the many “dangerous” keywords in a child’s email, school assignment, or text message on a school-issued device, the item goes to the virtual desk of a moderator, who decides whether it is truly a problem and then raises it to school officials or, sometimes, the police.

Gaggle’s surveillance model promises safety, but many students say they do not know they are being surveilled, and it is not clear to families or students what is done with their data. What’s more, schools are being hoodwinked into thinking that the level of surveillance Gaggle provides is imperative to students’ wellbeing. In a 2019 examination of the company’s practices, BuzzFeed News’s Caroline Haskins wrote, “Student surveillance services like Gaggle raise questions about how much monitoring is too much, and what rights minors have to control the ways that they’re watched by adults.” Gaggle has couched its quest for profit in its mission of protection. And the practice has even greater consequences in light of a growing mental health crisis, which will not be made better by spying on children.

What Gaggle’s bad surveillance practices look like

Gaggle uses unfairly treated workers to monitor our kids’ lives.

Is the best way to stop students from sexting really to have their nude images end up in the hands of adult strangers who haven’t been trained to handle such sensitive material? Gaggle’s first line of defense is its “level one” moderators, who see all of the content from children’s school-issued devices and get alerts when words are flagged. Unfortunately, these moderators severely lack training, with no required expertise in child development, mental health, or tech safety. The moderators (who say they are consistently traumatized by what they see) are expected to review 300 flagged items per hour; overworked and underpaid, these workers cannot be expected to adequately safeguard our children. What’s more, the company does not provide mental health benefits to workers in these roles, leading to harm not just for children but for the workers themselves.

Gaggle does not safeguard against our children’s materials being leaked (a.k.a. strangers have nude photos of children)

Gaggle’s contracted moderators say that content constantly shows up on their screens as new items get flagged; effectively, these remote workers can be anywhere when nude photos of children pop up on their devices. Currently, there is nothing to stop them from keeping or selling those pictures. And while Gaggle’s Student Data Privacy Policy says the company maintains confidentiality of student data and puts ample protections in place, it does not safeguard against moderators’ access to this information.

What’s more, Gaggle plugs into Google Workspace and Microsoft 365, which regularly leak student data and are easily hacked (BuzzFeed News). Weak protections put our kids further at risk. And while it would be easy to blame the moderators, the company itself neglects to provide any training or support for these workers, furthering its sketchy behavior.

Gaggle’s algorithm is biased, putting kids in harm’s way

During their review of student devices, moderators are expected to use “common sense” to distinguish red flags, but this leaves ample room for bias. While Gaggle claims it works to eliminate racial bias by not coding students’ information by demographics, the company has no safeguards against bias, especially with managers deciding whether to escalate issues to school authorities or police.

When interviewed, one moderator said that they were consistently outing teens who were simply expressing their identities in what they thought were personal conversations; the algorithm flags keywords like “gay,” “queer,” and “lesbian” as dangerous. In a letter to the Senate in March, the company skirted the question when asked directly whether it was working to eliminate bias against LGBTQ+ youth, saying “LGBTQ+ students often suffer from depression, anxiety, and suicide ideation at higher rates than other populations,” but not addressing its repeated incidents of outing children, which has deep psychological impacts and can incite violent retaliation. This has huge implications for the LGBTQ+ youth in our schools.

Gaggle has taken advantage of real issues, such as the mental health crisis, for the sake of profit. School districts are expected to fork over big bucks to implement Gaggle, with most of those dollars coming out of their federal allocations (BuzzFeed News). This spending risks displacing school support services like counselors, who have a much better chance than an algorithm of actually supporting children and teens with their mental health. We need protections for children, but not at the expense of their wellbeing.

Take Action: Kick Gaggle to the Curb

As Gaggle pushes into new school districts, now is the time to take a strong stance against the company’s profiteering under the guise of protection.

Making the call to end your school’s relationship with Gaggle is key to your child’s safety. Contact your district’s superintendent, technology director, or school principal and let them know your concerns about this unsafe surveillance system. Urge officials to offer an option to opt out for your family.

And now, we’re making it easy for you to take action against Gaggle in your own community!

Copy and paste the email template below and modify it so it fits your local situation. Email the template to other parents and teachers to gain support, collect signatures, and send it to school officials. Remember — there is power in numbers. If you feel you need more background or tools before meeting with officials, contact us and check out our Screens in Schools Action Kit.



We urge [SCHOOL DISTRICT] to discontinue its use of Gaggle during the [years] school year. We recognize your hard work to keep our students safe and that safety and protection are at the forefront of all you are doing. However, Gaggle works in direct opposition to our students’ safety, and its harms outweigh any potential protection. We urge [SCHOOL DISTRICT] to prioritize student safety through human services personnel or other means of protection.

We urge [SCHOOL DISTRICT] to consider the evidence outlined in Fairplay’s recent article “*ADD IN TITLE*” to guide its decision. Fairplay’s statement outlines the major safety concerns with Gaggle, including a dangerous surveillance system in which:

  • Gaggle uses unfairly treated workers to monitor our kids’ lives.
  • Gaggle does not safeguard against our children’s materials being leaked.
  • Gaggle’s algorithm is biased, putting kids in harm’s way.

Gaggle is seeking profit from our school to the detriment of our students’ mental health, and we urge you to end the contract with the company immediately.

We would welcome the opportunity to discuss this with you further. Thank you so much for your commitment to our children’s wellbeing during these challenging times.


Good luck! We’d love to hear from you about your experiences with Gaggle and advocacy in your school district! Send us an email, or share your story on social media using #getgaggleout.