Student Trust and Privacy Center

Gaggle is dedicated to keeping all student and staff data secure. 

Your Privacy Matters

Gaggle’s privacy and security credentials include the Student Privacy Pledge, an AICPA SOC 2 report, the Common Sense Privacy Program, and recognition as a Microsoft partner and a Google Workspace for Education Premier Partner.

Gaggle’s staunch commitment to supporting student safety without compromising privacy has earned the trust of K-12 districts nationwide.

In early 2021, Gaggle demonstrated its promise to protect student privacy by signing Pledge 2020, a new Student Privacy Pledge from the Future of Privacy Forum (FPF) and the Software & Information Industry Association (SIIA) that challenges companies to adhere to the latest and most stringent data privacy best practices.

In addition, Gaggle is dedicated to approaching artificial intelligence ethically. We know how critical it is to serve all students equitably and strive to ensure our technology is safeguarding students in a fair and appropriate manner.

Balancing Student Safety and Privacy

Gaggle Prioritizes Data Privacy

As part of our effort to prioritize data privacy, Gaggle will ALWAYS:

  • Use data for authorized education purposes only

  • Enforce strict limits on data retention

  • Support parental access to, and correction of errors in, their children’s information

  • Provide comprehensive security standards

  • Be transparent about the collection and use of data

And Gaggle will NEVER:

  • Sell student or staff information

  • Behaviorally target advertising or show advertising to any user

  • Change privacy policies without notice

Our Commitment to Privacy

Gaggle and our district partners are united around one shared goal: to help keep students safe. And that includes protecting their personal information, as well as preserving the trust and responsibility that comes with the sensitive nature of our work.
Federal Privacy Laws
Gaggle adheres to all relevant federal privacy laws, including the Children’s Internet Protection Act (CIPA), the Health Insurance Portability and Accountability Act (HIPAA), and the Family Educational Rights and Privacy Act (FERPA).
Pornographic Content
When it comes to handling sexually explicit images and videos, Gaggle adheres to the requirements of the National Center for Missing & Exploited Children (NCMEC) and the Internet Crimes Against Children (ICAC) Task Force.
Student Information
We minimize the exposure of a student’s personally identifiable information (PII) as much as possible within the system and with our team members. All student information is anonymized.
Private Accounts
Gaggle does not monitor students’ private social media accounts or private email accounts. Gaggle only monitors content that is produced utilizing school-owned email addresses or online tools within Google Workspace for Education, Microsoft 365, Google Chat, Microsoft Teams, and the Canvas learning management system.
Rigorous Training
Gaggle requires its Safety team employees to complete rigorous training, including implicit bias training, to mitigate any false flags and help ensure our team is well-informed about emerging areas of sensitivity and concern.
Privacy Pledge
We were one of the first technology companies to sign on to the original Student Privacy Pledge from the Software & Information Industry Association and the Future of Privacy Forum, and we also promptly signed the revamped Pledge 2020, which has more stringent privacy requirements.

Gaggle takes data security very seriously

All data is stored in secure data centers as well as on the Amazon Web Services cloud. In 2021, Gaggle completed a SOC 2 Type 2 audit, one of the most stringent third-party security audits available. Our assessors’ review of our technology and practices resulted in a final SOC 2 report free of any disclosures, evidence of Gaggle’s unwavering commitment to information security and to keeping our customers’ data safe. In addition, Gaggle undergoes regular penetration testing by third-party auditors and employs multi-factor authentication for our staff.

Gaggle adheres to FERPA regulations

Gaggle only uses PII from students’ education records to enable the use of Gaggle solutions. Gaggle redacts students’ PII in the interface and limits back-end access. The only PII that is ever visible to Gaggle is the basic detail necessary to create students’ user accounts, including name, grade level, and school district. That information is visible only to the top tier of the Gaggle Safety Team when they need to notify a school district official of a serious incident or when a threat is identified. Unless a school official expressly instructs otherwise, we will not share or reuse PII from education records for any other purpose.

Gaggle’s services are in compliance with COPPA

Individual children are not allowed to sign up for any Gaggle solutions; the only way a child may obtain access to a Gaggle solution is through their school, acting in loco parentis. In addition, each school district is responsible for creating student accounts for any Gaggle solution.


Frequently Asked Questions
For School Districts
Can Gaggle demonstrate the efficacy of its products?
Each year, we share our findings from the most recent school year in our annual Through the Gaggle Lens: The State of Student Safety report. This report dives into what we saw during the school year, the alerts we sent to districts about students in crisis, and the types of troubling incidents we flagged, offering insight into the state of student safety as well as the successful interventions resulting from our services. But don’t just take our word for it—read our Case Studies to learn more about how we partner with K-12 school districts to help enhance student safety and the results our district partners are seeing.

 

How does Gaggle generate and send alerts, and who receives the alerts?
Gaggle leverages a combination of artificial intelligence, machine learning, and a highly trained content review team to flag concerning content that K-12 students may receive, send, or create using school-provided technology. Keywords, algorithms, and machine learning work together to identify content that may indicate self-harm, bullying, abuse, or planned school violence.

If a concerning item is identified by Gaggle’s technology, a member of our trained human review team (the Gaggle Safety Team) will analyze the item to assess context and determine whether it is a false positive or if it might be something of concern. If it is determined that the content in question indicates a mental health issue, a threat, or potential harm to a student, it is escalated to a second Gaggle Safety Team member who will look more carefully at the item and full context, categorize the concern, and determine the severity and urgency.

From this point, the following actions occur:
  • Non-Urgent Email Alert: If the second Gaggle Safety Team member determines the item is concerning but not urgent, an email alert will be sent to the emergency contacts at the school district.

  • Urgent Telephone Contact: If the flagged item suggests an immediate threat to a student or someone else, the Gaggle Safety Team member will phone the school district’s designated emergency contacts, who could include a superintendent, a counselor, a principal, or another school official. With guidance from Gaggle, schools set their own emergency contact processes and policies.

  • Police Wellness Check (only in extremely urgent circumstances): If there is no response from the district-appointed emergency contacts after three contact attempts and a child’s life appears to be in imminent danger, Gaggle will reach out to the local authorities for a wellness check. Gaggle avoids contacting law enforcement directly unless there is an extreme circumstance where the Gaggle Safety Team is unable to reach the school-designated emergency contact and the flagged content indicates a child may be in immediate danger.

  • Federal Law Enforcement Alert (only in instances of child sexual abuse materials): If the item contains an image that appears to portray a sexually explicit picture of a child, the item is registered with the NCMEC, as required by federal law. The district-appointed emergency contacts are notified about the existence of the image, but not sent the image in question.
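The tiered escalation above can be sketched as a simple decision procedure. This is a hypothetical illustration only, not Gaggle’s actual system; the severity levels, function names, and three-attempt threshold are assumptions drawn from the description above.

```python
from enum import Enum, auto

class Severity(Enum):
    """Illustrative severity tiers inferred from the steps above."""
    NOT_A_CONCERN = auto()   # false positive; no alert is sent
    NON_URGENT = auto()      # concerning but not urgent
    URGENT = auto()          # immediate threat to a student or others
    CSAM = auto()            # child sexual abuse material

def route_alert(severity: Severity, contact_attempts_failed: int = 0) -> list[str]:
    """Return the notification steps for a reviewed incident.

    Hypothetical sketch of the alert tiers described above; names and
    thresholds are illustrative, not Gaggle's real implementation.
    """
    if severity is Severity.NOT_A_CONCERN:
        return []
    if severity is Severity.CSAM:
        # Registered with NCMEC as required by federal law; the district
        # is told the image exists but is never sent the image itself.
        return ["register with NCMEC", "notify district emergency contacts"]
    if severity is Severity.NON_URGENT:
        return ["email district emergency contacts"]
    # Urgent: phone the district's designated emergency contacts first.
    steps = ["phone district emergency contacts"]
    if contact_attempts_failed >= 3:
        # Only after repeated failed contact attempts, with a child in
        # imminent danger, does a police wellness check occur.
        steps.append("request local police wellness check")
    return steps
```

Note how a wellness check is never the first step: it is only appended after the district’s own contacts have been tried and exhausted, mirroring the policy described above.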

 

What content does Gaggle review?
Gaggle provides a monitoring solution for Google Workspace for Education, Microsoft 365, and the Canvas learning management system. Items monitored include students’ school-provided email accounts, document creation and collaboration on school-provided educational platforms, calendar entries, chat, and other direct and group communication tools used as part of a learning or educational setting.

 

What does Gaggle monitor, and who can purchase the service?
Gaggle does not monitor students’ web browsing activities or students’ private social media accounts, nor is the tool available for consumer purchase and use. Gaggle is only available for purchase to schools and districts to be used on school-issued devices and accounts.

 

How long does Gaggle retain data?
Gaggle purges non-incident data (i.e., data that does not indicate a threat of harm) after 30 days. Incident data is retained for individual school districts until one of the following actions occurs: (a) the relationship with Gaggle is terminated, (b) the student in question graduates or withdraws from the district, or (c) the school district requests a full data purge. All student information is anonymized.

 

Are Gaggle’s keywords and content categories for alerts appropriate and useful for the purpose of self-harm monitoring?
If our technology identifies something concerning—such as the use of a keyword, flagged phrase, or troubling image—that content is then reviewed by a minimum of two Gaggle Safety Team members in order to verify the content, assess the context, and determine the level of concern. If our Gaggle Safety Team determines the flagged content is life-threatening, a member of the team immediately alerts the designated school emergency contact by phone.

Equitable prevention practices are paramount to providing students with a fair education and the support they need. Our algorithm reviews content anonymously, so we have no context or background on students when we identify potential issues. During onboarding and training for our Gaggle Safety Team, we include topics related to bias and opinion, as well as the importance of separating these from decision-making related to items they review.

Occasionally, districts identify trends or slang that may be unique to their community of students, and Gaggle will create additional monitoring filters to capture this content. We believe it’s important that the technology remain nimble and adapt to students’ ever-changing ways of communicating.

 

How does Gaggle recognize context?
If a concerning item is identified by Gaggle’s technology, a member of our trained human review team (the Gaggle Safety Team) will analyze the item. The Gaggle Safety Team undergoes extensive internal training with our experienced reviewers. The Gaggle Safety Team supervisors have an average of five years’ history with Gaggle analyzing content and additional training from outside organizations, including the NCMEC, the ICAC Task Force, Shift Wellness, and others.

A Gaggle Safety Team member will analyze the flagged piece of content to assess context and determine whether it is a false positive or something of genuine concern. They do this by applying human judgment and reading the surrounding text beyond the sentence in question.

If it is determined that the content in question indicates a mental health issue, a threat, or potential harm to a student, it is escalated to a second Gaggle Safety Team member who will look more carefully at the item and full context, categorize the concern, and determine the severity and urgency.

 

How does Gaggle process student content in non-English languages?
Gaggle uses a direct integration with a translation service in our interface, allowing our Gaggle Safety Team to translate content immediately when making a decision about student safety and well-being.

 

How does Gaggle interpret pictures?
Gaggle’s proprietary anti-pornography scanner (APS) offers unique real-time image analysis that detects and blocks pornographic images. The APS scans embedded and attached images as well as images inside attachments like Word and PowerPoint documents. Even URL links to other websites are scanned for offensive content.

 

Does Gaggle monitor students’ social media accounts?
Gaggle does not monitor students’ private social media accounts. Gaggle only monitors content that is produced utilizing school-owned email addresses or online tools within Google Workspace for Education, Microsoft 365, Google Chat, Microsoft Teams, and the Canvas learning management system.

 

Does Gaggle turn over data to law enforcement?
Most of the content identified by our technology is ultimately benign and does not require any engagement from the school emergency contact. In about 0.0036% of cases, our team believes that the content is sufficiently worrisome that an adult needs to be made aware and these items are labeled as incidents. In very, very rare instances—approximately 0.41% of these incidents, and only if we cannot reach the school-designated emergency contact after multiple attempts and believe a child is in serious crisis—we will contact the local authorities for a wellness check.

Gaggle only alerts law enforcement for an immediate wellness check when information indicates that a child’s life is in imminent danger and district-appointed emergency contacts are inaccessible. We know engaging law enforcement in any student situation is a last- and worst-case scenario, and take every step possible to avoid it unless a child’s life hangs in the balance.

 

For Parents

Can I buy Gaggle Safety Management for my child at home?

Gaggle is only available for purchase by school districts. It is not a consumer product available for parents to purchase for their personal use.

 

Do you provide any additional resources that we can use at home to help keep our kids safe?
Beyond our more than two decades of experience working with schools, we can help by providing in-person and online programs designed to teach children and their parents about potential online dangers. Please contact your school directly and ask them to contact us; we can provide them with a list of digital citizenship resources. In addition, Gaggle maintains a list of Popular Apps and Social Networks, which offers up-to-date details on the latest apps and networks your kids are using.

 

My child is a good kid, so why is there a safety concern?

Gaggle is not in place to be used punitively—kids just need guidance sometimes. Whether it’s an opportunity to teach digital citizenship or a cry for help, we’re here to support student safety and well-being around the clock. If Gaggle can help schools identify students in need, then we can all sleep better knowing that all kids are being safeguarded.

 

Are you invading my child’s privacy?

Most educators and attorneys will tell you that when your child is using school-provided technology, there should be no expectation of privacy. In fact, your child’s school is legally required by federal law (CIPA) to protect children from accessing obscene or harmful content over the internet.

 

For Students
What is Gaggle?
Gaggle is a school safety company. Our monitoring service uses a powerful combination of technology and trained safety experts to provide real-time analysis and review of students’ use of school-provided digital accounts and platforms. We help ensure schools comply with a federal law (CIPA) that requires them to filter unsafe online content for kids.

We keep watch over content produced through school-issued accounts (e.g., school email address or online tools within Google Workspace for Education, Microsoft 365, Google Chat, Microsoft Teams, and the Canvas learning management system) to help identify students who are in need of help. Gaggle also offers a therapy service to help districts provide students with the ongoing support they need.

 

Why is my school using Gaggle?
Your school is using Gaggle to help keep you—and your fellow students—safe. As our nation struggles with a youth suicide epidemic and mental health crisis, school leaders are investing in tools and programs to help protect all students and support well-being. Gaggle is an education technology company and a recognized leader in helping K-12 districts manage student safety on school-provided devices and platforms.

 

Why are you looking at what I do online?
Some kids don’t know how to ask for help, and their friends, family, and teachers may not even know how much they are struggling. Gaggle monitors all of the content produced on your school-issued accounts to help ensure that every single student in the district receives the same level of support and safety.

If our technology identifies something concerning—use of a keyword, flagged phrase, or troubling image—that content is then reviewed by the Gaggle Safety Team to verify the content that was flagged, understand the context of what was shared, and determine the level of concern and appropriate next steps. Our team includes trained social workers, crisis interventionists, educators, AI/technology developers, and therapists, all focused day in and day out on helping protect students in the online environment.

It’s worth noting that the Gaggle Safety Team reviews just a fraction of the content created by students and weeds out the items that don’t indicate a threat to student safety and well-being. We only contact school officials when we believe a student needs help.

Are you trying to get me in trouble?
Absolutely not—and neither is your school district. Gaggle is not in place to be used punitively. We’re looking for the kids who need help, so we alert school officials when we discover signs of self-harm, depression, thoughts of suicide, substance abuse, cyberbullying, unhealthy relationships, and credible threats of violence against others.


Most of the content identified by our technology is resolved as appropriate or not a threat and does not require any engagement from the school emergency contact. But in less than 1% of cases (about 0.0036% during the 2020–21 school year), our team believes that the content shows signs of a student in distress and determines that an adult needs to be made aware of the situation. During the 2020–21 school year, our alerts helped save the lives of 1,408 students who were in the process of attempting or planning suicide. These students desperately needed support, and we are thankful that we were able to hear their cries for help.

 

"Last school year, we analyzed about 10 billion items. It’s important to say this is the data that belongs to the schools. We are signers on the Student Data Privacy Pledge that’s put out by the Future of Privacy Forum, so we don’t use the data for any purposes other than for school safety. The data is kept in secure systems and purged at the request of the districts."

Diversity, Equity, and Inclusion

At Gaggle, a diverse, inclusive, and equitable workplace is one where all employees and customers, whatever their gender, race, ethnicity, national origin, age, sexual orientation, identity, education, or disability, feel valued and respected. We are committed to a nondiscriminatory approach and provide equal opportunity for employment and advancement in all of our departments, programs, and worksites. We respect and value diverse life experiences and heritages and ensure that all voices are valued and heard. We’re committed to maintaining an inclusive environment with equitable treatment for all.