Here at Gaggle, we believe that all schools should be safe and all students should get the mental and emotional help they need. Our mission is to help ensure the safety and well-being of students and schools by leveraging people and technology, supporting school districts in proactively identifying students who are struggling.
We take student safety very seriously. We work in partnership with schools and districts across the country, offering administrators, teachers, and parents the peace of mind that students are being protected—24 hours a day, seven days a week, 365 days a year.
Gaggle analyzed more than 10.1 billion items during the 2020–21 school year. Of these, more than 41 million items required human review by our Gaggle Safety Team for context, resulting in over 360,000 items that warranted urgent action by the school or district. More than 142,000 incidents we flagged were references to suicide or self-harm—each one of these is a cry for help.
Around-the-Clock Student Safety
Since 1999, Gaggle has strived to ensure the safety and well-being of the students we serve. Safeguarding these students is our first priority; everything else comes second. A recognized leader in helping K-12 districts manage student safety on school-provided technology, we have helped thousands of districts avoid tragedies and save lives.
Our powerful technology, combined with the Gaggle Safety Team, provides real-time analysis and review of students’ use of school-issued online collaboration platforms. With more than 41 million student items requiring human review for context during the 2020–21 school year, we need a large team available at all hours to respond quickly to the urgent needs of students across the country. Hundreds of safety representatives review the alerts coming from our technology around the clock. The alerts are then further analyzed by a core group of trained safety professionals to verify the content, understand the context, and determine the level of severity.
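For readers curious how a tiered review like this fits together, here is a minimal sketch in Python. Every name in it (the Alert record, the Severity levels, the placeholder heuristics) is an illustrative assumption; Gaggle’s actual technology and review criteria are proprietary and not reflected here.

```python
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    # Hypothetical severity tiers; the real taxonomy is Gaggle's own.
    NO_ACTION = "no_action"          # false positive, nothing to do
    REVIEW_LOGGED = "review_logged"  # kept for context, no immediate contact
    URGENT = "urgent"                # school or district contacted right away

@dataclass
class Alert:
    item_id: str
    category: str  # e.g. "self-harm", "violence", "substance abuse"
    excerpt: str   # the flagged passage, shown to reviewers for context

def tier_one_review(alert: Alert) -> bool:
    """First-pass safety representative: screen out obvious false
    positives (quoted song lyrics, assigned reading). Placeholder logic."""
    benign_markers = ("lyrics:", "book report", "assigned reading")
    return not any(m in alert.excerpt.lower() for m in benign_markers)

def tier_two_review(alert: Alert) -> Severity:
    """Core safety professional: verify content, weigh context, and
    assign severity. Placeholder logic."""
    urgent_categories = {"self-harm", "suicide", "violence"}
    if alert.category in urgent_categories:
        return Severity.URGENT
    return Severity.REVIEW_LOGGED

def triage(alert: Alert) -> Severity:
    """Two-tier pipeline: machine flag -> representative -> professional."""
    if not tier_one_review(alert):
        return Severity.NO_ACTION
    return tier_two_review(alert)

if __name__ == "__main__":
    alert = Alert("item-001", "self-harm", "I can't do this anymore")
    print(triage(alert))  # Severity.URGENT: the district is contacted
```

The point of the two tiers is throughput: representatives clear the high-volume alert stream quickly, so the core professionals spend their time only on items that genuinely need judgment.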
Gaggle currently serves 5.5 million students across the country. When students show signs of self-harm, depression, thoughts of suicide, substance abuse, cyberbullying, unhealthy relationships, or credible threats of violence against others, a core member of our Gaggle Safety Team contacts school officials.
Addressing Privacy Concerns
Gaggle complies with HIPAA and FERPA guidelines and was one of the first companies to sign the Student Privacy Pledge. In addition, Gaggle maintains clear terms regarding how we treat student and staff data. Year after year, we reinforce that commitment by working with the Future of Privacy Forum (FPF) and the Software & Information Industry Association (SIIA) to advance data privacy protections for the collection, use, and maintenance of personal information.
This balance between student safety and student privacy is why K-12 districts across the United States use Gaggle year after year. Gaggle does not monitor students’ personal social media accounts or personal email accounts. We firmly believe in a student’s right to privacy and recommend that districts notify students and parents of their Gaggle implementation.
Supporting Vulnerable Populations
LGBTQ+ students suffer from depression, anxiety, and suicidal ideation at higher rates than other populations. A recent study by The Trevor Project showed that 42% of LGBTQ youth had seriously considered attempting suicide in the past year, and LGBTQ+ youth are more than four times as likely as their peers to seriously consider suicide. Helping all students struggling with these issues is one of our top priorities.
Disparities in mental health support also exist among BIPOC communities: less than half of Black students feel they have adequate mental health support in their schools. With this in mind, it’s important to consider the ways we can support these underserved students facing mental health struggles.
Gaggle is dedicated to approaching artificial intelligence ethically. We know how critical it is to serve all students equitably and strive to ensure our technology is safeguarding students in a fair and appropriate manner. Our algorithm reviews content anonymously, so we have no context or background on students when we identify potential issues, ensuring that all students get the support they need—regardless of demographic factors like race, income level, and sexual orientation.
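As a concrete illustration of what anonymous review can mean, the sketch below strips identifying and demographic fields before an item ever reaches a reviewer, leaving only an opaque token for re-linking afterward. The record shape and names are hypothetical, not Gaggle’s actual code.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class StudentItem:
    # Hypothetical record shape; real field names are assumptions.
    student_id: str
    school: str
    demographics: dict  # race, income level, etc.; never shown to reviewers
    content: str

@dataclass(frozen=True)
class AnonymousReviewItem:
    review_token: str  # opaque token; re-linked to the district only
                       # after the review decision is made
    content: str       # the only thing the reviewer sees

def anonymize_for_review(item: StudentItem, token: str) -> AnonymousReviewItem:
    """Drop every identifying and demographic field so the reviewer
    judges the content alone, not the student behind it."""
    return AnonymousReviewItem(review_token=token, content=item.content)

item = StudentItem("s-123", "Lincoln MS", {"grade": 8}, "flagged text ...")
print(anonymize_for_review(item, token="rev-7f3a"))
# AnonymousReviewItem(review_token='rev-7f3a', content='flagged text ...')
```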
During onboarding and training for our Gaggle Safety Team, we cover bias and personal opinion, and the importance of keeping both out of decisions about the items they review. The team has undergone anti-bias training with Tekoa Pouerie, a Certified Implicit/Explicit Bias trainer who educates individuals on recognizing and reducing bias to improve social and racial equity. We work with Pouerie on an ongoing basis to support anti-bias training across the company: our leadership team has already completed the training, and the entire company will do so in June.
Data Security
Data security is a top priority for us: Gaggle never shares student data with any third parties, even in anonymized or aggregated form. Gaggle’s data is stored in AWS and in our own dedicated data center. Files are stored in an encrypted format, all communication is encrypted over SSL, and all passwords are hashed. Data is retained for varying lengths of time depending on the contract with the customer, and when services are canceled, data is purged.
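To make “all passwords are hashed” concrete, here is a minimal sketch of salted password hashing with Python’s standard library (PBKDF2-HMAC-SHA256). It illustrates the general technique only; Gaggle has not published its implementation, and the iteration count here is just an assumed modern work factor.

```python
import hashlib
import hmac
import os

ITERATIONS = 600_000  # assumed work factor; tune to current guidance

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest). The random salt makes identical passwords
    hash differently, and PBKDF2 makes brute force expensive."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    """Recompute the digest and compare in constant time; the plaintext
    password is never stored anywhere."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, expected)

salt, stored = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, stored)
assert not verify_password("wrong guess", salt, stored)
```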
Gaggle recently completed a SOC 2 Type 2 audit, one of the most stringent third-party security audits available. In addition, Gaggle undergoes regular penetration testing by third-party auditors and requires multi-factor authentication for our staff.
What’s more, Gaggle helps school districts comply with CIPA by securing school-provided technology for all students in districts where Gaggle is implemented. We also help protect school districts against risks and liability issues related to sexually explicit images by complying with requirements set by the National Center for Missing & Exploited Children (NCMEC) and Internet Crimes Against Children (ICAC) task forces.
See what districts are saying about Gaggle:
Case Studies
Gaggle Partner Districts in the News