Gaggle Speaks

Ideas, news, and advice for K-12 educators and administrators to help create safe learning environments.

Student Safety

Pornography or Exploitation? Why Words Matter

Written by Lisa Railton on March 15, 2021

While legal terminology still refers to visual depictions of sexually explicit conduct involving minors as child pornography, many organizations and advocates around the globe are steering away from that language. These materials are neither legal nor consensual—they are abuse and exploitation, not pornography.  

Referring to these materials as pornography is damaging to the underage victims, who suffer each time someone views videos or images of their abuse. Outside of the legal system, organizations like the National Center for Missing & Exploited Children (NCMEC) and the Child Rescue Coalition prefer the term child sexual abuse material (CSAM), which more accurately describes this illegal content.

Gaggle Safety Management helps safeguard both students and districts when it comes to incidents involving CSAM. Our machine learning technology identifies images that are likely to be explicit and flags them for further analysis. These images are then reviewed by our team of safety professionals, who have been trained to properly handle sexually explicit content and interact with NCMEC.

If our team suspects minors are being exploited or abused, the content is reported to NCMEC as required, protecting schools from the risks and liability involved in handling these materials. NCMEC strives to reduce child sexual exploitation and prevent child victimization, working with victims, families, law enforcement, and companies like Gaggle to help keep students safe. When content is reported to them through their CyberTipline, NCMEC tags CSAM for removal to make sure the files are not distributed and the minors involved are protected.

Gaggle will remove the ability to access and share content suspected of involving child abuse and exploitation from district Google Drive accounts and retain any files in question for 90 days, as required by law. Our goal is to protect students by preventing the spread of this content, and to protect the district from the hassle and liability that can accompany its sharing.

During the first few months of the 2020–21 school year, Gaggle recorded a 135% increase in incidents involving nudity and sexual content, which includes content classified as CSAM. In addition, we made 3,952 reports of possible CSAM to NCMEC in 2020, helping to prevent further exploitation of underage victims. 

Whatever you call it, it’s a crime. However, words matter, and using the preferred terminology helps people understand just how harmful these materials are to the children who are being exploited. 
