Preventing Child Sexual Abuse Material (CSAM)


Child Sexual Abuse Material (CSAM), once called child pornography, is one of the fastest-growing threats to children worldwide. The internet has made it easier than ever for predators to exploit minors and circulate harmful content across platforms. According to the National Center for Missing & Exploited Children (NCMEC), more than 36 million CSAM reports were filed in 2023 alone [source]. Preventing CSAM requires vigilance not only from families but also from schools, youth organizations, and especially the companies that operate the platforms where exploitation occurs.

What Is CSAM?

CSAM includes any image, video, or other digital content that depicts the sexual abuse of children. Posting, sharing, or even viewing CSAM is a crime. In recent years, predators have increasingly coerced teens into producing “self-generated CSAM,” pressuring them to share nude or sexual images of themselves. This tactic is often tied to sextortion and online grooming.

Platforms Under Scrutiny

Predators rarely operate in isolation. They exploit online platforms that allow anonymity, private messaging, and content sharing. Several major platforms have faced intense scrutiny for enabling CSAM:

  • Discord: In September 2025, a California man was sentenced to 14 years in prison for exploiting at least a dozen underage girls on the platform. Prosecutors said he demanded explicit images from girls as young as 12, distributed CSAM, and even had in-person sexual encounters with minors. The case highlights how predators manipulate children in spaces meant for gaming and community.
  • Telegram: SurvivorsRights.com has documented how Telegram hosts bots that generate AI-created sexual abuse material, raising serious concerns about the company’s child-safety measures. It remains to be seen whether Telegram’s partnership with the Internet Watch Foundation will meaningfully reduce CSAM on the platform.
  • Instagram: A Stanford Internet Observatory study revealed networks using Instagram to advertise and trade self-generated CSAM, with algorithms even recommending exploitative accounts.
  • Twitter/X: Researchers have found lapses in detection, with known CSAM remaining live on public profiles despite the platform’s use of image hashing.
  • Gaming platforms: Roblox and other services with live chat have also been linked to grooming, sextortion networks, and CSAM distribution.

When companies fail to enforce safety policies, predators thrive. Advocacy groups continue to call for stricter accountability from tech giants whose services remain vulnerable to exploitation.

How Parents and Institutions Can Help

While technology companies have a responsibility to prevent CSAM, parents, schools, and community organizations remain a critical line of defense. Prevention steps include:

  • Teaching children early that it is never safe to share nude or sexual images, even with someone they trust.
  • Monitoring privacy settings on apps and games, and knowing who your child interacts with online.
  • Explaining that predators may pose as same-age peers to gain trust.
  • Encouraging open conversations so children feel safe reporting pressure, threats, or inappropriate requests.
  • Training teachers, coaches, and mentors to spot grooming behaviors both in person and online.

Reporting CSAM

If you encounter CSAM online, never share or download the material. Report it immediately to the CyberTipline at missingkids.org/cybertipline or call 1-800-843-5678. You can also notify local law enforcement.

CSAM is not just a personal danger for children. It is a systemic problem fueled by gaps in platform oversight and institutional accountability. Families, schools, and organizations must stay informed, while tech companies must be held accountable when their platforms are used to harm children. By talking openly with kids, monitoring online spaces, and demanding stronger protections from digital platforms, we can help prevent CSAM and protect children from exploitation.

If you or a loved one has been impacted by child sexual abuse or online exploitation, know that you are not alone. Visit our Institutional Sexual Abuse Guide to learn more about your legal rights and the steps you can take to protect your family and seek justice.

Are you a survivor who doesn’t know where to turn for legal help?

Fill out the brief, confidential form so we can connect you with an empathetic attorney who will help you understand your options. There is no pressure or obligation.

