California Man Sentenced to 14 Years for Exploiting Girls on Discord


A California man who used the messaging platform Discord to sexually exploit a dozen underage girls has been sentenced to 14 years in federal prison, People.com reported yesterday.

James Styner, 20, of Garden Grove, pleaded guilty earlier this year to coercion and enticement of a minor, distribution of child pornography, and three counts of receipt of child pornography. Prosecutors said his conduct began when he was 17 and continued until his arrest at 19.

According to prosecutors, Styner demanded explicit images from girls as young as 12 and distributed child sexual abuse material (CSAM) to others. He admitted to having “in-person sexual relationships” with at least two of his victims. As part of his plea agreement, Styner acknowledged conduct involving 12 identified victims, though prosecutors noted there were others.

Prosecutors said Styner “engaged in a pervasive online campaign to manipulate and exploit vulnerable girls for his own sexual satisfaction.”

The sentence also includes seven years of supervised release, and Styner will be required to register as a sex offender.

The case was investigated by the Metropolitan Police Department-FBI Child Exploitation Task Force with assistance from the West Covina Police Department and the U.S. Attorney’s Office for the Central District of California.

“No man will be allowed to exploit, harm and victimize children under my watch,” U.S. Attorney Jeanine Ferris Pirro said. “They will be hunted down, prosecuted and then face the full weight of justice. Whether you are behind a screen or behind closed doors—we will find you and convict you.”

While this case involved Discord, it is representative of a wider pattern of predators exploiting popular online platforms, including:

  • Telegram, where bots capable of generating AI-created sexual abuse material have been discovered.
  • Instagram, identified by researchers as a hub for networks trading self-generated CSAM, sometimes aided by algorithms.
  • Twitter (now X), where lapses in detection left illegal material visible on public profiles.
  • Gaming platforms such as Roblox, which have been used by predators for grooming and sextortion schemes.

Advocates say companies need stronger safeguards, faster reporting systems, and better cooperation with law enforcement to stop predators from exploiting children online.

According to the National Center for Missing & Exploited Children (NCMEC), more than 36 million reports of CSAM were made in 2023, a record number. Experts stress that while parents can set rules and monitor devices, institutions and platforms must do more to prevent predators from operating in digital spaces where children gather.

The Styner case is a reminder that online platforms can be as dangerous as physical spaces when safeguards fail. It is also part of a growing national crisis around CSAM and online child exploitation.

If you or a loved one has been impacted by online exploitation or child sexual abuse, you are not alone. Visit our Institutional Sexual Abuse Guide to learn about your rights and how to seek justice.

GET A FREE CASE EVALUATION
No pressure. No obligation.

Knowledge Sparks Reform for Survivors.
Share This Story With Your Network.
