New Bipartisan Bill Aims to Prevent AI From Generating Child Sexual Abuse Material

Pictured is U.S. Senator John Cornyn, R-TX, co-author of the Preventing Recurring Online Abuse of Children Through Intentional Vetting of Artificial Intelligence (PROACTIV AI) Data Act.
Summary: AI-generated child abuse images are skyrocketing. A new bipartisan bill targets how tech companies train AI—and how to stop predators from exploiting the code.

As artificial intelligence rapidly advances, lawmakers are turning their attention to a disturbing development: the use of AI tools to generate child sexual abuse material (CSAM). In response, U.S. Senators John Cornyn (R-TX) and Andy Kim (D-NJ) have introduced a bipartisan bill to address how AI companies train their models and what safeguards are in place to prevent the exploitation of children, according to a July 22 press release from Senator Cornyn’s office.

The Preventing Recurring Online Abuse of Children Through Intentional Vetting of Artificial Intelligence (PROACTIV AI) Data Act would encourage AI developers to proactively identify, remove, and report known CSAM from the massive datasets used to train image-generation tools. According to the bill’s sponsors, the legislation is designed to stop AI platforms from unintentionally becoming tools for creating explicit material involving children.

“Modern predators are exploiting advances in AI to develop new AI-generated child sexual abuse material, and technology companies are often unwittingly giving them the tools to do so,” said a sponsoring senator. “By encouraging tech companies to proactively screen their datasets to remove and report explicit images of children, this legislation would mitigate the risk of AI platforms unintentionally enabling the creation of new content depicting child sexual abuse and better safeguard children online.”

“As we develop AI models, it is important that we establish critical protections to look out for the most vulnerable in digital spaces,” said another co-sponsoring senator. “This bill is an opportunity for Congress and AI developers to take an important step forward together and implement the necessary safeguards to keep our children safe from future misuse or exploitation.”

The concern stems from how foundational AI models are trained. These models depend on large datasets, often scraped from the internet, and many companies do not adequately screen that content because of its sheer volume and the processing demands involved. A recent Stanford University study found more than 3,000 likely CSAM images embedded in LAION-5B, one of the most widely used public image datasets. Image-generation tools trained on such datasets could therefore be manipulated into producing illegal content.

According to the National Center for Missing & Exploited Children (NCMEC), the growth of AI-generated CSAM has been staggering. In just the first half of the year, nearly half a million cases were reported—compared to fewer than 70,000 during all of the previous year.

The PROACTIV AI Data Act would:

  • Direct the National Institute of Standards and Technology (NIST) to issue voluntary best practices for AI developers to screen training datasets for known CSAM;
  • Direct the National Science Foundation to support research into new technologies for detecting and removing CSAM from datasets;
  • Offer limited liability protection to companies that act in good faith and follow the best practices, shielding them from penalties for inadvertently ingesting harmful content through automated data collection.

The bill builds on previous federal efforts to regulate AI and online safety and follows a growing chorus of concern about how emerging technologies are being co-opted by bad actors. It also complements broader advocacy and prevention efforts to keep children safe from exploitation in both physical and digital spaces.

If you or a loved one are a survivor of sexual abuse, SurvivorsRights.com may help connect you with an attorney who specializes in these types of cases. Fill out the brief, confidential form for a free case evaluation.


