
Grok AI tool used to generate child sexual abuse imagery
A child safety watchdog warns that AI tools like Grok are being used to create child sexual abuse imagery, raising urgent concerns about regulation, enforcement, and survivor protection.

Yet another alarming Roblox cautionary tale: A California lawsuit alleges Roblox enabled the grooming of a 5-year-old boy by an adult posing as a child, leading to an attempted kidnapping near the child’s school.

A mother’s lawsuit describes how her son was allegedly targeted by an adult posing as a child on Roblox, and the claims reveal a scenario many parents never imagine until it is too late.

An Ohio mother is warning other families after her 15-year-old daughter encountered sexually inappropriate content on Roblox, even with parental controls turned on. This adds to the controversy surrounding the platform, which now faces lawsuits in multiple states.

A California man who used the messaging platform Discord to sexually exploit a dozen underage girls has been sentenced to 14 years in federal prison.