Roblox is preparing to introduce an AI-powered facial age estimation system this month as the company faces a growing wave of lawsuits from states and families who say the gaming platform falsely marketed itself as safe for minors. The new system will require users to verify their age before they can access any form of communication on the platform, a change Roblox says is designed to reduce the risk of adults misrepresenting themselves to children, according to a report yesterday from Raleigh, NC-based WRAL.
“That will be required to engage in any kind of communications on the platform,” said Eliza Jacobs, Senior Director of Product Policy at Roblox. “So, until you use facial age estimation, you will not have access to chat anywhere on the platform.” Jacobs told WRAL Investigates that the company plans to launch the feature on November 18 in several regions, including Australia, before rolling it out in the United States.
A plaintiff’s attorney whose firm represents thousands of families nationwide said the company’s decision reflects increasing pressure to address safety issues. “I think Roblox is starting to get the message, in that they’re putting up hundreds of millions of dollars most recently into adding safeguards,” he said. One of the families his firm represents, from Wake County, alleges that their daughter was contacted by an adult predator while playing Roblox at age 13. The lawsuit claims the communication then moved to Discord, where the predator coerced the teen into sending sexually explicit content.
Jacobs addressed these cases directly. “Even one of these cases is too many and our hearts go out to the families who have been affected in this way. It is unacceptable and we are working around the clock to try and prevent this as much as we can.”
Roblox has added several safety features in recent years. In November 2024, the company launched parent accounts that can be linked to a child’s existing or new Roblox profile. WRAL Investigates tested the tool and found the setup process straightforward. Parents verify their identity through an ID or credit card and can then manage the maturity level of experiences their child can access, review their child’s connections, and control access to chat. Lawsuits claim that chat functions are frequently exploited by predators to build trust with and deceive minors.
Jacobs said Roblox is pushing to raise awareness of these parental features. “We have done lots of communications about this, but we obviously need to do more to really get the word out. We want parents to be engaged with our parental controls and the systems we have to keep their kids safe,” she said, noting that the company’s approach is guided by a “safe by default” strategy.
Despite the upcoming safety tools and expanded parental controls, some advocates remain skeptical. “Until all of those measures are foolproof, tested and retested, I don’t think Roblox is safe to use for any minor,” said the plaintiff’s attorney. Jacobs responded by reaffirming Roblox’s ongoing efforts. “We are building systems every day to make [Roblox] even safer.”
Roblox’s rollout of age estimation technology signals a broader shift in how the company is responding to a growing number of legal claims. The new lawsuits allege that Roblox failed to adequately protect minors and misled families about the risks children face on the platform. As the company prepares to introduce its most significant safety update to date, the litigation continues to highlight troubling patterns of abuse facilitated through online gaming environments.
Concerned about your child’s safety on Roblox? Learn more about ongoing lawsuits and how families are pursuing justice. Visit our Roblox Lawsuit Guide and fill out the confidential, secure form below for a free case review.