Roblox Expands Facial Age Checks to Block Children From Chatting With Adult Strangers

Summary: Roblox’s new facial age checks mark a major shift in how the platform handles child safety, arriving amid rising pressure from state attorneys general and national advocacy groups demanding stronger protections for young users. But for many families that have already been harmed by Roblox, the move is too little, too late.


Earlier today, the BBC reported that Roblox announced children will no longer be able to communicate with adult strangers unless they have completed a facial age verification process. The move arrives as lawmakers, parents, and survivor advocates intensify pressure on the platform over its long-standing failures to protect young users from exploitation.

Roblox has been under scrutiny for years for allowing minors to encounter harmful content and communicate with adults in unsafe environments. Even Roblox’s own chief executive, Dave Baszucki, told the BBC earlier this year that parents who are concerned about safety “should not let their children be on Roblox,” a striking admission from the top of the company.

Child safety groups argue the dangers remain widespread. The National Society for the Prevention of Cruelty to Children said that young users continue to face unacceptable risks that leave them vulnerable to grooming and abuse. Advocates welcomed Roblox’s stricter rules but emphasized that the company must prove that these changes actually protect children in real life.

Roblox averaged more than 80 million daily players in 2024, with an estimated 40 percent of users under the age of 13. Because of that enormous child user base, regulators worldwide have pressured the company to adopt more aggressive safety controls. In the United States, there is no single federal law that comprehensively regulates online child safety, but states and federal agencies have begun increasing scrutiny on tech companies that expose minors to harm. Regulators and lawmakers are pushing platforms like Roblox to adopt stronger verification systems, making this update a significant shift toward greater accountability.

Roblox is also facing mounting legal trouble in the United States. Attorneys general in Texas, Kentucky, Florida, and Louisiana have filed lawsuits or launched investigations accusing the platform of failing to protect children from predators.

The company says it will require facial age verification before a user can access chat features, claiming that its age estimation technology can typically determine a user’s age within one to two years for people aged five to twenty-five. The new policy will first roll out in Australia, New Zealand, and the Netherlands in early December before going global in January.

Once verified, users will be placed into age brackets. Players will only be able to chat with others in similar age ranges unless they choose to add a person as a trusted connection, a feature intended for real-life contacts. Children under 13 will continue to be blocked from private messaging unless a parent allows it.

This overhaul comes after repeated reports of adults contacting minors on the platform. A BBC test earlier this year showed that a 27-year-old user and a 15-year-old user could still message each other through loopholes in the system. Roblox responded by saying that abusers often try to move conversations to outside platforms to evade detection.

Roblox says facial verification images are processed through an external provider and deleted immediately once the age check is complete. Parents will retain the ability to manage accounts, including updating a child’s age after verification.

These updates arrive as advocacy groups ParentsTogether Action and UltraViolet hold what they describe as the first virtual protest inside Roblox. They will deliver a digital petition signed by more than 12,000 people urging Roblox to adopt stronger protections. Their message is clear: Roblox must stop functioning as a playground for predators.

For families, the new safety measures signal progress, but they also reinforce a deeper truth. Technology companies rarely change unless survivors and parents demand accountability.

Has Your Child Been Harmed By Roblox?

If you or your child experienced harm on Roblox, you may have legal rights. Our in-depth Roblox Lawsuit Guide explains the latest lawsuits, safety concerns, and how survivors can pursue accountability.

Read the Roblox Lawsuit Guide.

You may also request a free case review by our intake department by filling out the secure, confidential form below.

Click here for more Roblox news coverage.

GET A FREE CASE EVALUATION
No pressure. No obligation.

Knowledge Sparks Reform for Survivors.
