Roblox, the online gaming platform used by millions of children and adults worldwide, now requires age verification for users who want to access its chat features.
The new policy launched in early January and comes as the company faces a growing number of lawsuits alleging it failed to adequately protect children from online predators.
Roblox says users must complete a facial scan that estimates their age before they can access chat functions. According to the company, the system then places users into age-appropriate default settings, including limits on who they can communicate with.
“So that’s a quick and easy process where we take a quick scan of your face, we estimate your age,” said Eliza Jacobs, senior director of product policy at Roblox.
Jacobs said the goal is to create safer, peer-based interactions for minors on the platform.
“Then, we’ll put you in the right set of defaults and settings to have an age-appropriate experience on the platform, which includes only chatting with folks that are in your peer group,” she said.
The move comes amid criticism and dozens of lawsuits accusing Roblox of not doing enough to prevent sexual exploitation of minors.
Attorney Matthew Dolman says his firm represents more than a thousand families and has filed dozens of lawsuits against Roblox. Dolman believes the change could have been implemented much sooner.
“Why didn’t they do this many years sooner? Why is safety now a concern?” Dolman said. “There was technology available to make this much, much safer for years now.”
Dolman and other attorneys have alleged in lawsuits the platform allowed predators to groom and exploit children.
“I mean, it’s systemic. It is in no way isolated,” Dolman said. “These families have gone through hell.”
NBC6 previously reported on the case of an 11-year-old South Florida girl who investigators say was sexually abused by a 19-year-old man she met through Roblox in 2022. The man, Anthony Borgesano, is now serving a 25-year prison sentence.
Dolman also represents the family of another 11-year-old girl who he says was groomed on the platform and coerced into sharing explicit images of herself. That civil case has not yet been filed.
In a statement shared through Dolman, the girl’s father reacted to the new safety measures, writing in part:
“…This is a publicly traded company that had every opportunity to implement stringent safety measures to protect children. All Roblox cared about is returning money to shareholders by allowing as many individuals onto their platform as possible, at the expense of children’s safety.”
Jacobs said the company sympathizes deeply with affected families.
“Even one case like that is one too many, and our hearts go out to those families that are dealing with that,” she said.
Roblox maintains it has long used a multi-layered safety system, including filtered chats, and notes that image sharing is not allowed on the platform.
The company says facial age-verification technology was not accurate enough until recently.
“As this technology became available and accurate enough to use in this way, we wanted to utilize it,” Jacobs said.
NBC6 also found dozens of videos online showing users attempting to bypass Roblox’s age-verification system. Jacobs acknowledged that no system is foolproof.
“No system is perfect, but we’re rolling this out now because we are confident in its accuracy,” she said.
Jacobs said Roblox has millions of daily users and emphasized that biometric data collected during age verification is deleted immediately after the scan.
The civil lawsuit involving Borgesano remains pending.
