Roblox, the popular online gaming platform with over 150 million daily active users, is implementing mandatory facial age estimation to restrict children from communicating with adult strangers in chats. The update, announced on November 18, 2025, aims to create age-appropriate interactions and curb risks from predators, as the company faces a wave of lawsuits alleging failures in child safety.
🚨 Roblox just announced MASSIVE safety changes.

Kids will soon be blocked from chatting with adults and anyone outside of their “age groups”. No in-game chat, no messages, etc. If you’re a kid, you’ll only be able to talk with kids around your age.

Source: The Guardian pic.twitter.com/d6k8TVikbQ

— KreekCraft (@KreekCraft) November 18, 2025
How the New System Works
The rollout starts today on a voluntary basis; age verification will then become mandatory for all chat access. Users must use the Roblox app’s camera for a quick facial scan or, for those over 13, upload a government ID to confirm their age group. The AI-powered process, handled by third-party vendor Persona, immediately deletes all images and videos after estimation; no data is stored. Players will be sorted into six broad age bands:
- Under 9
- 9–12
- 13–15
- 16–17
- 18–20
- 21+
Chat will be limited to users in the same or adjacent groups; for instance, a 12-year-old can only talk to those under 16. Children under 9 will have in-game chat disabled by default unless parents consent via linked accounts. Exceptions exist for “trusted connections,” such as family members verified via QR codes or contacts.
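The same-or-adjacent-band rule can be sketched as a simple lookup. This is purely an illustration, not Roblox’s actual implementation: the band boundaries come from the article, while the function names and logic are hypothetical, and the sketch omits special cases such as the under-9 default chat restriction and trusted connections.

```python
# Illustrative sketch of the adjacent-age-band chat rule.
# Band boundaries are from the announcement; everything else
# (names, structure) is a hypothetical simplification.

AGE_BANDS = [
    ("Under 9", 0, 8),
    ("9-12", 9, 12),
    ("13-15", 13, 15),
    ("16-17", 16, 17),
    ("18-20", 18, 20),
    ("21+", 21, 200),
]

def band_index(age: int) -> int:
    """Return the index of the age band containing `age`."""
    for i, (_, lo, hi) in enumerate(AGE_BANDS):
        if lo <= age <= hi:
            return i
    raise ValueError(f"no band for age {age}")

def can_chat(age_a: int, age_b: int) -> bool:
    """Chat is allowed only within the same or an adjacent band."""
    return abs(band_index(age_a) - band_index(age_b)) <= 1

# A 12-year-old (band 9-12) reaches bands Under 9 through 13-15,
# i.e. only users under 16:
print(can_chat(12, 15))  # True
print(can_chat(12, 16))  # False
```

Under this rule, a user’s reachable audience is at most three consecutive bands, which matches the article’s example of a 12-year-old being limited to users under 16.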
Enforcement begins in early December in select markets, including Australia, New Zealand, and the Netherlands, and goes global by early January 2026.
Roblox claims this makes it the first major gaming or communication platform to require such checks for chat, positioning it as a “gold standard” for safety.

Backdrop of Lawsuits and Safety Scrutiny
The timing aligns with escalating legal pressures. Roblox faces suits from attorneys general in Texas, Kentucky, and Louisiana, plus private claims, accusing it of enabling grooming, abuse, and exploitation of children as young as seven. One recent Nevada case details a predator targeting a 13-year-old girl, coercing explicit content after building trust on the platform. Florida issued a criminal subpoena, and critics like lawyer Matt Dolman have filed dozens of cases since the pandemic.
Roblox defends its existing measures, including strict chat filters for under-13s, bans on image and video sharing, blocks on sharing personal information, and law enforcement collaboration, but argues the new verification adds a critical layer.
Expert and Community Reactions
Safety advocates praise the move. Jules Polonetsky of the Future of Privacy Forum called it a “privacy-preserving” advancement, while Ofcom’s Anna Lucas welcomed platforms stepping up.
On X, reactions are mixed: some hail the protections, while others worry about AI inaccuracies (e.g., adults misclassified as minors), privacy despite assurances, and impacts on roleplay games or family play. Roblox has clarified that already ID-verified users skip the scan, that custom matchmaking groups similar ages on servers, and that all chat (including in-game signs) routes through filtered services.
This overhaul builds on Roblox’s prior safety expansions, like September’s age estimation plans and July’s trusted connections, signaling a broader push amid global regulations like Australia’s under-16 social media ban.