Roblox Rolls Out Mandatory Age Checks for All Chat Users
Roblox has begun its highly anticipated mandatory age verification rollout, requiring all users who access communication features on the platform to complete age verification (facial analysis, identity checks, and, for minors, parental consent) by the end of 2025. The enforcement marks one of the most significant child safety reforms in the gaming platform’s history.
The rollout begins today with a voluntary age check period, gradually transitioning to mandatory requirements across the platform’s 70 million daily users.
Three-Layer Verification System
Roblox’s age verification approach combines facial age estimation, government ID checks, and parental consent for minors. Users accessing text or voice chat will need to submit a selfie for AI-powered analysis, which categorizes them into three age brackets: under 13, 13 through 17, or 18 and older.
The technology, powered by third-party verification provider Persona, scans video selfies and compares facial features against a diverse dataset to estimate age. For users placed in the under-13 group, certain personal data, including email address and phone number, will be removed from their Roblox accounts.
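To make the bracketing described above concrete, here is a minimal sketch of how an estimated age could be mapped to those three groups and to the associated data handling. The function name, thresholds, and data fields are illustrative assumptions for this article, not Roblox's or Persona's actual implementation.

```python
# Hypothetical sketch: mapping an estimated age to the three brackets
# described above. Names and thresholds are illustrative assumptions,
# not Roblox's or Persona's actual logic.

from dataclasses import dataclass


@dataclass
class VerificationResult:
    bracket: str               # "under_13", "13_17", or "18_plus"
    strip_contact_data: bool   # whether email/phone should be removed


def classify_estimated_age(estimated_age: float) -> VerificationResult:
    """Place an estimated age into one of the article's three brackets."""
    if estimated_age < 13:
        # Under-13 accounts have certain contact details removed.
        return VerificationResult(bracket="under_13", strip_contact_data=True)
    if estimated_age < 18:
        return VerificationResult(bracket="13_17", strip_contact_data=False)
    return VerificationResult(bracket="18_plus", strip_contact_data=False)


if __name__ == "__main__":
    for age in (9.4, 15.2, 24.0):
        print(age, classify_estimated_age(age))
```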
Enhanced Safety Protections
A key safeguard prevents communication between adults and minors unless both parties can verify they know each other in real life. Roblox will also enforce restrictions that limit users to content and features deemed appropriate for their verified age group.
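As a rough illustration of that safeguard, the sketch below checks whether two users may chat: peers in the same broad group are unaffected, while an adult and a minor may only communicate if each has the other as a verified real-life connection. The data model and rule are assumptions made for illustration, not the platform's actual logic.

```python
# Hypothetical sketch of the adult-minor chat safeguard described above.
# The data model and rule are illustrative assumptions only.

from dataclasses import dataclass, field


@dataclass
class User:
    user_id: str
    is_minor: bool                                # verified age under 18
    trusted_connections: set[str] = field(default_factory=set)  # verified real-life contacts


def chat_allowed(a: User, b: User) -> bool:
    """Block adult-minor chat unless both sides have verified they know
    each other in real life; peer-to-peer chat is not affected by this rule."""
    if a.is_minor == b.is_minor:
        return True
    return (b.user_id in a.trusted_connections
            and a.user_id in b.trusted_connections)
```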
These new measures expand on existing safety tools. Earlier this year, Roblox introduced Trusted Connections, which adds age-based controls to friend requests, and Roblox Sentinel, an AI system that flags potentially harmful interactions. The company is also working with the International Age Rating Coalition to implement standardized content ratings from agencies such as ESRB and PEGI.
Responding to Regulatory Pressure
The announcement comes amid mounting scrutiny of online platforms’ child safety practices. Lawsuits filed by state attorneys general, including in Louisiana, have accused Roblox of failing to adequately protect children, intensifying pressure for reform. Globally, stricter requirements are emerging, with the United Kingdom’s Online Safety Act setting new standards.
Matt Kaufman, Roblox’s Chief Safety Officer, stated: “We’re taking this step as part of our long-term vision as a platform for all ages. We expect that our approach to communication safety will become best practice for other online platforms.”
Privacy and Technical Concerns
While Roblox argues that its facial age estimation technology is accurate across age groups, including teenagers, researchers caution that such systems can be prone to error, bias, or spoofing. Questions remain about data security and whether independent oversight will verify the company’s reliability claims.
Despite these concerns, Roblox maintains that the multilayered approach is critical for improving safety across its massive user base, positioning itself as a leader in online platform compliance and safety standards.