At a Glance
- Roblox’s new AI face-scan age verification launched globally last week and is already misfiring, mis-aging players and locking them out of chat.
- Verified accounts for children as young as 9 are being sold on eBay for $4, undermining the safety system.
- Players and developers are demanding a rollback, with only 50% of users completing verification in early test markets.
- Why it matters: The feature was meant to stop predators from grooming kids, but errors and black-market accounts may make the problem worse.
Roblox’s global rollout of an AI-powered age-verification system is in crisis just days after launch, with players locked out of chat, developers in open revolt, and black-market listings selling kids’ verified accounts for pocket change.
The platform now requires users to scan their face or upload a government ID before they can access chat features. The check is nominally optional, but refusing it bars access to one of Roblox's core functions. The system, built by third-party provider Persona, estimates age from a short video selfie and sorts users into restricted chat pools. A player verified as under 9 can talk only to others up to age 13, while a 16-year-old can reach users aged 13-20.
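The pooling logic can be pictured as mutual reachability between age bands. The sketch below is purely illustrative: only the under-9 and 16-year-old rules come from Roblox's public description; the other brackets, the function names, and the exact cutoffs are assumptions, since the real policy is not published.

```python
def reachable_range(age: int) -> tuple[int, int]:
    """Return the (min, max) ages this verified user may chat with.

    Only the first and fourth bands reflect rules reported in the article;
    the rest are invented placeholders for illustration.
    """
    if age < 9:
        return (0, 13)     # reported: under-9 users can talk to others up to 13
    if age < 13:
        return (0, 15)     # assumed
    if age < 16:
        return (9, 18)     # assumed
    if age < 18:
        return (13, 20)    # reported: a 16-year-old can reach users aged 13-20
    return (16, 120)       # assumed adult band

def can_chat(age_a: int, age_b: int) -> bool:
    """Chat is allowed only if each user falls inside the other's range."""
    lo_a, hi_a = reachable_range(age_a)
    lo_b, hi_b = reachable_range(age_b)
    return lo_a <= age_b <= hi_a and lo_b <= age_a <= hi_b
```

Requiring the check in both directions means an adult cannot reach a child's pool even if the child's own range would technically overlap the adult's age, which is presumably the point of the restricted pools.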
System Errors and Backlash
Since the worldwide launch, social media, Reddit, and developer forums have filled with complaints. Multiple players report being mis-aged, with children flagged as adults and adults as children, cutting them off from friends. Developers are publicly calling on Roblox to reverse the update, citing lost engagement and user frustration.
In a Friday status post, Roblox acknowledged that parents are verifying on behalf of children, resulting in kids being tagged as 21+, but offered no immediate fix beyond “working on solutions.”
Black-Market Loophole
News Of Fort Worth's investigation found eBay listings advertising age-verified Roblox accounts for minors as young as 9, priced at $4 each. After News Of Fort Worth flagged the listings, eBay spokesperson Maddy Martinez said they were removed for violating site policies. The existence of such listings undercuts Roblox's claim that age verification will prevent adults from posing as children.

Origins and Legal Pressure
Roblox announced the requirement last July as part of a broader safety push. The move followed multiple lawsuits and state actions:
- Attorneys general in Louisiana, Texas, and Kentucky sued the company, alleging it failed to protect young users and facilitated grooming.
- Florida's attorney general issued criminal subpoenas to determine whether Roblox is "aiding predators in accessing and harming children."
The company says verified age gates will stop unknown adults from chatting with minors, yet early data show adoption problems. In the November pilot across Australia, New Zealand, and the Netherlands, only 50% of players completed verification.
Privacy and Participation Concerns
Roblox states that all personal data are "deleted immediately after processing," but privacy worries persist. Many users refuse verification on principle, effectively losing chat access. Critics argue the system creates a new commodity, verified child accounts, while doing little to deter determined predators who can buy or borrow credentials.
Key Takeaways
- Technical flaws are blocking legitimate users and misclassifying ages.
- Verified child accounts are already for sale online, eroding safety gains.
- Low uptake in pilot regions signals potential long-term engagement issues.
- Legal and public pressure drove the rapid rollout, but execution gaps may expose Roblox to further backlash.