Roblox Now Demands a Selfie to Chat

According to Engadget, Roblox Corporation is now requiring age verification for any user who wants to use the in-game chat feature in all available regions. Starting January 7, players in the US and abroad will need to submit to facial age estimation via a selfie, while users 13 and older can opt for ID-based checks. The company says these features were trialed late last year in Australia, New Zealand, and the Netherlands, where half of all daily active users have already completed verification. The facial estimation is handled by a third-party vendor called Persona, and Roblox claims the images are deleted immediately after processing. Once verified, players are sorted into one of six age groups, from under 9 to 21+, and can generally only chat with their own group and the ones directly adjacent. Players under 9 will have chat disabled by default unless a parent approves it after the verification process.

How the age gates work

So here’s the basic setup. You take a selfie, or show an ID if you’re older, and Roblox slots you into a bracket. The big deal is the communication wall it creates. If you’re verified as, say, 14, you’re basically in a bubble with 13-15 year olds. You can’t just randomly chat with a 17-year-old or a 10-year-old in a public server. Now, there is an escape hatch: the “Trusted Connections” feature for users 13+. That lets you chat more freely with people you’ve connected with via your phone contacts or by sharing QR codes outside of Roblox. But crucially, both parties still need to be age-verified to use it. It’s a layered system—public chat is heavily restricted by age, while private chat with known friends gets more leeway, but only after jumping through the initial verification hoop.
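The bracket-and-adjacency rule described above can be sketched as a simple index comparison. A hypothetical sketch: the article confirms six brackets running from under 9 to 21+ (with a verified 14-year-old landing in a 13-15 bucket), but the other bracket boundaries below are illustrative assumptions, as are the function names.

```python
# Illustrative sketch of the adjacent-bracket chat rule, not Roblox's actual code.
# Only the bracket count (six), the "under 9" / "13-15" / "21+" labels, and the
# same-or-adjacent rule come from the article; the other labels are assumptions.
AGE_GROUPS = ["under 9", "9-12", "13-15", "16-17", "18-20", "21+"]

def can_chat_publicly(group_a: str, group_b: str) -> bool:
    """Public chat is allowed only within the same bracket or a directly adjacent one."""
    i, j = AGE_GROUPS.index(group_a), AGE_GROUPS.index(group_b)
    return abs(i - j) <= 1

def can_use_trusted_connections(a_verified: bool, b_verified: bool,
                                a_is_13_plus: bool, b_is_13_plus: bool) -> bool:
    """Trusted Connections lifts the bracket wall, but per the article it is
    limited to users 13+ and both parties must already be age-verified."""
    return all([a_verified, b_verified, a_is_13_plus, b_is_13_plus])

# A 13-15 user can reach the neighboring brackets but not ones two steps away:
assert can_chat_publicly("13-15", "16-17")
assert not can_chat_publicly("13-15", "18-20")
# Trusted Connections fails if either side skipped verification:
assert not can_use_trusted_connections(True, False, True, True)
```

The point the sketch makes concrete is that verification is the gate for everything: even the "freer" private channel short-circuits to disallowed the moment one side is unverified.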

The privacy and practical trade-offs

Look, requiring a biometric selfie from kids is a massive ask, no matter how you slice it. Roblox is leaning hard on its promise that the images are processed by Persona and then instantly deleted. Of course, they have to say that, so the trust factor here is enormous. And what about the practical side? The article mentions Roblox may ask for re-verification if your behavior doesn’t match your claimed age. That’s a fascinating, and frankly creepy, layer of moderation: it implies their systems are constantly judging whether you “sound” like a 12-year-old or a 16-year-old. How many false flags will that create? The intent to protect younger users from predators and harmful content is obviously good. But the method creates a running record of verification events, even if the raw images are gone, and puts a significant burden of proof on the user. Is this the future of online safety for kids? Basically, trading a slice of privacy and anonymity for a promise of security.

A response to intense pressure

This isn’t happening in a vacuum. Roblox has been under fire for years, facing lawsuits from state attorneys general and constant criticism that its platform isn’t safe for the huge number of children who play it. Last year they restricted under-13 accounts from certain content and DMs. This new chat verification is the next, much more aggressive, step. It’s a direct response to legal and reputational threats. The company is trying to build a “walled garden” where interactions are tightly controlled by verified age. But here’s the thing: does this actually solve the problem, or just move it? Determined bad actors will find ways to bypass verification, or exploit the “Trusted Connections” feature. And for the vast majority of legitimate users, it’s another friction point. It’s a high-stakes bet that users will accept this intrusion as the cost of admission for a safer community. I think we’re about to find out just how much friction a dominant platform can introduce before it starts pushing people away.
