
Roblox, a platform enjoyed by approximately 80 million users every month, is under increasing pressure to enhance safety measures for its young audience. A significant number of these users are children, which has prompted a closer examination of the platform’s safety protocols.
In a bid to create a safer environment, Roblox has implemented new security measures requiring users to submit either a photo with their government-issued ID or a selfie for AI-powered “age estimation” before they can access the chat feature. The decision follows a series of lawsuits alleging that the sandbox game has become a hazardous space, putting children at risk of being targeted by predatory adults.
As one of the world’s most beloved games for children, Roblox allows players to navigate various user-generated worlds and game modes through customizable avatars. The company, aiming to address the safety concerns, announced these policy changes earlier this year.
The new verification system will initially roll out in Australia, New Zealand, and the Netherlands. There are plans to extend this requirement to additional regions, including the United States, in the upcoming year.
To complete verification, users must upload a photo of themselves along with a government-issued ID, such as a passport or driver’s license.
But unlike most social media apps, which have a minimum age of 13, Roblox allows younger children to use the platform. Since most children and young teens don’t have an ID, they will need to submit a selfie to the platform for verification using a third-party AI scan.
Roblox says it has partnered with Persona, an identity verification service, to verify users with an AI system.
“We estimate your age by analyzing a selfie of your face and examining your facial features,” the company explained on its website. “Your estimated age helps place you in the appropriate age group (under 13, 13+ and 18+) to customize your experience on Roblox. If you are placed in the under-13 age group based on facial age estimation, certain personal data, including your email and phone number, will be removed from Roblox.”
The new chat system being rolled out with the verification requirement is “age-based,” meaning it will limit users’ ability to interact with people outside their age group.
Roblox didn’t provide details on how accurate its age estimation feature is, but Engadget reports that the company’s Chief Safety Officer, Matt Kaufman, called the verification “pretty accurate.”
“What we find is that the algorithms between that 5 and 25 years old (range) are typically pretty accurate within one or two years of their age,” Kaufman reportedly said during a briefing with reporters.
Roblox has come under increased scrutiny from lawmakers and law enforcement officials in recent years, partly because it allows young users to create accounts and interact with others.
Earlier this year, the Attorneys General of Louisiana and Kentucky filed separate lawsuits against the company, accusing Roblox of harming children. Florida’s Attorney General has issued a criminal subpoena for information about the company, calling the platform a “breeding ground for predators.”
Alongside the increased political pressure, Roblox also faces criticism from advocates, parents and law enforcement officials.
The platform has, for years, been accused of allowing content not appropriate for children and endangering them. In April, NBC News reported that a California man had been arrested for kidnapping a 10-year-old boy he met on the platform.