Just days after launching, Roblox’s much-hyped AI-powered age verification system is a complete mess.
Roblox’s face-scanning system, which estimates people’s ages before they can access the platform’s chat functions, rolled out in the US and other countries around the world last week, after initially launching in a few locations in December. Roblox says it is implementing the system to allow players to safely chat with others of similar ages.
But players are already in revolt because they can no longer chat with their friends, developers are demanding Roblox roll back the update, and, crucially, experts say that not only is the AI mis-aging young players as adults and vice versa, the system does little to address the problem it was designed to tackle: the flood of predators using the platform to groom young children.
In fact, WIRED has found multiple examples of people advertising age-verified accounts for minors as young as 9 years old on eBay for as little as $4.
After WIRED flagged the listings, eBay spokesperson Maddy Martinez said the company was removing them for violating the site’s policies.
In an email, Roblox’s chief safety officer, Matt Kaufman, told WIRED that a change of this magnitude on a platform with over 150 million daily users takes time.
“You can’t flip a switch while building something that hasn’t existed before,” he said. “Expecting the system to be flawless overnight is ignoring the scale of this undertaking.”
Kaufman said the company was happy with the uptake, adding that “tens of millions of users” have already verified their age, which he claimed proved that “the vast majority of our community values a safer, more age-appropriate environment.”
The company also addressed some of the criticism in an update on Friday, writing: “We are aware of instances where parents age check on behalf of their children leading to kids being aged to 21+. We are working on solutions to address this and we’ll share more here soon.”
Roblox announced the age verification requirement last July as part of a raft of new features designed to make the platform safer. The company has come under intense pressure in recent months after multiple lawsuits alleged that it failed to protect its youngest users and made it easier for predators to groom children.
The attorneys general of Louisiana, Texas, and Kentucky also filed lawsuits against the company last year making similar claims, while Florida’s attorney general issued criminal subpoenas to assess whether Roblox is “aiding predators in accessing and harming children.”
Roblox claims that requiring people to verify their ages before allowing them to chat with others will prevent adults from freely interacting with children they don’t know.
While the process is optional, refusing to do it means a person will no longer have access to the platform’s chat functions, one of the key reasons most people use Roblox.
To verify their ages, people are asked to take a short video using their device’s camera, which is processed by a company called Persona that estimates their age. Alternatively, users can upload a government-issued photo ID if they are 13 or older.
Roblox says all personal information is “deleted immediately after processing.” However, many users online say they are unwilling to complete age verification because of privacy concerns.
People who have verified their ages are only allowed to chat with a small group of other players around their own age. For example, those verified as under 9 can only chat with players up to the age of 13, while players deemed to be 16 can chat with players between 13 and 20.