Concerns over child safety are stopping Disney from collaborating with Roblox, a new report has revealed.
With 151 million daily active users, Roblox is one of the biggest games in the world, if not the biggest. And access to that audience is increasingly seen as a lucrative way to market everything from big-name brands to Hollywood blockbusters.
Still, worries over how safe Roblox is for its enormous audience of players, many of whom are under the age of 18, remain off-putting to Disney, a new Variety report has highlighted.
While some had suspected Disney was avoiding Roblox due to its $1.5 billion investment in rival gaming platform Fortnite, sources told Variety that Disney was instead avoiding Roblox specifically because the company did not believe the larger platform was safe at this time.
Disney content is of course deeply embedded within Fortnite, which regularly hosts Star Wars and Marvel crossovers, recently featured a well-received mini-season based on The Simpsons, and now includes a small army of licensed skins for everyone from Maleficent to the Mandalorian. A dedicated Disney Fortnite mode is also in development.
For now at least, Roblox fans shouldn’t expect anything similar, the report continues. That’s despite Roblox’s far larger audience (Fortnite averages between 30 and 40 million daily users) and the platform playing host to other large brands, such as Sonic the Hedgehog and Squid Game.
Following significant, sustained criticism of its player safety protocols and amid multiple lawsuits, Roblox has added a series of stricter requirements designed to limit who its young audience can interact with. Most recently, the platform added facial age verification to gate communication features in select countries, with a U.S. rollout to follow. But this too has been criticized as something of a band-aid.
“The issue is basically the ability of younger players to cheat those systems — to pretend to be older, to use the older siblings, to use facial identity to get into those systems,” Ron Kerbs, CEO of child safety platform Kidas, told Variety. “And we know that it’s happening on TikTok, we know that it’s happening on gaming platforms, on Roblox. Platforms are trying to block kids, and kids are going to find ways to open it up.”
In Roblox’s most recent Safety Snapshot, the company said it “continued to innovate around safety,” and revealed it had open-sourced a version of Roblox PII Classifier, which “has significantly enhanced” its ability to detect and block attempts to violate its policies around sharing personally identifiable information (PII).
Image credit: Katelyn Mulcahy/Getty.
Tom Phillips is IGN’s News Editor. You can reach Tom at [email protected] or find him on Bluesky @tomphillipseg.bsky.social