Photo: Hannah Peters / Getty Images

Roblox tried a band-aid fix for its child predator problem, but it’s so broken that the platform has descended into total chaos

An absolute trainwreck.

Roblox recently rolled out a mandatory age-verification system intended to serve as a shield against child predators, but less than a week in, the whole thing is broken and has devolved into chaos. The update's main goal was to classify users correctly so they could only chat with people around their own age. Instead, the system is failing spectacularly, classifying children as adults and adults as children, according to Wired.


One report noted that a 23-year-old user was incorrectly identified as being between 16 and 17 years old. As that user put it, “I don’t want to be chatting with f—ing children.” Similarly, an 18-year-old was reportedly lumped into the 13-to-15 age bracket.

The mandatory verification process requires users to either submit a quick selfie for a facial age estimate or, if they are 13 or older, optionally upload a government ID for a more formal check. Reports of predators using the service to groom young children have been growing. Several states, including Louisiana, Texas, and Kentucky, have filed lawsuits against the company, and Florida’s attorney general has even issued criminal subpoenas, showing just how high the stakes are.

The platform made this change because the pressure had become immense

Given the severity of the allegations and the legal action, it’s fair to say Roblox’s long-term survival hinges on whether it can successfully tackle this safety issue. Right now, the rollout is off to a rough start, especially because the problems aren’t limited to adults. Kids are also figuring out how to game the system and get adult classifications, and videos are popping up all over the internet showing children easily spoofing the verification process.

One clever kid drew wrinkles and stubble on his face using a marker and was instantly deemed 21 or older by the AI. Another user simply flashed a photo of the late musician Kurt Cobain and was granted the adult classification. This level of easy circumvention is a massive vulnerability, and it means the age-gated chat feature is essentially useless if a determined child wants to bypass it.

Compounding the problem, Roblox recently acknowledged that some parents are submitting age checks on behalf of their kids. When this happens, the child ends up being placed in the 21+ category. This is a huge oversight, but the company says it’s aware of the issue and is “working on solutions to address” that specific problem, with an update promised soon.

Meanwhile, the developers who actually build games on the platform are furious. The developer forums are reportedly flooded with thousands of negative comments about the update, with many demanding that the change be reversed entirely. One developer shared data showing that the percentage of users engaging in chat plummeted from around 90 percent to a shocking 36.5 percent.

Developers are already describing their games as feeling “lifeless” or like “a total ghost town.”
