Roblox Bans Adult Strangers From Chatting With Kids! Here's Why...

Roblox, the popular online gaming platform, is taking significant steps to protect its younger users. Facing a string of lawsuits alleging the platform's design made children vulnerable to predators, Roblox will soon implement age checks for communication. This means children will be blocked from chatting with adult strangers and much older teens they don't know in real life.

The new system will use facial age estimation technology to place users into age groups: under nine, nine to 12, 13 to 15, 16 to 17, 18 to 20, and 21 and over. Children will only be able to communicate with others within their own age group. Roblox says it is the first online gaming or communication platform to require age checks before users can chat, setting a new safety standard for the industry.
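To make the mechanism concrete, here is a minimal sketch of how such a same-group rule could work in principle. This is purely illustrative and assumes a simple bucketing scheme based on the age bands Roblox has announced; the function names, the exact boundaries, and the strict same-group rule are assumptions, not Roblox's actual implementation.

```python
# Illustrative sketch only: names, boundaries, and the strict same-group rule
# are assumptions based on the announced age bands, not Roblox's real system.

AGE_GROUPS = [
    ("under_9",  0,   8),
    ("9_to_12",  9,  12),
    ("13_to_15", 13, 15),
    ("16_to_17", 16, 17),
    ("18_to_20", 18, 20),
    ("21_plus",  21, 200),
]

def age_group(estimated_age: int) -> str:
    """Map an estimated age (e.g. from facial age estimation) to a group label."""
    for label, low, high in AGE_GROUPS:
        if low <= estimated_age <= high:
            return label
    raise ValueError("estimated age out of supported range")

def can_chat(age_a: int, age_b: int) -> bool:
    """Allow chat only when both users fall into the same age group."""
    return age_group(age_a) == age_group(age_b)

# Example: a 10-year-old and a 25-year-old land in different groups, so chat is blocked.
print(can_chat(10, 25))  # False
print(can_chat(10, 11))  # True
```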

This initiative will be rolled out in phases, starting in Australia, New Zealand, and the Netherlands next month, with a global expansion planned for early January. The company emphasizes that privacy is a priority; images and videos used for facial age estimation will be deleted immediately after processing.

Matt Kaufman, Chief Safety Officer at Roblox, believes this move will establish a new gold standard for communication safety. The company hopes that by limiting interaction between minors and adults, they can create a safer and more age-appropriate environment for all users. This represents a significant shift in how online platforms approach age assurance, moving beyond simple self-declared age to more advanced technological methods.

Why is Roblox doing this?

The decision comes after a series of lawsuits alleging that Roblox's design made “children easy prey for paedophiles.” The platform, which boasts 150 million daily players, has faced increasing scrutiny over its safety measures. This new system aims to address these concerns and provide a more secure environment for young users.

How will it work?

  • Facial age estimation technology will categorize users into age groups.
  • Children can only chat with others in their age group.
  • The system will be rolled out in phases, starting in select markets.
  • Privacy is a priority; images and videos are deleted immediately after processing.

This initiative represents a major step towards protecting children online and could pave the way for other platforms to adopt similar safety measures. It remains to be seen how effectively this system will be implemented and enforced, but it signals a commitment from Roblox to prioritize the safety and well-being of its young users.