
Roblox tightens age verification to protect young users
- Roblox will implement a new age verification system requiring users to complete facial checks to access chat features.
- The initiative comes as part of Roblox's effort to enhance child safety following multiple lawsuits and criticism of its policies.
- These changes aim to foster a safer environment for young users, enabling them to interact only with peers in their age group.
Story
Roblox, the popular gaming platform, is tightening its user safety protocols with a stricter age verification system, particularly for chat features. The measure will roll out first in Australia, New Zealand, and the Netherlands in December 2025, followed by global implementation in January 2026. The change responds to growing concerns over child safety on the platform and a push by various states and countries toward stronger age verification regulations.

Users will be required to undergo facial age estimation by submitting a video selfie; Roblox says the process is secure, respects privacy, and that the selfie is destroyed after verification. Users will also be sorted into age groups that restrict communication with significantly older individuals, aiming to create a safer interaction environment.

Roblox says the changes will bring it into compliance with these regulations and address criticism of its child safety record, which has drawn numerous lawsuits alleging the platform facilitates child exploitation. Experts emphasize the importance of effective safety measures, and Roblox asserts its commitment to ongoing improvements and rigorous monitoring of platform content. While welcoming the new measures, child safety organizations call for further action to ensure that safety protocols effectively mitigate risks posed by adult users to minors.