Roblox Rolls Out Age Verification and Universal Content Ratings

Summary
– Roblox is expanding its age-estimation technology to all users who engage with communication features and partnering with the International Age Rating Coalition to provide age and content ratings.
– The age-estimation system involves scanning users’ selfies and analyzing facial features, combined with ID verification and parental consent for accuracy.
– The partnership with IARC will replace Roblox’s own content labels with ratings from agencies like ESRB in the U.S. and PEGI in Europe, based on factors like violence or adult language.
– These updates follow earlier safety features, including age verification to restrict chat access for users under 13 and limit trusted connections for teens.
– Despite these efforts, lawsuits and reports indicate that child predators and inappropriate content still exist on the platform, raising ongoing safety concerns.
In response to growing concerns over child safety, Roblox is implementing a major expansion of its age-verification technology and introducing globally recognized content ratings across its platform. The gaming service announced these updates on Wednesday, aiming to provide parents with clearer insights and stronger protections for younger users.
By the end of the year, all Roblox users who engage with communication features such as voice and text chat will be required to complete an age-estimation process. This involves scanning a selfie, which the system analyzes to approximate the user’s age. The company emphasizes that this method is far more reliable than simply asking users to enter a birth date during account creation.
To further improve accuracy, Roblox combines facial estimation with ID verification and parental consent mechanisms. These layered approaches help ensure that age-based restrictions are applied correctly. The platform also plans to introduce additional safeguards to limit interactions between adults and minors.
In a parallel effort, Roblox is partnering with the International Age Rating Coalition (IARC) to adopt standardized content ratings. Games on the platform will no longer use Roblox’s internal labels and will instead display ratings from established regional authorities: users in the U.S. will see ESRB ratings, those in Germany will see USK classifications, and players in the U.K. and the rest of Europe will see PEGI ratings.
These ratings are designed to inform parents about potentially concerning content, such as violence, language, gambling references, or mature themes. The goal is to offer greater transparency and help families make more informed decisions about which games are suitable for children.
These updates build on safety enhancements introduced earlier this year. In July, Roblox rolled out an age-verification system that uses video selfies to restrict access to certain features. Users under 13, for instance, cannot use unfiltered chat functions, while those between 13 and 17 face limitations on adding unknown users to their trusted connections.
The move also aligns with a global trend toward stricter online safety regulations. Laws like the U.K.’s Online Safety Act and Mississippi’s age-assurance legislation are pushing platforms to adopt more rigorous age-verification methods. Several other states, including Arizona and Virginia, are considering similar measures.
Roblox has invested significantly in safety tools over the years. These include Roblox Sentinel, an open-source AI system designed to detect signs of child endangerment, along with parental controls, communication restrictions, and automated moderation that identifies servers where rules are being violated.
Despite these efforts, challenges remain. Lawsuits filed in states like Louisiana, California, and Texas allege that predators still find ways to target children on the platform. A recent study highlighted in The Guardian also found that inappropriate content and harmful interactions continue to occur.
In another concerning incident, a popular game called Grow a Garden drew attention when players began trading virtual items for real money, violating platform rules and raising alarms about financial exploitation of young users.
While the new rating system may not eliminate all risks, it represents a meaningful step toward greater accountability and user protection. Matt Kaufman, Roblox’s chief safety officer, stated, “We’re committed to creating a safe platform and empowering parents to make the best decisions for their children. Our partnership with IARC will provide families worldwide with more clarity and confidence regarding age-appropriate content.”
(Source: TechCrunch)