Australian Government Seeks Urgent Meeting with Roblox, Requests Content Rating Review
The Australian government has formally requested an urgent meeting with the online gaming platform Roblox and asked the national Classification Board to review its content rating. This follows media reports and ongoing regulatory concerns regarding child safety, including exposure to inappropriate user-generated content and potential grooming risks on the platform.
Government Actions and Concerns
Communications Minister Anika Wells has written to Roblox seeking an urgent meeting. In her correspondence, she cited reports of children being exposed to "graphic and gratuitous user-generated content," which includes sexually explicit and suicide-related material. The Minister also referenced concerns about predators using the platform to approach and groom children.
Separately, Minister Wells has written to the Australian Classification Board to request a review of the appropriateness of Roblox's current PG (Parental Guidance) rating.
Regulatory Scrutiny from eSafety Commissioner
The eSafety Commissioner, Julie Inman Grant, has informed Roblox that her office intends to conduct compliance testing. This testing aims to verify that the platform has implemented the safety commitments it made in 2024.
According to government statements, these commitments include:
- Setting accounts for users under 16 to private by default.
- Implementing tools to prevent adult users from contacting minors without parental consent.
- Default deactivation of direct and in-game chat features for children in Australia until age estimation is completed.
- Prohibiting voice chat between adults and users aged 13-15, and disabling voice chat entirely for users under 13.
- Introducing age verification for users 17 and older to access certain games.
The eSafety Commissioner stated that ongoing reports of child exploitation and exposure to harmful material have sustained regulatory concern. Depending on the outcomes of the compliance testing, further action under the Online Safety Act may be considered. Platforms found non-compliant with the Act can face fines of up to $49.5 million.
Platform Context and Previous Measures
Roblox is an online platform that allows users to create and play a wide variety of user-generated games. The company states it has over 150 million daily active users globally, with Australia being its second-largest market. An estimated 40% of its global user base is under the age of 13.
The platform was exempted from Australia's social media ban for users under 16, which was implemented in mid-December. This exemption was based on the safety commitments Roblox made to the eSafety Commissioner last year.
Roblox has stated it employs "robust safety policies" and describes itself as the first large online gaming platform to require facial recognition age checks for all users. These age verification controls, rolled out since November, are designed to restrict chat functions to users of similar ages. The company has acknowledged reports of teens attempting to circumvent these tools.
Broader Regulatory Landscape
New industry codes under the Online Safety Act, covering age-restricted material, grooming, and sexual extortion, are scheduled to apply to services like Roblox from March 9. Under the codes, online gaming services must prohibit and take action against such activities.
Prime Minister Anthony Albanese has described the reports concerning Roblox as "horrendous" and stated the government will take necessary action based on the eSafety Commissioner's advice. Minister Wells has also referenced the need for a "digital duty of care," which would place responsibility on digital platforms to proactively keep users, particularly children, safe.