The Australian government has initiated action regarding child safety concerns on the online gaming platform Roblox, prompted by reports of child grooming and exposure to harmful content. Communications Minister Anika Wells has requested an urgent meeting with Roblox and formally asked the Australian Classification Board to review the platform's PG rating. Concurrently, the eSafety Commissioner has informed Roblox of plans to verify its stated commitments to online child safety.
Government Initiatives and Regulatory Scrutiny
Communications Minister Anika Wells formally requested an urgent meeting with Roblox, citing concerns over reports of child grooming and exposure to sexually explicit and self-harm material on the platform. The Minister's letter highlighted "graphic and gratuitous user-generated content" and allegations of predators attempting to groom children. Wells also indicated that her department is exploring additional regulatory options for services like Roblox. She consulted with the eSafety Commissioner regarding potential short-term measures and asked the Classification Board to assess the continued appropriateness of Roblox's PG rating. Prime Minister Anthony Albanese underscored the government's commitment to child safety online.
eSafety Commissioner's Actions
The eSafety Commissioner, Julie Inman Grant, has contacted Roblox, outlining intentions to verify the platform's adherence to child safety commitments. These commitments include the default disabling of chat features and implementation of private settings for underage accounts. The eSafety office stated it would test Roblox's compliance with measures such as default private settings for users under 16 and tools designed to prevent adult users from contacting children. The Commissioner noted ongoing concerns regarding child exploitation and exposure to harmful material. Following these compliance tests, the eSafety Commissioner indicated that further action under the Online Safety Act might be considered.
Specific Concerns and Reports
Concerns raised by the government stem from various reports. These include media accounts of children accessing adult-oriented spaces within the game, exposure to sexually explicit and suicide-related material, and reports of children being approached by predators. One report cited legal charges against an individual accused of grooming hundreds of children across various platforms, including Roblox. Past reports have also alleged the platform's use by right-wing extremists and for Islamic State propaganda, as well as content encouraging self-harm among young girls.
Roblox's Stated Safety Measures and Commitments
Roblox has previously collaborated with the eSafety Commissioner on child safety measures. In the past, Roblox outlined commitments to align with Australia's Online Safety Act. These commitments included default private accounts for users under 16, and tools designed to prevent adults from contacting individuals under 16 without parental consent. Additionally, Roblox committed to deactivating direct and in-game chat features by default for children in Australia until age estimation is completed. The platform also prohibited voice chat between adults and 13- to 15-year-olds, with a complete prohibition for users under 13. Parental controls enabling chat to be disabled for 13- to 15-year-old users were also part of its commitments, as was age verification for users 17 or older to access certain games.
Since November, Roblox has implemented age verification controls using facial recognition, which it says are intended to restrict chat functions to users of similar ages.
The company asserted its position as the first large online gaming platform to require facial recognition age checks for all users and stated it maintains "robust safety policies" and "advanced safeguards" to monitor for harmful content and communications.
Regulatory Context and Potential Enforcement
Roblox was previously exempted from Australia's under-16s social media ban, based on safety commitments made to the eSafety Commissioner. However, Minister Wells noted that the identified issues reportedly persist despite these measures. New regulations under the Online Safety Act, addressing age-restricted material such as pornography and self-harm content, will apply to Roblox from March 9. The Act also requires online gaming services to prohibit and take action against non-consensual sharing of intimate images, grooming, and sexual extortion. Platforms found non-compliant with the Online Safety Act can face fines of up to $49.5 million. Minister Wells also highlighted the importance of a "digital duty of care" to ensure digital platforms proactively keep users, particularly children, safe.
Platform Overview
Roblox is a popular online gaming platform that allows users to create their own mini-games. It is the most popular gaming application among Australian children aged four to 18. Reported figures for Roblox's global daily user base vary, from approximately 111 million to over 150 million. An estimated 40 percent of its users are under the age of 13, and Australia is the platform's second-largest market.