Australia’s social media ban for under-16s, the first law of its kind globally, has led to millions of account deactivations but faces enforcement challenges.
The World’s First Comprehensive Ban
Australia’s Online Safety Amendment Act, effective December 10, 2025, prohibits individuals under 16 from holding or creating accounts on major social media platforms. The law designates platforms including Facebook, Instagram, Snapchat, Threads, TikTok, Twitch, X (formerly Twitter), YouTube, Kick, and Reddit as "age-restricted platforms," with exemptions for platforms primarily used for gaming, health, and education.
Social media companies face fines of up to A$49.5 million (approximately US$33 million) for failing to take "reasonable steps" to comply. There are no penalties for parents or children who bypass the restrictions.
Platforms must offer multiple age verification methods, such as government-issued identification, facial age estimation, or age inference, and must make at least one alternative to formal ID submission available.
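As a rough illustration of that requirement, the sketch below models a platform's offered verification methods and checks the one rule the law states explicitly: at least one method must not require formal ID. The method names and the `offers_id_free_alternative` helper are hypothetical, not drawn from any platform's actual API.

```python
from enum import Enum

class Method(Enum):
    """Hypothetical labels for the verification options named in the Act."""
    GOVERNMENT_ID = "government_id"          # formal ID document upload
    FACIAL_ESTIMATION = "facial_estimation"  # camera-based age estimate
    AGE_INFERENCE = "age_inference"          # inference from account signals

def offers_id_free_alternative(methods: set[Method]) -> bool:
    """True if the offered methods include at least one option
    that does not require submitting a formal ID document."""
    return any(m is not Method.GOVERNMENT_ID for m in methods)

# A platform offering only ID upload would fail this check;
# adding facial estimation or inference satisfies it.
print(offers_id_free_alternative({Method.GOVERNMENT_ID}))                           # False
print(offers_id_free_alternative({Method.GOVERNMENT_ID, Method.FACIAL_ESTIMATION})) # True
```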
Initial Compliance: Millions of Accounts Deactivated
Account Deactivations by Platform
Within the first two days of the ban, the Australian government reported that over 4.7 million accounts across 10 platforms were deactivated, removed, or restricted. The eSafety Commissioner clarified this total includes historical, inactive, and duplicate accounts.
- Meta deactivated 544,052 accounts across its platforms between December 4 and December 11:
  - Instagram: 330,639 accounts
  - Facebook: 173,497 accounts
  - Threads: 39,916 accounts
- Snapchat reported locking or disabling over 415,000 accounts identified as belonging to users under 16 by the end of January.
Industry Reactions
Meta stated ongoing compliance is "a multi-layered process" and advocated for age verification at the app store level with parental approval for under-16s, arguing this would ensure consistent industry-wide protections.
YouTube expressed concerns that the law reduces child safety by removing parental controls, stating parents will lose the ability to supervise children's accounts and manage content settings. The platform confirmed its intention to comply.
Reddit initiated a legal challenge against the Australian government, contending the ban is ineffective and restricts young people's freedom of speech.
Several platforms not initially included in the ban—such as Lemon8 and Yope—have been requested by the eSafety Commissioner to self-assess their status. Yope's CEO stated the company determined it is not a social media platform, while Lemon8 committed to excluding users under 16.
Four Months Later: Compliance Concerns Emerge
eSafety Commissioner’s Compliance Report
The first detailed compliance report, released four months after implementation, indicated social media companies had taken "some steps" toward compliance but identified "compliance concerns" in four primary areas:
- Some platforms displayed messaging that encouraged users to attempt age assurance even after they had self-declared as under 16.
- Some platforms allowed under-16s to repeatedly try the same age-assurance method until successful.
- Pathways for reporting age-restricted accounts were generally inaccessible or ineffective for parents.
- Certain platforms appeared not to have adequately prevented under-16s from maintaining accounts.
Ongoing Investigations
The eSafety Commissioner is currently investigating Facebook, Instagram, Snapchat, TikTok, and YouTube for potential non-compliance. Decisions on enforcement actions, including potential fines, are expected by mid-2026.
A survey of 900 Australian parents indicated that approximately 31% of children still possessed at least one social media account after the ban, a decrease from 49% prior to the legislation. Of the under-16s who had accounts on Instagram, Snapchat, and TikTok before the ban, 70% maintained access. The most common reason cited was that platforms had not requested age verification.
Age Verification Technology Limitations
The Age Assurance Technology Trial report, released before the ban, concluded that age assurance could be achieved. However, the compliance report noted that "facial age estimation is known to have higher error rates for children near the age threshold of 16 years," potentially yielding false over-16 results for 14- and 15-year-olds.
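The threshold problem the report describes can be made concrete with a toy simulation. The sketch below assumes, purely for illustration, that a facial age estimator's error is Gaussian with a fixed standard deviation; real estimators have more complex, skewed error profiles, and the 2-year error figure is an assumption, not a number from the trial report.

```python
import random

def simulate_false_over16(true_age, error_sd, threshold=16.0,
                          trials=100_000, seed=42):
    """Estimate how often a noisy age estimator clears the threshold.

    Toy model: the estimate is the true age plus Gaussian noise with
    standard deviation `error_sd` (an assumed figure, not measured data).
    """
    rng = random.Random(seed)
    passes = sum(
        1 for _ in range(trials)
        if rng.gauss(true_age, error_sd) >= threshold
    )
    return passes / trials

# Under this model a 15-year-old clears a 16+ check roughly 30% of the
# time, while a 12-year-old almost never does: error near the threshold
# is what drives false over-16 results.
print(f"age 15: {simulate_false_over16(15, 2.0):.1%}")
print(f"age 12: {simulate_false_over16(12, 2.0):.1%}")
```

The point of the sketch is that the false-pass rate is dominated by how close the user's true age is to the cutoff, which is exactly the 14-to-15-year-old band the compliance report flags.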
Snapchat identified "significant gaps" in implementation, citing technical limitations in achieving accurate and reliable age verification. The eSafety Commissioner observed that Snapchat's facial age estimation lacked a "liveness test" to confirm the image's authenticity.
Government Response
Communications Minister Anika Wells characterized the deactivation figures as a "huge achievement" while acknowledging the ban's implementation would not be immediately perfect. She later said, however, that social media companies were providing the "absolute bare minimum" and engaging in obfuscation on regulation, and that the commissioner should impose penalties if systemic non-compliance is found.
Prime Minister Anthony Albanese described the initial compliance data as "encouraging," noting "meaningful effort" by social media companies.
Legislative Expansion
A week prior to the compliance report, the Australian government registered a new legislative rule expanding the definition of social media platforms to include those with "addictive or otherwise harmful design features":
- Infinite scroll: provides an endless feed of content
- Feedback features: display "likes" or "upvotes"
- Time-limited features: offer disappearing "stories"
This rule change occurred in the same week a US jury found Meta and Google liable for harms to children using their services.
Broader Online Safety Measures: Age Verification for Adult Content
On March 9, 2026, Australia implemented new Age-Restricted Material Codes requiring age verification for access to online pornography, violent material, and content related to self-harm, suicide, and disordered eating.
Requirements include:
- Pornography websites: Must ask users to confirm their age
- Search engines: Must blur explicit search results by default for logged-out users
- App stores: Must prevent users under 18 from purchasing R18+ apps
- AI companion chatbots: Must confirm users are 18 or older before generating explicit material
Platforms that do not comply face fines of up to A$49.5 million per breach.
Industry Response to Adult Content Measures
Aylo, the parent company of Pornhub, RedTube, YouPorn, and Tube8, blocked Australian users citing privacy concerns. The company stated that Australia's approach mirrors the UK's, which Aylo claims does not effectively protect minors and creates data privacy harms. Following implementation, a surge in Virtual Private Network (VPN) downloads was observed in Australia.
International Context
Several countries have implemented or are considering similar measures:
- Spain: Announced plans to ban social media for individuals under 16
- France: Approved a bill banning social media for children under 15, effective September 2026
- Indonesia: Announced a ban on social media for children under 16, beginning March 2026
- Brazil: Enacted a law requiring children under 16 to link social media accounts to a legal guardian
- Malaysia: Plans to ban children under 16 from social media in 2026
- Denmark: Introduced legislation to ban social media for users under 15
- United Kingdom: Considering banning young teenagers from social media
- United States: The Kids Online Safety Act (KOSA) passed the Senate in 2024 but has not advanced further
Unanswered Questions
The compliance report did not specify how many under-16s remain on platforms or how many new accounts children created since the ban. There is no estimate of under-16s using alternative platforms.
Key questions remain:
- The definition of "reasonable steps" social media companies must take for compliance, described by the eSafety Commissioner as "ultimately a question for the courts to determine"
- Whether eSafety will expand compliance checks beyond the five platforms currently under investigation
- The government's future plans regarding a proposed digital duty of care legislation