The countdown to Australia’s under-16 social media ban

Digital

Jonathan Cookson

Director, Digital Marketing

As the December deadline for social media age restrictions draws near, we review what’s happening in the lead-up to the first law of its kind in Australia, and one of the strictest globally.

What is due to happen on 10 December

It will become illegal for social media platforms to allow children under the age of 16 to open or maintain an account in Australia. The law introduces a minimum age requirement and imposes a legal obligation on platforms to take reasonable steps to both detect and prevent underage users from accessing their services. Companies found to be non-compliant can face fines of up to $49.5 million.

The law is part of a broader legislative shift aimed at improving online safety for children. It is backed by a mix of legislative instruments, industry codes, and regulatory guidance, many of which are enforceable and carry hefty penalties. Although the minimum age obligation has attracted the most public attention, it is only one piece of a growing regulatory puzzle. 

Which big tech companies and platforms does this affect?

This legislation brings into focus what constitutes social media. It applies to “age-restricted social media platforms”, defined as services that: 

  • Allow users to interact with others 
  • Enable the sharing of user-generated content 
  • Are designed for or used predominantly for social purposes 
  • Are likely to be used by children in Australia 

While the legislation does not explicitly name services, recent reporting confirms that eSafety approached 16 companies to self-assess whether they need to comply. This included expected social media channels such as Instagram, Snapchat and TikTok. In a move that may surprise some, services like WhatsApp, Discord, Roblox and Pinterest were among the 16. Even Match, which operates dating apps like Hinge and Tinder, made the list.

Some of these companies have already attempted to distance themselves from the “social media” label. YouTube, for instance, has framed itself as a video platform, and Pinterest describes itself as a visual search engine. However, under the legislation, it is functionality and use – rather than how a service brands itself – that matters. Platforms may still argue they do not qualify, and inclusion could ultimately be tested in court. 

Is video gaming included?

While traditional multiplayer games may not obviously resemble social media, games or services that feature chat, friend lists or content sharing could fall within the law’s scope. 

Gaming-related services such as Steam, Lego Play, Twitch and Roblox have already been approached by eSafety, highlighting just how broadly the law could apply. Some companies are likely to argue that they do not qualify as social media, and in some cases their inclusion may eventually be tested in legal proceedings. Roblox has since been removed from the banned list.

An emerging platform to watch is the Nintendo Switch 2. It includes a new built-in communication feature called GameChat, enabling users to voice chat, share screens and even video call using a camera accessory. Although unlikely, if it is widely used by under-16s and found to meet the social interaction criteria, it too could fall within the law’s reach.

A phone displaying the Twitch icon, a live-streaming platform for content spanning video games and “in real life” streams.

What about YouTube? 

There has been quite the tussle between policymakers and YouTube since late 2024.

A high-profile exemption was granted in November 2024 after lobbying by YouTube’s CEO, on the grounds that YouTube and YouTube Kids provide educational and health content to children.

This decision triggered criticism from rival platforms and online safety experts, with TikTok describing it as a “sweetheart deal”. The exemption was eventually reversed in July 2025, meaning YouTube will now be expected to comply with the same minimum age obligations as other platforms by the December deadline. In a further twist, as of November 5, YouTube Kids will no longer be included in the banned list.

What must platforms do, and how will it be enforced?

Platforms must show they are taking reasonable steps to identify and remove underage accounts, prevent re-registration, and apply layered age assurance. This includes deactivating accounts with care, avoiding reliance on self-declared ages, and offering transparent review processes for users who believe they have been wrongly flagged. 

The law has sparked concern among parents, privacy advocates and platforms about how enforceable these measures really are. While the legislation expects platforms to proactively detect underage users trying to re-register, it stops short of requiring universal age verification. 

This is where the Age Assurance Technology Trial becomes relevant. Commissioned by the government and published in August 2025, the trial found that although age estimation technologies are improving, they remain prone to errors, particularly for users close to the age threshold. The findings support a layered approach that combines behavioural data, metadata and estimation tools as the most effective and least intrusive method.
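As an illustration only, a layered approach along these lines might combine independent age estimates and escalate to a stronger check when the result is uncertain. The signal names, thresholds and uncertainty band below are hypothetical, not drawn from any real platform or from the trial itself:

```python
# Hypothetical sketch of a layered age-assurance decision: cheap,
# low-intrusion signals first, with stronger verification reserved for
# uncertain cases. All names and thresholds here are illustrative.

def layered_age_check(self_declared_age, behavioural_estimate,
                      metadata_estimate, min_age=16, uncertainty_band=2):
    """Return 'allow', 'deny', or 'escalate'.

    behavioural_estimate / metadata_estimate are independent age
    estimates (e.g. from usage patterns and account metadata).
    Self-declared age alone is never treated as sufficient.
    """
    estimates = [e for e in (behavioural_estimate, metadata_estimate)
                 if e is not None]
    if not estimates:
        # No independent signal: cannot rely on self-declaration alone.
        return "escalate"

    combined = sum(estimates) / len(estimates)

    # Near the threshold, estimation error is highest, so hand off to a
    # stronger (more intrusive) verification step instead of deciding.
    if abs(combined - min_age) <= uncertainty_band:
        return "escalate"
    return "allow" if combined >= min_age else "deny"
```

The key design point mirrors the trial’s finding: clear-cut cases are resolved cheaply, while users near the age threshold, where estimation is least reliable, are routed to a review process rather than being silently blocked.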

The Adolescence effect and what mental health experts say 

The global success of Adolescence – Netflix’s award-winning psychological crime drama that premiered in March – has amplified public concern over social media’s influence on young minds. 

The government has leaned heavily on a mental health narrative to justify the legislation, citing high-profile cases of cyberbullying and youth suicide as evidence for urgent reform. But many in the mental health sector are urging caution. 

Patrick McGorry, founder of Headspace and Executive Director of Orygen, has publicly criticised the ban, arguing that social media is an easy scapegoat. He points instead to structural drivers of youth distress – housing insecurity, climate anxiety, and unstable employment – as more pressing and complex root causes. Leading mental health organisations including the Black Dog Institute and Beyond Blue have also voiced opposition. 

Others, like Milly Bannister of youth wellbeing charity ALLKND, acknowledge that digital platforms can be harmful – but argue that banning access won’t solve the problem. Instead, Bannister advocates for redesigning online spaces to be safer, healthier, and purpose-built for young users. 

Will we see age restrictions on AI tools? 

There are growing calls to regulate access to AI large language models like ChatGPT, particularly after the tragic case of Adam Raine, a 16-year-old in the US who reportedly died by suicide following harmful interactions with an AI chatbot. ChatGPT already has an age restriction of 13 years old for account creation.

eSafety’s updated codes already include AI and AI chatbots within scope, and more specific obligations could follow in 2026 as governments globally consider new regulatory frameworks for AI. 

Short-term impacts on sectors 

In the lead-up to 10 December, organisations will need to assess and respond to changes affecting both their digital operations and stakeholder expectations. For many, this means evaluating how their services interact with users under 16 and whether any online engagement, promotional content or platform features may fall under new regulatory scrutiny. 

The Australian education sector is likely to face the most immediate impact, with YouTube frequently used in classrooms, lesson plans and school-managed devices. Schools may need to revisit how video content is accessed in class, especially when student accounts or personal devices are involved. 

These changes require swift, strategic responses from affected sectors – not only to ensure compliance with the law, but to maintain the confidence of communities that rely on digital tools for education, communication and support. We’ll be delving into the education-sector impacts in more detail soon, with a second Perspectives piece in the coming weeks.
