Australia is set to fundamentally change the rules of engagement for social media platforms and their youngest users with the introduction of landmark legislation. The Online Safety Amendment (Social Media Minimum Age) Bill 2024 (the Bill), enacted in December 2024, amends the Online Safety Act 2021 to establish a new mandatory minimum-age framework. From 10 December 2025, age-restricted social media platforms must take 'reasonable steps' to prevent Australians under 16 from holding accounts. This marks a decisive shift from platform self-regulation to mandated accountability in the digital sphere.
Currently, social media access for minors in Australia is primarily governed by the platforms' own Terms of Service. For almost all major services, including Facebook, Instagram, TikTok, YouTube and Snapchat, the standard minimum age for account creation is 13 years old.
This 13-year-old benchmark is a legacy of the United States (US) Children's Online Privacy Protection Act 1998 (COPPA). COPPA is fundamentally a privacy law designed to prevent online services and websites from collecting personal information from children under 13 without consent. As platforms were compelled to comply with this US legislation to operate globally, they adopted 13 as the de facto global minimum age of access.
However, as noted in the Explanatory Memorandum for the Bill, COPPA predates the existence of modern social media and is neither an indication that these services are safe to use at this age nor evidence that 13 is an appropriate age for adolescents to safely engage with today's complex and algorithmically driven platforms. The reliance on these outdated industry standards has created a significant and dangerous gap between policy and reality in today's digital environment.
The legislative change introduces a new minimum age obligation, shifting responsibility for ensuring fundamental protections for users under 16 from parents and children to social media platforms. Once the legislation takes effect on 10 December 2025, platforms will be required to take reasonable steps to prevent Australians aged under 16 from maintaining accounts.
Importantly, the obligation applies not only to new accounts, but also to the millions of existing accounts held by Australian users under 16. Social media platforms will need to establish systems to detect and deactivate underage users. For example, if a user is 14 years old and has an active account on a social media platform such as Snapchat when the law commences, Snapchat will be required to deactivate that account until the user turns 16.
The legislation also imposes substantial penalties, with non-compliance carrying severe financial consequences. Corporations that breach the minimum age obligation may face fines of up to $49.5 million. Moreover, the maximum penalties for breaches of industry codes and standards have been increased to the same level, reflecting the seriousness of such contraventions and ensuring that enforcement aligns with community expectations.
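For context, the $49.5 million maximum appears to follow from the standard Commonwealth penalty-unit formula, assuming the widely reported maximum of 150,000 penalty units for this obligation and the $330 penalty-unit value in force from November 2024 (both figures should be verified against the Act and the current Crimes Act 1914 penalty-unit amount):

```latex
\underbrace{150{,}000}_{\text{penalty units}} \times \underbrace{\$330}_{\text{per unit}} = \$49{,}500{,}000
```

Because the penalty-unit value is indexed over time, the dollar maximum will rise automatically even though the number of penalty units in the legislation stays fixed.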
This reform represents a fundamental shift in accountability, moving the burden of online protection from parents and children to the tech giants that created and profited from these platforms.
The legislative intervention is fundamentally anchored in the government's duty, guided by international human rights conventions, to protect and to act in the best interests of children. In designing the Bill, the Government engaged key principles that necessitate proactive protection from foreseeable online harms.
The law specifically seeks to uphold the rights of the child recognised under these international conventions, including the right to protection from foreseeable online harms.
The urgency of this protective framework is tragically demonstrated by criminal cases such as DPP (ACT) v Doughty [2023] ACTSC 397. This case involved an offender who committed serious sexual offences between 2015 and 2021, having established initial contact with underage victims via social media platforms such as Snapchat and Facebook. This tragic example illustrates how the previous absence of a robust legislative minimum age allowed social media platforms to become a primary avenue for child predation.
The digital deadline of 10 December 2025 marks a profound shift in Australia's approach to technology regulation and represents a bold commitment by the Australian government to redefine digital childhood safety. By placing the onus on platforms (that is, the tech giants) to enforce a robust minimum age, this change sets a new global standard for digital protection in an era where the harms of social media are increasingly evident and call for stronger systemic safeguards.