When Free Speech Meets Law: X Pushes Back Against Australia’s Under-16 Social Media Ban
Elon Musk’s company X has formally urged Australian authorities to delay the rollout of a law banning under-16s from social media, citing questions of legality, enforceability, and unintended harm. The clash exposes foundational tensions between child protection, platform responsibility, and rights in the digital age.
In the coming months Australia plans to become the first country to bar people under 16 from maintaining social media accounts, turning a spotlight on how democratic societies regulate digital life. But now a powerful actor has intervened. X, led by Elon Musk, has issued a public submission demanding a delay to the law’s December 2025 start date, citing “serious concerns” over legality, enforceability, and unintended consequences. (The Guardian)
This isn’t just a tech company pushing back. It is a clash of values: how far should governments reach into online systems, and how much burden should lie on platforms to enforce childhood boundaries?
What X Is Arguing
In its formal submission, X makes a multi-pronged case:
The policy’s lawfulness is uncertain, especially its compatibility with international human rights instruments such as the UN Convention on the Rights of the Child and the International Covenant on Civil and Political Rights. (The Guardian)
The compliance burden is onerous. X argues that enforcement should begin only after regulatory guidelines are published, followed by a six-month compliance window. (The Guardian)
The threat of a punitive regime looms large. The legislation allows fines of up to A$49.5 million for noncompliance. (The Guardian)
The law risks perverse outcomes: minors barred from regulated platforms might migrate to less safe or unmoderated ones. (The Guardian)
Unclear definitions in the law—such as which services count as “social media”—could lead to regulatory overreach or weaponization. (The Guardian)
X contends that instead of platform bans, age assurance should operate at the device (smartphone) level. (The Guardian)
X reports that fewer than 1% of its Australian users are under 16. The company says it does not object in principle to safeguarding young people online; what it objects to is the absence of clarity, fairness, and adequate time. (The Guardian)
The Law Behind the Ban
This policy is not speculative — it is grounded in existing legislation. The Online Safety Amendment (Social Media Minimum Age) Act 2024 amends the Online Safety Act 2021 to require platforms to take “reasonable steps” to prevent under-16s from accessing their services. (Wikipedia)
The law passed Parliament in November 2024 and is set to commence in December 2025. (Wikipedia) Under the legislation, platforms may be fined for systemic failures in enforcement, though the eSafety Commissioner has said early enforcement will focus on identifying such failures rather than targeting individual firms. (The Guardian)
Australia’s government stresses that the law is essential to protect youth from algorithmic harm, cyberbullying, predation, and the mental health effects of unsupervised access. (Reuters)
Points of Tension and Risk
1. Rights vs Protection
One of the central tensions is between protecting minors and preserving their rights to information, expression, education, and connectivity. X argues that a blanket ban could contravene fundamental freedoms protected domestically and under treaties. (The Guardian)
2. Enforcement Feasibility
Enforcing identity checks or age verification at scale is technically complex. Minors may use VPNs, false IDs, or alternative platforms. If enforcement is weak, the law risks being symbolic rather than effective. (The Guardian)
3. Displacement Risk
If under-16s are forced off regulated platforms, they may migrate to spaces without moderation or oversight. That carries higher risk than regulated exposure. (The Guardian)
4. Overbroad Application
Ambiguous definitions of “social media” may sweep in services the law never intended to cover, exposing private messaging or content-distribution systems to heavy regulation. (The Guardian)
5. Incentives and Accountability
Facing enormous fines and compliance risk, platforms may err toward over-blocking or intrusive verification, degrading privacy and access for all users, not just minors.
What Now: Paths Forward
Delay is reasonable. A six-month buffer after the publication of clear regulatory guidelines is a prudent approach. It allows platforms, developers, and regulators to align on technical standards.
Clarity and scope narrowing. Define precisely which platforms must comply. Carve out exceptions (e.g. education, health, private messaging) to prevent overreach.
Pilot and feedback. Run age assurance tests in smaller cohorts first and iterate. Learn from missteps rather than imposing a blunt rollout.
Rights protections built in. Any regime must include safeguards for privacy, transparency, appeal, and redress.
Monitor unintended effects. Track where under-16s migrate, watch for content risks on unmoderated services, and measure whether the ban actually reduces harm.
Final Thoughts
When governments legislate what is permissible in digital spaces, they wield structural power over attention, voice, access, and identity. The under-16 social media ban is ambitious and arguably well-intentioned, but fraught with contradiction and complexity.
X’s call for delay is more than corporate self-interest. It reflects a deeper tension: between regulatory urgency and institutional readiness; between protection and rights; between top-down control by governments and the messy, open terrain of networks.
At TMFS we believe that public policy in digital life must be anchored in transparency, iterative design, and a recognition that power over platforms comes with humility and guardrails. If Australia proceeds with this law, the test will lie not in speeches or fines, but in how fairly it treats children, how well it protects rights, and how effectively it avoids pushing the vulnerable further underground.
This is not a binary fight between youth safety and freedom. It is a complex negotiation. Let delay provide space for that negotiation.