Australia has just done something the rest of the internet can no longer ignore: it has decided that social media access should be delayed for kids under 16. Call it bold, paternalistic, overdue or experimental; whatever your adjective of choice, this is a policy with teeth and consequences, and that matters. The law requires age-restricted platforms to take “reasonable steps” to stop under-16s holding accounts, and it begins to bite in December 2025. That deadline forces platforms to move from rhetoric to engineering, and the shift is telling.
The case for the policy goes beyond the moral headline. For a decade we have outsourced adolescent digital socialisation to ad-driven attention machines that were never designed with developing brains in mind. Delaying access gives families, schools and governments a chance to rebuild the scaffolding that surrounds childhood: literacy about persuasion, clearer boundaries around sleep and device use, and room for platforms to stop treating teens as merely monetisable micro-audiences. It is one thing to set community standards; it is another to redesign incentives so that product choices stop optimising for addictive engagement. Australia’s law attempts the latter.

Of course the tech giants are not happy, and they are not hiding it. Expect fully staffed legal teams, policy briefs and frantic engineering sprints. Public remarks from major firms and coverage in the press show them arguing that the law is hard to enforce, risky for privacy, and likely to push young people toward darker, less regulated corners of the web. That pushback is predictable. For years platforms have profited from lax enforcement and opaque data practices. Now they must prove compliance under the glare of a regulator and the threat of hefty fines, reported to run into the tens of millions of Australian dollars for systemic failures. That mix of reputational, legal and commercial pressure makes scrambling inevitable.
What does “scrambling” look like in practice? First, you’ll see a sprint on age assurance: behavioural signals and heuristics that estimate age, optional verification flows, partnerships with third-party age verifiers, and experiments with cryptographic tokens that prove age without handing over personal data (a sketch of that token idea follows below). Second, engineering teams will triage risk, focusing verification on accounts that exhibit suspicious patterns rather than running mass purges, while legal and privacy teams calibrate what “reasonable steps” means in each jurisdiction. Third, expect public relations campaigns framing any friction as a threat to access, fairness or children’s privacy. It is theatre as much as engineering, but it is still engineering, and that is where the real change happens.
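To make the token idea concrete, here is a minimal Python sketch of how a platform might accept a one-bit age assertion from a third-party verifier. Everything here is hypothetical: the verifier, the shared secret and the claim format are illustrative, and an HMAC stands in for the public-key signature or zero-knowledge proof a production scheme would actually use. The point is the shape of the exchange, not any vendor’s API.

```python
# Minimal sketch of a privacy-preserving age-assurance token (assumptions:
# a hypothetical third-party verifier checks ID out of band, then issues a
# signed claim carrying only "over_16" -- no name, no date of birth).
import base64
import hashlib
import hmac
import json
import time

VERIFIER_SECRET = b"shared-secret-with-verifier"  # illustrative only

def issue_token(over_16: bool) -> str:
    """What the hypothetical verifier returns after its offline ID check."""
    claims = {"over_16": over_16, "iat": int(time.time())}
    payload = base64.urlsafe_b64encode(json.dumps(claims).encode())
    sig = hmac.new(VERIFIER_SECRET, payload, hashlib.sha256).hexdigest()
    return f"{payload.decode()}.{sig}"

def platform_accepts(token: str, max_age_s: int = 300) -> bool:
    """The platform learns one bit (over 16 or not) and nothing else."""
    payload_b64, sig = token.rsplit(".", 1)
    expected = hmac.new(
        VERIFIER_SECRET, payload_b64.encode(), hashlib.sha256
    ).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # forged or tampered token
    claims = json.loads(base64.urlsafe_b64decode(payload_b64))
    if time.time() - claims["iat"] > max_age_s:
        return False  # stale token: short validity limits replay and tracking
    return bool(claims["over_16"])

print(platform_accepts(issue_token(over_16=True)))   # True
print(platform_accepts(issue_token(over_16=False)))  # False
```

The design choice worth noticing is data minimisation: the token carries a single boolean claim, so even a breach of the platform’s logs would leak no identity documents. That is exactly the property regulators are asking age-assurance schemes to demonstrate.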
There are real hazards. Age assurance is technically imperfect, easy to game, and if implemented poorly, dangerous to privacy. That is why Australia’s privacy regulator has already set out guidance for age-assurance processes, insisting that any solution must comply with data-protection law and minimise collection of sensitive data. Regulators know the risk of pushing teens into VPNs, closed messaging apps or unmoderated corners. The policy therefore needs to be paired with outreach, education and investment in safer alternative spaces for young people to learn digital citizenship.
If you think Australia is alone, think again. Brussels and member states have been quietly advancing parallel work on protecting minors online. The EU has published guidelines under the Digital Services Act for the protection of young users, is piloting age verification tools, and MEPs have recently backed proposals that would harmonise a digital minimum age across the bloc at around 16 for some services. In short, a regulatory chorus is forming: national experiments, EU standards and cross-border enforcement conversations are aligning. That matters because platform policies are global; once a firm engineers for one major market’s requirements, product changes often ripple worldwide.
So should we applaud the Australian experiment? Yes, cautiously. It forces uncomfortable but necessary questions: who owns the attention economy, how do we protect children without isolating them, and how do we build technical systems that respect privacy? The platforms’ scramble is not simply performative obstruction. It is a market signal: companies are being forced to choose between profit-first products and features that respect developmental needs and legal obligations. If those engineering choices stick, we will have nudged the architecture of social media in the right direction.
The next six to twelve months will be crucial. Watch the regulatory guidance that defines “reasonable steps,” the age-assurance pilots that survive privacy scrutiny, and the legal challenges that will test the reach of national rules over global platforms. For bloggers, parents and policymakers, the task is the same: hold platforms accountable, insist on privacy-preserving verification, and make sure this policy is one part of a broader ecosystem that teaches young people how to use digital tools well, not one that simply keeps them out. The scramble is messy, but sometimes mess is the price of necessary reform.
Sources and recommended reads (pages I used while writing):
• eSafety — Social media age restrictions hub and FAQs. https://www.esafety.gov.au/about-us/industry-regulation/social-media-age-restrictions
• Reuters — Australia passes social media ban for children under 16. https://www.reuters.com/technology/australia-passes-social-media-ban-children-under-16-2024-11-28/
• OAIC — Privacy guidance for the Social Media Minimum Age. https://www.oaic.gov.au/privacy/privacy-legislation/related-legislation/social-media-minimum-age
• European Commission — Guidelines on the protection of minors under the DSA. https://digital-strategy.ec.europa.eu/en/library/commission-publishes-guidelines-protection-minors
• The Verge — Coverage of the EU’s prototype age verification app and DSA enforcement. https://www.theverge.com/news/699151/eu-age-verification-app-dsa-enforcement