Australia’s Online Safety Crackdown: Social Media Giants Face Legal Scrutiny Over Child Account Policies

In a significant move towards enhancing online safety for minors, Australia’s online safety watchdog, the eSafety Commissioner, is contemplating legal action against major social media platforms, including Meta (Facebook and Instagram), Snapchat, TikTok, and YouTube. The move comes in the wake of allegations that these companies are not fully complying with recently enacted laws designed to prevent children under the age of 16 from maintaining accounts on their platforms.
Background on the Legislation
The Australian government implemented strict regulations aimed at protecting children from potential online dangers. The law, which took effect on December 10, 2025, was designed to ensure that social media platforms are vigilant in verifying the ages of their users and preventing underage children from creating accounts. These measures are part of a broader initiative to create a safer online environment for children, who are increasingly exposed to harmful content and interactions on social media.
Allegations of Non-Compliance
On March 31, 2026, Communications Minister Anika Wells publicly criticized these platforms for what she described as minimal compliance efforts. According to Wells, the companies have intentionally done the bare minimum, undermining the effectiveness of the laws. This raises serious concerns about the platforms’ commitment to safeguarding young users.
Snapchat’s Response
Among the major platforms, Snapchat appears to have taken some action, having reportedly locked approximately 450,000 accounts in compliance with the new regulations. However, the efforts by other platforms have not been as transparent or robust.
Concerns Over Transparency
While Snapchat has made visible progress in complying with the legislation, Meta, TikTok, and YouTube have been less forthcoming about their efforts. The lack of detailed information regarding their mechanisms for age verification and account management has raised alarms among regulators, and this opacity makes it difficult for the Australian government to assess whether these platforms are genuinely adhering to the law.
Legal and Regulatory Implications
The eSafety Commissioner, the regulator responsible for enforcing the minimum-age rules, is now weighing its options, including the possibility of taking these companies to court. Should this occur, it would set a significant precedent for how online safety regulations are enforced against large tech companies.
What Happens Next?
Legal experts suggest that if the courts become involved, they will need to determine what constitutes “reasonable steps” for social media platforms to take under the new legislation. This could involve a range of measures, including:
- Enhanced Age Verification: Implementing more rigorous methods to verify the ages of users.
- Education and Awareness: Initiating educational campaigns to inform both parents and children about the risks associated with social media.
- Monitoring and Reporting: Developing systems for monitoring young users’ activities and reporting any suspicious behavior.
The Role of Parents and Guardians
As the legal landscape evolves, the role of parents and guardians in monitoring their children’s online presence remains crucial. While legislation is a step forward, the effectiveness of such laws ultimately hinges on active parental involvement and awareness. Parents are encouraged to stay informed about the platforms their children use and to engage in open conversations about online safety.
Global Context
Australia’s initiative is part of a growing global trend towards stricter regulations on social media usage among minors. Countries across the world are grappling with similar issues, and many are considering or have already implemented laws aimed at protecting children from online harm. This international movement reflects a collective recognition that social media companies have a responsibility to create safer environments for their young users.
Conclusion
The situation in Australia highlights the critical need for social media platforms to take their responsibilities seriously when it comes to protecting children. As the regulator deliberates its next steps, the outcome could have far-reaching implications not only for the companies involved but also for the future of online safety legislation worldwide. The ongoing discourse will likely influence how other countries approach similar challenges, as the conversation about child safety on the internet continues to gain momentum.



