In a significant move to protect the mental wellbeing of young users, Australia has announced a pioneering ban on social media access for individuals under 16. The legislation, passed in November 2024, raises the minimum age at which teenagers can hold social media accounts from 13 to 16.
Key Features of the Ban
- The ban includes major platforms like Facebook, TikTok, and Snapchat, with additional companies such as WhatsApp, Reddit, and Roblox under consideration.
- Australia will be the first country to impose such restrictions, prompting other nations to monitor its progress closely.
- Prime Minister Anthony Albanese emphasised that social media firms should use artificial intelligence to estimate user ages, rather than subjecting all users to blanket age-verification checks.
Global Attention and Inspiration
During a recent event in New York, Albanese drew international attention to Australia’s approach, stating, “The challenge we face is constantly evolving,” and highlighting the role of different countries in tackling the risks posed by social media.
European Commission President Ursula von der Leyen said she was inspired by Australia’s initiative, indicating that Europe may look to implement similar policies. Albanese argued that the regulations are crucial for giving children three more years of development free from the influence of algorithms and online misinformation.
Regulatory Oversight and Challenges
As part of the enforcement strategy, Australia’s eSafety Commissioner, Julie Inman Grant, has asked more than a dozen tech companies to self-assess whether they fall within the scope of the ban. The oversight framework remains vague, but penalties for non-compliance could reach A$49.5 million (approximately £25.6 million).
Inman Grant has said that some cases for exemption from the ban appear straightforward, but that due diligence will be exercised and firms will be allowed to present their cases. This has raised concerns that the regulations may prove more symbolic than enforceable.
Focus on Child Safety
In line with this initiative, new regulations will also target harmful online content. These rules are designed to further prevent children from being exposed to inappropriate material, including online pornography and harmful AI interactions.
Roblox, for instance, has already taken steps to limit adult interactions with children on its platform, acknowledging the risks of grooming and inadequate oversight.
As Australia forges ahead with this world-first legislation, the international community watches closely for outcomes that could shape digital policies worldwide.