Snapchat has suspended more than 415,000 accounts in Australia belonging to users under the age of 16, following the enforcement of a new law that bans minors from accessing major social media platforms.
The action follows the entry into force of Australia’s digital safety law on December 10, which requires large platforms to prevent users under 16 from holding accounts. Companies that fail to take “reasonable steps” to comply face penalties of up to A$49.5 million.
According to Australia’s eSafety regulator, technology companies have collectively blocked around 4.7 million underage accounts since the law was implemented. Snapchat said it continues to identify and disable underage accounts on a daily basis.
Snapchat acknowledged that age-verification systems are not fully accurate, noting that current technologies can misjudge users’ ages by several years. The company warned that this could result in some underage users bypassing restrictions, while older teenagers may be incorrectly blocked.
Under the legislation, platforms including Snapchat, Meta, TikTok and YouTube are required to restrict access for users under 16, marking a world-first attempt to impose an age-based ban at the platform level. The move reflects growing concern among Australian authorities over online harms, including child safety risks, grooming, and mental health impacts.
Snapchat has urged regulators to require age verification at app-store level, arguing that a centralized system involving Apple and Google could be more effective in preventing circumvention. However, critics and privacy advocates have raised concerns about data protection, accuracy, and the broader implications of large-scale age verification.
The Australian government says the law is aimed at strengthening online safety for children, while debate continues over its long-term effectiveness, privacy implications, and impact on how young people communicate online.