Australia’s online landscape is approaching a profound shift, with forthcoming regulations set to challenge the long-held internet adage, “On the internet, nobody knows you’re a dog.” The Albanese government recently celebrated the passage of legislation barring under-16s from social media, effective in December, but it is the new industry codes, drafted by the tech sector with eSafety Commissioner Julie Inman Grant under the Online Safety Act, that are expected to have a far greater influence over how Australians access the internet.
Online service providers are reportedly preparing to implement the measures, which encompass a range of age-verification techniques, including scrutinising account histories, facial age assurance technology and bank card checks. Separately from the under-16 social media ban, an industry code that came into force at the end of June will compel search engines, from December, to apply age assurance measures to every logged-in account; verification could rely on official documents such as driver’s licences. Where an account holder is identified as under 18, the search engine must switch on safe search features, filtering explicit content, such as pornography, out of search results.
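In practice, the gating the search engine code describes reduces to a simple rule: once a service holds an age-assurance signal for a logged-in account, any account assessed as under 18 gets safe search forced on. The following is a minimal Python sketch of that logic only; the names (`AgeSignal`, `resolve_search_settings`) and the conservative handling of accounts with no signal are assumptions made for illustration, not details taken from the code itself.

```python
from dataclasses import dataclass
from enum import Enum


class AgeSignal(Enum):
    """Hypothetical age-assurance outcomes for a logged-in account."""
    VERIFIED_ADULT = "verified_adult"    # e.g. ID document or bank card check
    LIKELY_UNDER_18 = "likely_under_18"  # e.g. facial age assurance, account history
    UNKNOWN = "unknown"                  # no signal available yet


@dataclass
class SearchSettings:
    # True: explicit content is filtered and the user cannot disable it
    safe_search_locked: bool


def resolve_search_settings(signal: AgeSignal) -> SearchSettings:
    # As described, an account identified as under 18 must have safe
    # search enabled. Treating "unknown" the same way is this sketch's
    # assumption, not a requirement quoted from the code.
    if signal is AgeSignal.VERIFIED_ADULT:
        return SearchSettings(safe_search_locked=False)
    return SearchSettings(safe_search_locked=True)


if __name__ == "__main__":
    for signal in AgeSignal:
        print(signal.value, "->", resolve_search_settings(signal))
```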
The eSafety Commissioner is currently considering six further draft codes that would extend analogous age assurance requirements across a broad sweep of services Australians use daily, including app stores, AI chatbots and messaging applications. Any service that hosts or facilitates access to content deemed unsuitable for children, such as pornography, self-harm material, simulated gambling or highly violent content, would be obliged to ensure children cannot access that material. Addressing the National Press Club last month, Inman Grant stressed that the codes were critical to protecting children at every layer of the online environment. She reportedly said it was important to have a layered safety approach that also placed responsibility and accountability at critical chokepoints in the tech stack, including app stores and at the device level, which she described as the physical gateways to the internet where children first sign up and declare their ages.
While the eSafety Commissioner flagged the intent of these codes as they were developed and submitted, recent media coverage has renewed public attention on these provisions. Some will welcome the changes: after reports that Elon Musk’s AI, Grok, now offers a pornographic chatbot while remaining rated suitable for ages 12+ on the Apple App Store, child safety organisations urged Apple to review the app’s rating and build robust child protection measures into its app store. Notably, both Apple and Google are reportedly already developing age checks at the device level, which applications could also draw on to verify the age of their users.
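Neither company has published a final interface for these device-level checks, so any integration code is necessarily speculative. The sketch below only illustrates the shape such a check could take from an app’s perspective; the `device_age_bracket()` call and the `AgeBracket` values are entirely hypothetical stand-ins for whatever signal Apple or Google ultimately ship.

```python
from enum import Enum
from typing import Optional


class AgeBracket(Enum):
    """Coarse brackets a device-level signal might expose (hypothetical)."""
    UNDER_16 = "under_16"
    UNDER_18 = "under_18"
    ADULT = "adult"


def device_age_bracket() -> Optional[AgeBracket]:
    """Placeholder for a future OS-provided age signal.

    Returns None when the platform offers no signal for this user;
    no real Apple or Google API is being called here.
    """
    return None


def may_show_adult_content() -> bool:
    bracket = device_age_bracket()
    if bracket is AgeBracket.ADULT:
        return True
    # No signal, or a minor bracket: default to blocking. A real app
    # would fall back to its own age-assurance flow at this point.
    return False
```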
Justin Warren, founder of the tech analysis firm PivotNine, said the codes would usher in sweeping changes to how communication between people in Australia is regulated. He characterised the move as a dramatic overreaction after years of policy failure to curb the power of a handful of large foreign technology companies, and noted what he called a “darkly hilarious” twist: the measures would hand those same foreign tech companies even greater power and control over Australians’ online lives.
Conversely, Digi, one of the industry bodies that worked with the eSafety Commissioner to develop the codes, rejected the suggestion that the measures would erode online anonymity. Dr Jenny Duxbury, Digi’s director of digital policy, said the codes apply only to platforms that host or provide access to particular categories of content, introducing targeted and proportionate safeguards around access to pornography and material rated unsuitable for under-18s, such as highly violent content or material that advocates or provides instructions for suicide, eating disorders or self-harm.