The concept of verifying someone's age before they access parts of the internet is gaining traction with governments all over the world.
Multiple US states require government ID before viewing adult content. France is issuing court orders for ISPs to block popular adult websites that do not implement technology to check users' ages. The UK will soon start enforcing "age checks" to access online pornography and punishing sites that fail to do so.
While those efforts focus on "adult" content, Australia is taking age verification to the extreme, having passed the Online Safety Amendment (Social Media Minimum Age) Act. It mandates that social media platforms take "reasonable steps" to prevent people under 16 from having a social media account, and it takes effect on the 11th of December 2025. Despite only months until that date, two important things are still unknown - which platforms have to do this, and how will they do it?
We know it'll likely impact Instagram and TikTok because those are the platforms catching all the tabloid media heat, but we don't know if other major platforms like YouTube and Snapchat, or even online games with large social aspects like Roblox and Fortnite, will need to check a user's age.
The legislation outlines what an "age-restricted social media platform" is (the criteria are really broad!) but also says that applicable platforms will be an "electronic service specified in the legislative rules". We do not have those rules. We do not know when they are coming or even how they will be decided. It gives the Minister for Communications the ability to ban anything on the internet they think kids shouldn't see, without input from anyone except two other government bodies - the eSafety Commissioner and the Office of the Australian Information Commissioner.
Even more ambiguous is how platforms will actually verify which of their Australian users are over the age of 16. The legislation says a platform must not collect "government‑issued identification material" or use an "accredited service" to verify a user's age - but that prohibition doesn't apply as long as the platform also offers an "alternative means" of checking someone's age that is "reasonable in the circumstances". So can Meta, or TikTok, or whoever else the Minister thinks is corrupting the youth of Australia, demand all the adults of Australia present their ID before logging on?
In a vain attempt to assist platforms, the government awarded a $3.8m tender in November 2024 to the Age Check Certification Scheme (ACCS), an independent UK not-for-profit company, to "examine age verification, age estimation, age inference, parental certification or controls, technology stack deployments and technology readiness assessments in the Australian context". The trial is now complete, and the preliminary findings were made public on June 20th, 2025.
If you were expecting a comprehensive overview of all the age assurance technologies on the market, along with the results of detailed testing and statistics like reliability and accuracy, you will be sorely disappointed. Less than five months out from the law taking effect, all we were given was a bunch of PowerPoint slides with vibes and moods about the testing that took place. Some concerning statements included:
- "we did not find a single ubiquitous solution that would suit all use cases, nor did we find solutions that were guaranteed to be effective in all deployments"
- "we found limited evidence that they could cope with the evolving capacity of children (particularly through adolescence), were able to enhance the rights of children to participate in the breadth of digital experiences or were effective and secure in the management of a child's digital footprint"
- "some providers were found to be building tools to enable regulators, law enforcement or Coroners to retrace the actions taken by individuals to verify their age which could lead to increased risk of privacy breaches due to unnecessary and disproportionate collection and retention of data"
Without any data to assess for ourselves, it's difficult to trust that what ACCS is saying about age verification systems is accurate. What we do know is that age verification systems are guesstimates, at best. They rely on fallible technologies like:
- Using a camera to capture a user's biometric data and estimate their age, which has a wide margin of error (wider still for people with darker skin), leaving many people assigned an incorrect age.
- Deploying AI to assess what a user posts and likes, plus other metadata (e.g. websites they visit), to see if it lines up with others in a similar age group - further expanding the digital surveillance of all of us, but especially of children.
- Linking a child's account to a parent's account and determining levels of access via parental controls, restricting a child or teenager's ability to explore and learn for themselves. This is particularly dangerous for marginalised children and those in unsafe family situations - and it's easily bypassed with a separate account.
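To see why face-scan age estimation is a guesstimate at best, here's a toy simulation. Everything in it is an illustrative assumption, not a real vendor's system or a figure from the ACCS trial: it pretends the estimator's guesses are the person's true age plus random noise with a ±3-year standard deviation, and shows the bind a platform is in when picking a cut-off age.

```python
import random

random.seed(42)

def estimated_age(true_age, error_sd=3.0):
    # Hypothetical face-scan estimator: the true age plus Gaussian noise.
    # The 3-year standard deviation is an assumption for illustration only.
    return true_age + random.gauss(0, error_sd)

def rejection_rate(true_age, threshold, trials=10_000):
    # Fraction of scans where a person of `true_age` is estimated
    # to be under `threshold` and therefore blocked.
    blocked = sum(estimated_age(true_age) < threshold for _ in range(trials))
    return blocked / trials

# A platform wanting fewer under-16s to slip through must raise the
# cut-off above 16 - which blocks more and more legitimate adults.
for threshold in (16, 18, 19):
    kids_passing = 1 - rejection_rate(15, threshold)  # 15yos slipping through
    adults_blocked = rejection_rate(18, threshold)    # 18yo adults wrongly blocked
    print(f"cut-off {threshold}: {kids_passing:.0%} of 15yos pass, "
          f"{adults_blocked:.0%} of 18yos blocked")
```

Under these assumed numbers, a cut-off of 16 lets roughly a third of 15-year-olds through, while a cut-off high enough to stop most of them wrongly blocks a majority of 18-year-old adults - exactly the trade-off behind the "false positives" worry later in this piece.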
The eSafety Commissioner's issues paper on age assurance highlights these problems - but the Age Check Certification Scheme's preliminary findings wave them away as minor issues and fail to provide any reasoning for making such statements.
In the time since ACCS's preliminary findings were released, Tim Levy, managing director of children's safety technology company Qoria, left the trial's 20-member independent stakeholder advisory board, questioning how thoroughly the various technologies were tested. Electronic Frontiers Australia's chair John Pane, who is also on the trial's advisory committee, raised concerns about why the trial data was not released publicly.
Considering the hollowness of the preliminary findings, it's not unreasonable to assume that these age assurance methods will leave a lot to be desired in the task of verifying someone's age. How paranoid will the platforms be about government enforcement of their age verification methods? If they fear government backlash over kids slipping through the digital cracks, will they be forced to ask more groups of users to provide government ID to access their platform?
How could this affect sex workers? Is this yet another thing that'll blow up our livelihoods? Maybe. People under 18 are already not supposed to be buying R18+/adult material, so there shouldn't be a big financial impact. You can absolutely expect that the systems used to attempt age verification will suck - either by wrongly rejecting legitimate adults (people well over 18 failing the age check), or through poor user experiences that frustrate genuine customers until they give up and put their credit cards or crypto wallets away. Smaller platforms may not be able to afford, or justify the risk of, operating an age checking service, so they'll block Australians entirely to avoid being punished by the eSafety Commissioner, further limiting the marketplaces available to sex workers.
It's also upsetting that this naive belief that kids shouldn't be exposed to "adult" topics will be yet another way to exclude queer kids from getting the information they need and finding their communities. Mainstream social media isn't a high quality source of LGBTIQ+ material at the best of times, but queerness is at least somewhat visible there. By pushing those kids into "safe spaces" controlled by adults, or into more niche, totally unmoderated corners of the internet (some of which may be good, but some may also be extremely bad), we are shrinking the surface area for a kid to learn about themselves. Don't just take my word for it - this is backed up by a report from the eSafety Commissioner. When we make the world smaller for kids, that's one less avenue for them to find comfort in life at an age where life is typically uncomfortable.
Got a tech question for Ada? She wants to hear from you!
Ada answers all your questions about tech, the online world, and staying safe in it. No question is too silly, no hypothetical is too far-fetched! Learn to leverage devices, systems, and platforms to your benefit.