Major social media platforms operating in Australia restricted access to about 4.7 million accounts identified as belonging to users under 16 in the first half of December, following the introduction of Australia’s social media minimum age requirement, according to initial figures released by the eSafety Commissioner.
The minimum age obligation took effect on December 10, shifting eSafety’s role from preparation to monitoring and enforcement. The regulator has focused its oversight on platforms assessed as age-restricted and identified as having high levels of under-16 usage in Australia.
Early Results Show Action Being Taken
Australia’s eSafety Commissioner Julie Inman Grant said the early data suggests major platforms are taking action to comply with the new rules. “I am very pleased with these preliminary results,” she said. “It is clear that eSafety’s regulatory guidance and engagement with platforms is already delivering significant outcomes.”
The commissioner cautioned that some under-16 accounts remain active and said it was too early to determine whether platforms have achieved full compliance. She said age assurance systems take time to implement fairly and accurately, and emphasized the importance of continuous improvement and of preventing circumvention, as outlined in eSafety’s industry guidance.
Public Education
She added that early feedback from several large age assurance providers indicated Australia’s rollout had been relatively smooth, supported by public education efforts ahead of the December deadline.
eSafety said some platforms, including Bluesky and Lemon8, have assessed themselves as meeting the criteria of the legislation and are working cooperatively with the regulator.
The regulator will continue collecting data on platform compliance and monitoring whether users migrate to other services. eSafety said it will not publish detailed compliance figures while investigations remain ongoing, citing the need to protect legal privilege and preserve enforcement options.
Broader International Pattern
Regulatory action in Australia reflects a broader international pattern, as governments increase scrutiny of how platforms manage underage access. In Europe, authorities continue to advance enforcement under the Digital Services Act, while individual countries push for additional safeguards. Denmark has proposed a nationwide ban on social media use for children under 15, framing age limits as part of a wider child safety and digital wellbeing agenda.
Beyond Europe, similar efforts are emerging across Asia. China already maintains strict youth usage controls through identity-linked access and time limits, while Malaysia has announced plans to restrict social media access for users under 16 beginning in 2026.
In the United States, age-related regulation remains fragmented, with several states pursuing verification or parental consent requirements through legislation that continues to face legal challenges.
Taken together, these measures signal a shift toward treating child safety as a core regulatory obligation for platforms, rather than a discretionary trust-and-safety function.
Jonathan is a South African content creator, photographer and videographer with 25 years of experience in journalism and print media design. He is interested in new developments in AI content creation and covers a broad spectrum of topics within the creator economy.