
UK Regulators Demand Stricter Age Verification for Under-13s From Facebook, TikTok, YouTube

UK media regulator Ofcom (Office of Communications) and data watchdog the ICO (Information Commissioner’s Office) have formally contacted seven major platforms (Facebook, Instagram, Snapchat, TikTok, YouTube, Roblox, and X), demanding more robust age verification measures for users under 13, according to a BBC report.

Ofcom currently requires “highly effective” age checks only for services providing over-18 content, such as pornography. The regulators are now asking platforms to voluntarily apply comparable standards to keep children under 13 from signing up. Most platforms rely on self-reported age at registration, a method the ICO called “easily circumvented.”

“Services are currently failing to put children’s safety at the heart of their products,” Ofcom Chief Executive Dame Melanie Dawes said in a statement.

The ICO’s open letter, signed by Chief Executive Paul Arnold, noted that platforms with a minimum age of 13 “generally have no lawful basis for processing the personal data of children under that age on their service.” Ofcom’s own research indicates 86% of children aged 10-12 already have their own social media profile.

Platform Responses

The platforms offered varying responses, according to the BBC. 

Meta said it already uses AI to detect users’ ages based on activity and employs facial age estimation technology. 

TikTok said it uses “enhanced technologies” to detect and remove underage accounts, claiming to have removed over 90 million suspected under-13 accounts between October 2024 and September 2025. 

Snapchat said it is testing age verification tools. Roblox cited 140 new safety features released in the past year, including mandatory age checks for chat access. X did not respond to a request for comment.

Google pushed back more directly, saying YouTube was “surprised” by Ofcom’s approach and urging regulators “to focus instead on high-risk services that are failing to comply with the codes set out in the Online Safety Act.”

International Context

The UK action fits within a broader global pattern of governments tightening platform obligations, from minimum-age requirements to disclosure rules for AI-generated content.

Australia restricted approximately 4.7 million under-16 accounts following the introduction of its social media minimum age requirement in December 2025. The European Union’s AI Act transparency obligations, which take effect in August 2026, will also require certain AI-generated content to be clearly identifiable, with implications for branded and creator content. New York State’s Synthetic Performer Disclosure Law, effective June 9, 2026, adds further disclosure requirements for AI-generated likenesses in advertising.

Jonathan Oberholster

Jonathan is a South African content creator, photographer and videographer with 25 years of experience in journalism and print media design. He is interested in new developments in AI content creation and covers a broad spectrum of topics within the creator economy.
