Net Influencer

Commentary

Denmark Proposes Introducing Sweeping Youth Social-Media Ban

Denmark’s government has announced an agreement to ban access to social media for anyone under 15, positioning the country as among the most aggressive in Europe in regulating children’s online activity. The proposal – which will require formal legislation and parliamentary vote before becoming binding – comes amid growing concerns that children are increasingly exposed to harmful digital content and commercial pressures.

In a recent AP report, Digital Affairs Minister Caroline Stage Olsen said the decision reflects evidence that a vast majority of Danish children are already active on social platforms despite existing age limits. According to Stage, 94% of children under 13 in Denmark have profiles on at least one social media platform, and more than half of those under 10 do.

Stage said the government intends to reduce exposure to violence, self-harm content, and other risks she described as “too great” for children. She also noted that although large technology companies generate significant revenue, they “are simply not willing to invest in the safety of our children.”

Under the agreement, parents may be permitted to grant access starting at age 13 following an assessment process. Stage said lawmakers would take months to draft and pass legislation and emphasized the need to prevent loopholes.

Denmark’s Approach: Enforcement and Age Verification

Danish officials have not yet detailed how the ban will be enforced, particularly given how easily children access digital devices. Stage said Denmark plans to use its national electronic ID system – already held by nearly all citizens over 13 – and develop an age-verification app. She added that while Denmark cannot force platforms to adopt its app, it can require “proper age verification” and said companies could face fines of up to 6% of global income through EU enforcement channels if they do not comply.

The Danish ministry said the legislative effort is aimed at shielding young people from harmful content and online pressures, not excluding them from digital life altogether. It cited concerns that children’s sleep, concentration, and well-being are disrupted by online activity and noted that parents and educators cannot address these issues alone.

Existing EU rules already prohibit children under 13 from holding accounts on platforms such as TikTok, Instagram, YouTube, Twitch, Reddit, and Discord. Many platforms have established their own minimum age policies and deploy systems such as selfie-based age estimation and AI-driven verification. TikTok said it “recognizes the importance of Denmark’s initiative” and highlighted teen-focused safety features, including Family Pairing. Meta did not immediately respond to requests for comment in the AP report.

Stage said previous attempts to encourage voluntary platform action have been insufficient. “We’ve given the tech giants so many chances to stand up and to do something about what is happening on their platforms. They haven’t done it,” she said.

Other Countries Consider Similar Measures

Denmark’s proposal comes as several countries, particularly in Europe, explore age-based restrictions on youth social media use. In the Netherlands, the caretaker government recently issued national guidelines advising that children avoid social media before age 15, citing risks related to mental health, addiction, and exposure to harmful content.

France has gone further, with a parliamentary commission recommending a formal ban for those under 15. The commission, launched in March after families sued TikTok alleging exposure to content linked to suicide, conducted hearings with parents and platform representatives. Its report says a ban would “send a signal” that social media is “not harmless” before age 15. It also proposes a “digital curfew” for those aged 15 to 18, restricting access between 10 p.m. and 8 a.m.

The commission suggests an even broader step: banning social media for all minors under 18 if platforms fail to meet obligations under the EU’s Digital Services Act within three years. Additional recommendations include a national information campaign about online risks and establishing a “digital negligence offense” targeting irresponsible parental oversight. According to NDTV World, French officials say the success of any age-limit policy will depend on effective age verification, which remains technically challenging and is met with resistance from some platforms.

Beyond Europe, China continues to enforce strict limits on online gaming for minors and has proposed additional rules to curb smartphone use among children. While the policies differ from social-media-specific bans, they reflect similar concerns about youth well-being and the impact of digital services on minors.

Australia Implements First National Youth Social-Media Ban

Meanwhile, according to a Reuters report, Australia became the first country to enact a nationwide minimum age for social media use when its parliament approved a ban setting the threshold at 16. The law applies to major social platforms, including TikTok, Facebook, Instagram, Snapchat, YouTube, X, and Reddit.

The legislation imposes penalties of up to AU$49.5 million for systemic failures to prevent under-16s from maintaining accounts. The law takes effect on December 10, 2025, and platforms have begun preparing for compliance while continuing to question the policy’s effectiveness.

Meta told Australian lawmakers it will contact approximately 450,000 underage account holders across Facebook and Instagram, who will be asked either to delete their data or to have it stored until they turn 16. Snap said it disagrees with the law but “will abide by it”; both Snap and TikTok reported large numbers of underage users who will be subject to enforcement.

Platforms plan to use behavioral tracking tools to identify users who claim to be 16 or older but appear younger. Meta and TikTok said that users incorrectly flagged could verify their age through third-party age-estimation services.

According to research cited by Australia’s eSafety Commissioner, around 37% of surveyed children reported encountering harmful content on YouTube. Additional findings from the same body of research showed that exposure was highest among younger users, rising to 46% among children aged 10 to 12.

Dragomir is a Serbian freelance blog writer and translator. He is passionate about covering insightful stories and exploring topics such as influencer marketing, the creator economy, technology, business, and cyber fraud.
