The European Commission plans to introduce a “Digital Fairness Act” aimed at addressing online consumer-protection concerns, including misleading influencer marketing practices.
According to a new European Parliamentary Research Service (EPRS) briefing, the initiative is expected to cover “misleading influencer marketing; dark patterns that can unfairly influence consumer decisions; the addictive design of digital products; and unfair personalisation practices that take advantage of consumers’ vulnerabilities.”
The act is also positioned to clarify responsibilities across the influencer-marketing value chain, following what the briefing calls “a certain degree of legal uncertainty” identified in a 2024 fitness check of EU consumer law.
At the same time, the European Parliament has adopted resolutions calling for influencer advertising to be addressed directly within the act and urging the Commission to “prohibit platforms from monetising or otherwise providing financial or material incentives for kidfluencing.” These developments place influencer marketing within a wider set of digital-policy priorities focused on consumer protection, the role of platforms and safeguards for minors.
Sweep Reveals Extent of Undisclosed Commercial Content
The Commission and national authorities conducted a coordinated review of influencer activity in 2024 to assess compliance with existing obligations. Authorities examined posts from 576 influencers across major platforms, including TikTok, Instagram, YouTube, Facebook, X, Snapchat and Twitch.
The sweep found that 97% of influencers posted commercial content, but only about 20% systematically disclosed that posts were advertisements. Approximately 38% did not use platform-provided labels such as “paid partnership,” instead relying on terms like “collaboration” or “thanks to the brand.” The EPRS notes that only 36% of influencers in the sample were registered as traders nationally, and 30% did not provide company details in their posts.
The review also identified the promotion of products and services with potential consumer-protection implications. Authorities reported that one in five influencers screened promoted “unhealthy or hazardous activities,” including junk food, alcoholic beverages, medical or aesthetic treatments, gambling or financial services such as crypto trading.
A United Kingdom study referenced in the briefing provides additional context, estimating that 22% of social media users aged 16-60 have purchased counterfeit goods after an influencer recommended them.
Existing EU Framework Applies to Influencer Activity
Even before the Digital Fairness Act, several EU rules already apply to influencer marketing.
Under the Unfair Commercial Practices Directive, hidden advertising is prohibited, and a practice can be misleading when an influencer fails to disclose the commercial intent behind a post. Commission guidance on the directive clarifies that remuneration includes both monetary payment and benefits such as free products or services.
The Digital Services Act (DSA) requires platforms to offer tools enabling users, including influencers, to declare commercial communications. Platforms must also collect identity information from influencers acting as traders.
Very large online platforms and search engines must assess systemic risks, including illegal content and undisclosed advertising. According to the EPRS, the European Commission issued preliminary findings in 2025 that TikTok did not provide necessary information about advertising content and targeting, and that Meta lacked simple mechanisms for users to report illegal content or contest moderation decisions.
These requirements form the backdrop for the Digital Fairness Act, which aims to address ongoing gaps in enforcement and clarity.
Child Influencers Draw Legislative Attention
Research cited in the briefing highlights concerns specific to child influencers.
A 2025 study describes kidfluencing as “a new form of child labour, in which child’s play is monetised as work,” noting risks including privacy violations, economic exploitation, and emotional and psychological harm. Another study finds that the online profiles of kidfluencers aged 7-12 may reflect how parents wish to present their children rather than an authentic self-representation.
These findings have informed recent parliamentary positions. The Parliament’s November 2025 resolution calls on the Commission to prevent platforms from monetising or otherwise incentivising kidfluencing, aligning child-protection concerns with broader transparency and consumer-protection goals in the upcoming Digital Fairness Act.
Dragomir is a Serbian freelance blog writer and translator. He is passionate about covering insightful stories and exploring topics such as influencer marketing, the creator economy, technology, business, and cyber fraud.