YouTube Shorts’ recommendation algorithm consistently shifts viewers away from politically sensitive content toward entertainment videos, according to new research from the University of Arkansas. The study examined how the platform’s algorithm influences content visibility and its potential impact on information diversity.
The research team, led by Mert Can Cakmak from the COSMOS Research Center at the University of Arkansas, found that the recommendation system exhibits a clear pattern of "content drift," in which recommended videos quickly diverge from the original topic, particularly when that topic involves complex or politically sensitive subject matter.
“Our results reveal a consistent drift away from politically sensitive content toward entertainment-focused videos,” the researchers write in their paper, which analyzed 685,842 videos across three content domains.
Immediate Relevance Drop After First Recommendation
One of the study's most striking findings is how quickly the algorithm shifts away from political content. Videos related to the South China Sea dispute and the 2024 Taiwan presidential election showed high topical relevance at the starting point, but relevance dropped sharply at the first recommendation level, falling to near zero.
This immediate diversion occurs regardless of watch-time duration, suggesting that user engagement does not meaningfully impact the algorithm’s tendency to shift away from politically sensitive topics.
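The article does not spell out the researchers' relevance metric, but drift of this kind can be quantified by comparing a seed video's text against the text of videos recommended at each depth. Below is a minimal sketch under that assumption, using TF-IDF cosine similarity (one plausible approach, not necessarily the paper's):

```python
# Illustrative sketch: average cosine similarity between a seed video's
# text and the videos recommended at each depth. TF-IDF is an assumed
# representation here, not the paper's documented method.
from collections import defaultdict
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def relevance_by_depth(seed_text, recs):
    """recs: list of (depth, video_text) pairs from one crawl session."""
    texts = [seed_text] + [text for _, text in recs]
    tfidf = TfidfVectorizer(stop_words="english").fit_transform(texts)
    sims = cosine_similarity(tfidf[0:1], tfidf[1:]).ravel()

    by_depth = defaultdict(list)
    for (depth, _), sim in zip(recs, sims):
        by_depth[depth].append(sim)
    return {d: sum(v) / len(v) for d, v in sorted(by_depth.items())}

# Hypothetical example showing relevance collapsing after depth 1
seed = "South China Sea dispute naval tensions territorial claims"
recs = [
    (1, "South China Sea patrol footage"),
    (1, "funny cat compilation try not to laugh"),
    (2, "amazing football trick shots"),
]
print(relevance_by_depth(seed, recs))
```

A near-zero average at a given depth would correspond to the relevance collapse the researchers describe.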
Entertainment Content
The research reveals that entertainment content dominates YouTube Shorts recommendations across all tested scenarios. In both politically sensitive datasets (the South China Sea and the Taiwan election), the initial seed videos were predominantly political, but entertainment content quickly became the majority beyond the first recommendation.
Image source: “Investigating Algorithmic Bias in YouTube Shorts”
When analyzing the general YouTube content dataset, entertainment was already dominant at the starting point and maintained this position across all recommendation depths. Political content within general recommendations dropped from an already small baseline (around 10%) to nearly zero, suggesting a systematic avoidance of political topics.
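As an illustration of the aggregation behind percentages like these, category shares can be tallied at each recommendation depth. A short standard-library sketch; the records are hypothetical, not the study's data:

```python
# Illustrative sketch: share of each content category at each
# recommendation depth, from (depth, category) records.
from collections import Counter, defaultdict

records = [  # hypothetical sample data
    (0, "politics"), (0, "politics"), (0, "entertainment"),
    (1, "entertainment"), (1, "entertainment"), (1, "politics"),
    (2, "entertainment"), (2, "entertainment"), (2, "sports"),
]

by_depth = defaultdict(Counter)
for depth, category in records:
    by_depth[depth][category] += 1

for depth in sorted(by_depth):
    total = sum(by_depth[depth].values())
    shares = {c: f"{n / total:.0%}" for c, n in by_depth[depth].items()}
    print(f"depth {depth}: {shares}")
```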
Emotional Tone
The study also examined the emotional tone across recommendations, classifying videos into five categories: joy/happiness, sadness, anger, neutral, and fear. In both politically sensitive datasets, the initial videos tended to express neutral or negative emotions, but the recommendations quickly shifted toward joyful or neutral content.
For the South China Sea topic, initial videos were dominated by neutral and angry emotional tones, likely reflecting the geopolitical tensions involved. By the first recommendation level, joy/happiness increased markedly, with corresponding declines in neutral and angry content.
This emotional shift aligns with the topical drift toward entertainment, suggesting the algorithm may be optimized to promote content with positive emotional tones.
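The article does not describe the researchers' emotion classifier. As one assumed stand-in, an off-the-shelf zero-shot model can assign the study's five labels to a video's title or transcript; the model choice below is an assumption:

```python
# Illustrative sketch: label video text with one of the study's five
# emotion categories via zero-shot classification. This is an assumed
# approach, not the researchers' actual pipeline.
from transformers import pipeline

EMOTIONS = ["joy/happiness", "sadness", "anger", "neutral", "fear"]

classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")

def emotion_of(text: str) -> str:
    result = classifier(text, candidate_labels=EMOTIONS)
    return result["labels"][0]  # highest-scoring label

print(emotion_of("Naval standoff escalates over disputed waters"))
print(emotion_of("You won't stop laughing at this puppy's reaction"))
```

Tallying these labels per depth, as with categories above, would reproduce the kind of emotional-tone curves the study reports.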
Watch Duration
The researchers tested three watch-time conditions (3 seconds, 15 seconds, and 60 seconds) to investigate whether increased engagement might prevent content drift. Their findings show that longer watch durations did not prevent the shift away from political content.
In fact, full-duration viewing sometimes led to more sustained drift patterns, with periodic fluctuations in content types beyond the tenth recommendation. These patterns often coincided with the appearance of ads or sponsored content, suggesting that monetization may influence the sequencing of recommendations during longer viewing sessions.
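The experimental design amounts to walking recommendation chains under fixed watch-time conditions. A minimal sketch of that loop follows; `fetch_recommendations` and `watch` are hypothetical stubs standing in for the browser automation such a study would actually require:

```python
# Illustrative sketch of the experimental design: crawl recommendation
# chains under the three watch-time conditions described in the study.
import time

WATCH_CONDITIONS = [3, 15, 60]   # seconds
MAX_DEPTH = 10

def fetch_recommendations(video_id):
    """Hypothetical stub: return recommended Shorts for a video."""
    raise NotImplementedError("replace with real browser automation")

def watch(video_id, seconds):
    """Hypothetical stub: simulate watching a Short for `seconds`."""
    time.sleep(seconds)

def crawl(seed_id, watch_seconds, max_depth=MAX_DEPTH):
    chain, current = [], seed_id
    for depth in range(1, max_depth + 1):
        watch(current, watch_seconds)
        current = fetch_recommendations(current)[0]  # follow top pick
        chain.append((depth, current))
    return chain

# One chain per (seed, condition) pair would then be analyzed for drift.
```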
Popularity Bias
Analysis of engagement metrics (views, likes, and comments) shows that the YouTube Shorts algorithm heavily favors highly viewed and liked videos. Engagement scores rise sharply after the first recommendation depth across all datasets and watch-time conditions.
This popularity bias creates a self-reinforcing cycle in which already popular content receives disproportionate promotion, potentially marginalizing niche or serious topics that may not generate the same level of engagement.
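The article does not give the exact engagement formula, but a log-scaled composite of views, likes, and comments is one plausible stand-in, sketched below with hypothetical numbers:

```python
# Illustrative sketch: a log-scaled composite engagement score per
# recommendation depth. The weighting is an assumption; the paper's
# exact formula is not given in this article.
from collections import defaultdict
from math import log10
from statistics import mean

def engagement(views, likes, comments):
    # log10 damps the huge spread between viral and niche videos
    return mean(log10(x + 1) for x in (views, likes, comments))

records = [  # hypothetical (depth, views, likes, comments) records
    (0, 12_000, 300, 40),
    (1, 8_500_000, 410_000, 9_200),
    (2, 23_000_000, 1_100_000, 31_000),
]

by_depth = defaultdict(list)
for depth, v, l, c in records:
    by_depth[depth].append(engagement(v, l, c))

for depth in sorted(by_depth):
    print(f"depth {depth}: mean engagement {mean(by_depth[depth]):.2f}")
```

The sharp jump from depth 0 to depth 1 in such a score is what the researchers characterize as popularity bias.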
Implications for Creators and Viewers
For content creators focusing on news, politics, or educational topics, the study’s findings suggest significant challenges in maintaining audience reach and continuity through the platform’s recommendation system. Even when users engage fully with political content, the algorithm directs them toward entertainment videos.
“YouTube Shorts’ recommendation algorithm demonstrates a consistent shift away from politically sensitive or emotionally negative content, favoring high-engagement and emotionally positive videos,” the researchers conclude.
This algorithmic behavior could have important implications for information diversity and public discourse, particularly as YouTube Shorts now serves over 2 billion monthly users and has become a significant source of news for approximately 30% of Americans.
The study, titled “Investigating Algorithmic Bias in YouTube Shorts,” was conducted by researchers from the COSMOS Research Center at the University of Arkansas and the International Computer Science Institute at the University of California, Berkeley.
The full study is available here.
David Adler is an entrepreneur and freelance blog post writer who enjoys writing about business, entrepreneurship, travel and the influencer marketing space.