X's "For You" Algorithm Shifts Users' Political Opinions Rightward, New Study Finds
A new study published in Nature reveals that X's default "For You" algorithm shifts users' political opinions in a more conservative direction. Led by Germain Gauthier of Bocconi University, the research is a real-world randomized experiment conducted on a major social media platform, providing crucial insight into algorithmic influence.
The Study: Design and Execution
The study involved 4,965 active US-based X users, randomly assigned to one of two groups for seven weeks during 2023. The first group used X's default "For You" feed, whose algorithm selects and ranks posts to maximize engagement, including content from accounts users do not follow. The second group used a chronological feed, which displays posts only from followed accounts, newest first. Randomly assigning users to the two feeds allowed the researchers to directly compare the effects of algorithmic and non-algorithmic curation.
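The logic of this design can be illustrated with a minimal sketch: randomly split users into a "For You" group and a chronological group, then estimate the treatment effect as a difference in group means. Everything below is synthetic and hypothetical; the outcome model and the numbers it produces are invented for illustration and are not drawn from the study's data.

```python
import random

random.seed(0)

# Sample size matching the study's reported 4,965 participants;
# all outcome values below are invented, not from the study.
n_users = 4965

# Randomly assign each user to the algorithmic ("For You") or
# chronological feed condition.
users = list(range(n_users))
random.shuffle(users)
treatment = set(users[: n_users // 2])  # "For You" feed group

def outcome(user_id: int) -> float:
    """Hypothetical post-experiment attitude score (0-100 scale).

    Treated users receive a small invented average shift, standing in
    for whatever effect the algorithmic feed might have.
    """
    base = random.gauss(50.0, 10.0)
    return base + (2.0 if user_id in treatment else 0.0)

scores = {u: outcome(u) for u in range(n_users)}

treated = [s for u, s in scores.items() if u in treatment]
control = [s for u, s in scores.items() if u not in treatment]

# Because assignment was random, a simple difference in means is an
# unbiased estimate of the average treatment effect.
ate = sum(treated) / len(treated) - sum(control) / len(control)
print(f"estimated effect: {ate:.2f} points")
```

The point of the randomization step is that it makes the two groups comparable on average, so any difference in outcomes can be attributed to the feed rather than to pre-existing differences between users.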
Algorithmic Influence: Key Findings
The research uncovered significant shifts in political opinion among users exposed to the algorithmic "For You" feed:
- Political Prioritization: Users who switched from the chronological feed to the "For You" feed showed a 4.7 percentage point increase in prioritizing policy issues favored by US Republicans, such as crime, inflation, and immigration.
- Views on Trump Investigation: These users were also more likely to view the criminal investigation into US President Donald Trump as unacceptable.
- International Relations: Additionally, they demonstrated a shift towards a more pro-Russia stance regarding the war in Ukraine, with a 7.4 percentage point decrease in positive views of Ukrainian President Volodymyr Zelenskyy and slightly higher scores on a pro-Russian attitude index.
Unpacking the Mechanisms
Researchers identified the specific mechanisms driving these observed effects. The algorithm increased the share of right-leaning content by 2.9 percentage points overall (and 2.5 points among political posts) compared to the chronological feed. Critically, it significantly demoted posts from traditional news organizations while promoting content from political activists. This selective amplification and demotion directly shaped the information environment for users.
A Lasting Impact on Engagement
A longer-term effect concerned users' following patterns: the algorithm encouraged users to follow more right-leaning accounts, and these new follows persisted even after users switched back to the chronological feed. This suggests a lasting impact beyond daily feed exposure, one that alters a user's long-term information consumption habits.
Echoes from Prior Research
These findings align with prior research, underscoring a consistent pattern of algorithmic bias on the platform:
- A 2022 study, conducted before X's rebranding, found that the platform's algorithmic systems amplified mainstream political right content more than left content in six out of seven countries.
- Another experimental study, from 2025, re-ranked X feeds to reduce exposure to antidemocratic attitudes and partisan animosity. This shifted participants' feelings towards political opponents by more than two points on a 0–100 "feeling thermometer", a change estimated to take three years to occur organically.
- Further research analyzed engagement data from political accounts during the 2024 US election, noting an increase in engagement with Elon Musk's account after his endorsement of Trump on July 13, leading to greater visibility for right-leaning accounts.
The Broader Implications and Call for Transparency
With over 400 million global users, X functions as a critical communication infrastructure, playing an increasingly central role in shaping public discourse.
The algorithms operating within such platforms act as an editorial force, shaping what information the public sees, attends to, and believes.
Governments around the world, including authorities such as Australia's eSafety Commissioner, face growing calls to mandate transparency about how these algorithmic systems function. The findings of this Nature study add significant weight to those demands, highlighting the profound and often unseen impact of algorithmic curation on societal attitudes and political landscapes.