Facebook and Instagram users see markedly different political news depending on their own political leanings, according to new research from Meta, the parent company of both platforms. The study, conducted with more than a dozen outside academics and drawing on data from around 208 million US-based active users, found that users classified as “conservative” consumed a far higher proportion of misleading or false political news than their liberal counterparts.
These findings bear on a long-standing claim that algorithmically ranked news feeds contribute to the polarization of political beliefs. To test this claim, researchers replaced the algorithmically generated feeds on Facebook and Instagram with reverse-chronological feeds for a subset of participants. The chronological feeds did surface more moderate content, but they also increased exposure to both political and untrustworthy content relative to the algorithmic feeds. Moreover, the change in feed type produced no statistically significant difference in users’ political participation.
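To make the manipulation concrete, the sketch below contrasts the two feed types in schematic form. This is a toy illustration only, not Meta’s actual ranking system: the `Post` structure, its field names, and the idea of a single `predicted_engagement` score are all hypothetical simplifications.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: float             # post creation time (hypothetical field)
    predicted_engagement: float  # model score in [0, 1] (hypothetical field)

def chronological_feed(posts: list[Post]) -> list[Post]:
    # Reverse-chronological ordering: newest posts first, ignoring any
    # prediction of how engaging a post might be.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def ranked_feed(posts: list[Post]) -> list[Post]:
    # Engagement-ranked ordering: a relevance score decides position,
    # so older but "stickier" posts can outrank newer ones.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)
```

In the experiment, participants in the treatment group effectively had the second ordering swapped for the first, with the rest of the product left unchanged.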
Another study conducted as part of the research partnership examined the effect of removing reshared content from user feeds. Doing so reduced the presence of political and untrustworthy news sources but had no measurable effect on polarization; it did, however, decrease participants’ overall news knowledge.
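The reshare intervention can be pictured the same way: a filter applied to candidate posts before the feed is assembled. Again, this is a schematic sketch, and the `is_reshare` flag is a hypothetical stand-in for whatever signal Meta’s systems actually use.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    is_reshare: bool  # hypothetical flag: True if the post reshares other content

def feed_without_reshares(posts: list[Post]) -> list[Post]:
    # The second study's intervention, pictured as a filter:
    # reshared posts are dropped before the feed is assembled.
    return [p for p in posts if not p.is_reshare]
```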
The findings of these studies highlight the complex nature of algorithmic content ranking and its influence on user behavior. Although removing reshared content may reduce the visibility of misleading news, it also reduces exposure to content from trustworthy sources. It is clear that algorithms play a significant role in shaping users’ experiences on social media platforms, but the exact mechanisms behind this influence are still not fully understood.
Meta, and Facebook before it, has long faced criticism over its handling of algorithmic content ranking and its potential impact on polarization. The research partnership with external academics is an attempt by Meta to address those concerns and to better understand the relationship between social media and political beliefs. It is worth noting, however, that the data used in the studies was provided by Meta itself, which raises questions about the independence and objectivity of the research.
In response to the findings, Meta’s president of global affairs, Nick Clegg, emphasized that there is little evidence to suggest that social media causes harmful polarization or significantly impacts political attitudes and behaviors. He also highlighted the need for transparency and accountability in algorithmic content ranking.
While social media may not be the primary driver of polarization, previous research has shown that it can amplify existing divisions. One earlier study, for example, found that taking a break from Facebook appeared to lessen polarization of views on policy issues, though the effect was not statistically significant. Such results suggest that social media’s impact on polarization is more complex and multifaceted than short-term studies can capture.
The research conducted by Meta and its academic partners is ongoing, with a total of 16 papers expected to be published. The partnership’s lead academics stressed the need for further investigation into why algorithmic content ranking has such a profound influence on user experiences and beliefs. How these findings will inform the future design of social media platforms and their algorithms remains to be seen.