Deep dive into Meta’s algorithms shows that America’s political polarization has no easy fix
The powerful algorithms used by Facebook and Instagram to deliver content to users have increasingly been blamed for amplifying misinformation and political polarization. But a series of groundbreaking studies published Thursday suggests that addressing these challenges is not as simple as tweaking the platforms’ software.
The four research papers, published in Science and Nature, also reveal the extent of political echo chambers on Facebook, where conservatives and liberals rely on divergent sources of information, interact with opposing groups and consume distinctly different amounts of misinformation.
Algorithms are the automated systems that social media platforms use to suggest content for users by making assumptions based on the groups, friends, topics and headlines a user has clicked on in the past. While they excel at keeping users engaged, algorithms have been criticized for amplifying misinformation and ideological content that has worsened the country’s political divisions.
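To make the distinction concrete, here is a minimal, purely illustrative sketch of the difference between an engagement-ranked feed and the chronological feed the researchers tested. It is not Meta’s actual ranking code; the field names, weights and scores are invented for demonstration only.

```python
# Toy illustration of engagement-ranked vs. chronological feeds.
# NOT Meta's algorithm; all names and scores here are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Post:
    author: str
    created_at: datetime
    predicted_engagement: float  # hypothetical score from a user's past clicks

def ranked_feed(posts: list[Post]) -> list[Post]:
    # Engagement-ranked: show whatever the model predicts the user
    # will interact with, regardless of when it was posted.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def chronological_feed(posts: list[Post]) -> list[Post]:
    # Chronological: newest posts from friends first, no prediction involved.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

if __name__ == "__main__":
    now = datetime.now()
    posts = [
        Post("friend_a", now - timedelta(hours=5), 0.91),
        Post("friend_b", now - timedelta(hours=1), 0.12),
        Post("friend_c", now - timedelta(hours=3), 0.55),
    ]
    print([p.author for p in ranked_feed(posts)])         # ['friend_a', 'friend_c', 'friend_b']
    print([p.author for p in chronological_feed(posts)])  # ['friend_b', 'friend_c', 'friend_a']
```

Swapping the first ordering for the second is, in simplified form, the kind of intervention the researchers tested when they replaced the algorithmic feed with a chronological one.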
Proposals to regulate these systems are among the most discussed ideas for addressing social media’s role in spreading misinformation and encouraging polarization. But when the researchers changed the algorithms for some users during the 2020 election, they saw little difference.
“We find that algorithms are extremely influential in people’s on-platform experiences and there is significant ideological segregation in political news exposure,” said Talia Jomini Stroud, director of the Center for Media Engagement at the University of Texas at Austin and one of the leaders of the studies. “We also find that popular proposals to change social media algorithms did not sway political attitudes.”
While political differences are a function of any healthy democracy, polarization occurs when those differences begin to pull citizens apart from each other and the societal bonds they share. It can undermine faith in democratic institutions and the free press.
Significant division can undermine confidence in democracy or democratic institutions and lead to “affective polarization,” when citizens begin to view each other more as enemies than legitimate opposition. It’s a situation that can lead to violence, as it did when supporters of then-President Donald Trump attacked the U.S. Capitol on Jan. 6, 2021.
To conduct the analysis, researchers obtained unprecedented access to Facebook and Instagram data from the 2020 election through a collaboration with Meta, the platforms’ owners. The researchers say Meta exerted no control over their findings.
When they replaced the algorithm with a simple chronological listing of posts from friends—an option Facebook recently made available to users—it had no measurable impact on polarization. When they turned off Facebook’s reshare option, which allows users to quickly share viral posts, users saw significantly less news from untrustworthy sources and less political news overall, but there were no significant changes to their political attitudes.
Likewise, reducing the content that Facebook users get from accounts with the same ideological alignment had no significant effect on polarization, susceptibility to misinformation or extremist views.
Together, the findings suggest that Facebook users seek out content that aligns with their views and that the algorithms help by “making it easier for people to do what they’re inclined to do,” according to David Lazer, a Northeastern University professor who worked on all four papers.
Eliminating the algorithm altogether drastically reduced the time users spent on either Facebook or Instagram while increasing their time on TikTok, YouTube or other sites, showing just how important these systems are to Meta in the increasingly crowded social media landscape.
In response to the papers, Meta’s president for global affairs, Nick Clegg, said the findings showed “there is little evidence that key features of Meta’s platforms alone cause harmful ‘affective’ polarization or have any meaningful impact on key political attitudes, beliefs or behaviors.”
Katie Harbath, Facebook’s former director of public policy, said the findings showed the need for greater research on social media and challenged assumptions about the role social media plays in American democracy. Harbath was not involved in the research.
“People want a simple solution and what these studies show is that it’s not simple,” said Harbath, a fellow at the Bipartisan Policy Center and the CEO of the tech and politics firm Anchor Change. “To me, it reinforces that when it comes to polarization, or people’s political beliefs, there’s a lot more that goes into this than social media.”
One organization that’s been critical of Meta’s role in spreading misinformation about elections and voting called the research “limited” and noted that it was only a snapshot taken in the midst of an election, and didn’t take into account the effects of years of social media misinformation.
Free Press, a non-profit that advocates for civil rights in tech and media, called Meta’s use of the research “calculated spin.”
“Meta execs are seizing on limited research as evidence that they shouldn’t share blame for increasing political polarization and violence,” Nora Benavidez, the group’s senior counsel and director of digital justice and civil rights, said in a statement. “Studies that Meta endorses, which look piecemeal at narrow time periods, shouldn’t serve as excuses for allowing lies to spread.”
The four studies also revealed the extent of the ideological differences of Facebook users and the different ways that conservatives and liberals use the platform to get news and information about politics.
Conservative Facebook users are more likely to consume content that has been labeled misinformation by fact-checkers. They also have more sources to choose from. The analysis found that among the websites included in political Facebook posts, far more cater to conservatives than liberals.
Overall, 97% of the political news sources on Facebook identified by fact-checkers as having spread misinformation were more popular with conservatives than liberals.
The authors of the papers acknowledged some limitations to their work. While they found that changing Facebook’s algorithms had little impact on polarization, they note that the study only covered a few months during the 2020 election, and therefore cannot assess the long-term impact that algorithms have had since their use began years ago.
They also noted that most people get their news and information from a variety of sources—television, radio, the internet and word-of-mouth—and that those interactions could affect people’s opinions, too. Many in the United States blame the news media for worsening polarization.
To complete their analyses, the researchers pored over data from millions of users of Facebook and Instagram and surveyed specific users who agreed to participate. All identifying information about specific users was stripped out for privacy reasons.
Lazer, the Northeastern professor, said he was at first skeptical that Meta would give the researchers the access they needed, but was pleasantly surprised. He said the conditions imposed by the company were related to reasonable legal and privacy concerns. More studies from the collaboration will be released in coming months.
“There is no study like this,” he said of the research published Thursday. “There’s been a lot of rhetoric about this, but in many ways the research has been quite limited.”
More information:
Andrew M. Guess, How do social media feed algorithms affect attitudes and behavior in an election campaign?, Science (2023). DOI: 10.1126/science.abp9364. www.science.org/doi/10.1126/science.abp9364
Sandra González-Bailón, Asymmetric ideological segregation in exposure to political news on Facebook, Science (2023). DOI: 10.1126/science.ade7138. www.science.org/doi/10.1126/science.ade7138
Andrew M. Guess, Reshares on social media amplify political news but do not detectably affect beliefs or opinions, Science (2023). DOI: 10.1126/science.add8424. www.science.org/doi/10.1126/science.add8424
Brendan Nyhan, Like-minded sources on Facebook are prevalent but not polarizing, Nature (2023). DOI: 10.1038/s41586-023-06297-w. www.nature.com/articles/s41586-023-06297-w
© 2023 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.