A large video monitor on the campus of Meta, Facebook's parent company, in Menlo Park, Calif.

New research about Facebook shows its impact on political polarization.

Is Facebook exacerbating America's political divide? Are viral posts, algorithmically ranked feeds, and partisan echo chambers driving us apart? Do conservatives and liberals exist in ideological bubbles online?

New research published Thursday attempts to shed light on these questions. Four peer-reviewed studies, appearing in the journals Science and Nature, are the first results of a long-awaited, repeatedly delayed collaboration between Facebook and Instagram parent Meta and 17 outside researchers. They investigated social media's role in the 2020 election by examining Facebook and Instagram before, during, and after Election Day. While the researchers were able to tap large swaths of Facebook's tightly held user data, they had little direct insight into the inner workings of its algorithms.

"It's not a grand scientific observation," said David Lazer of Northeastern University, a co-author of the study. "We'd need more data to see, and we'd need more crises."

The gap goes beyond the difference in what posts people see. Conservatives engaged more with political news, meaning they clicked, liked, commented on, and re-shared the political news they saw more often than liberals did. The bubbles were asymmetric: there were more political news links seen exclusively by conservatives than by liberals. Political news links posted by pages and in groups - not by friends - had even higher levels of audience segregation.

Conservatives are also the main consumers of websites that Facebook flagged as untrustworthy and of links that third-party fact checkers flagged as inaccurate. That said, both amount to a very small fraction of overall political news, which itself makes up just 3% of what people share on Facebook. (Facebook began showing users less news in 2018 and less political content in early 2021.)

After establishing that baseline, the researchers ran an experiment, recruiting roughly 23,000 users who agreed to take part. About 30% of those users were shown less content from like-minded sources, and the researchers then checked whether that reduction changed their political attitudes. Users did, however, see more content from sources with different political leanings, as well as fewer posts from sources that repeatedly post misinformation.

Two other experiments published Thursday also tested changes to the algorithm that have been proposed by critics of Meta and policy makers. The researchers tried replacing Facebook's algorithmic feed with one showing posts in reverse chronological order, without any algorithmic ranking, and reducing the number of reshared posts (the kind of content that goes viral). All of the changes to the algorithms had significant impacts on what users saw in their Facebook feeds.