Facebook’s ability to create filter bubbles, promote divisive content, and accelerate political polarization is no surprise to users who’ve kept up with the platform’s many scandals. But two new studies point to pitfalls in commonly proposed solutions and reveal a troubling double bind for the 190 million Americans who rely on Facebook for news.
Encouraging users to sample other news sources, one study finds, can instead cause them to double down on their beliefs. The other found that logging off entirely reduces polarization but leaves people politically disengaged and uninterested. Sixteen years in, we’re still only beginning to understand how Facebook is shaping us.
Internally, the company has begun to quietly acknowledge the trade-off for users between staying informed and being algorithmically driven toward divisive content. But little has been done. Politicians remain focused on accusations of bias, and Facebook is busy proving it treats conservative and liberal users impartially. Executives routinely emphasize the matching Republican and Democratic criticism of the platform. If neither side is happy, they say, neither is being favored.
But conservative and liberal users have very different experiences when using Facebook. That’s not because of politically motivated decisions around what’s allowed on the platform. Rather, it reflects the way Facebook organizes information to reward “engaging” content. The focus on ferreting out bias obscures this and makes even practical solutions seem implausible.
While there’s little evidence that Facebook is biased against conservative users, University of Virginia professors Brent Kitchens and Steven Johnson found that, by maximizing for engagement and attention, Facebook’s algorithms actively push conservatives toward more radical content than they push liberal users. Kitchens and Johnson analyzed the news habits of more than 200,000 users who agreed to share their browsing data. They found that Facebook pushed conservatives, unlike its moderate or liberal users, to read dramatically more radical content over time.
“All the platforms end up providing a sort of diversifying effect,” explains Kitchens, associate director of Virginia’s Center for Business Analytics. “If you’re reading more news from Facebook, you’re going to get a wider variety of news. But there’s also a polarizing effect. The diversity of information gets a little wider, but it also shifts more extreme.”
The study compared respondents’ use of Facebook, Reddit, and Twitter with their news habits. Kitchens and Johnson placed 177 news sites on a numbered political spectrum, with Daily Kos and Salon furthest left, Breitbart and InfoWars furthest right, and USA Today near the center. In the months when conservative users were most active on Facebook, they read news sites far more conservative than their average, clicking links from InfoWars and Breitbart over staples like Fox News. By contrast, news consumption by liberal users shifted far less dramatically on the authors’ scale.
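The authors’ measure can be illustrated with a toy calculation. The sketch below is purely hypothetical — the site scores and function name are invented, not the study’s actual 177-site scale — but it shows the idea: average the left-right scores of the outlets a user visits in a month, and a drift toward fringe sources shows up as a larger score.

```python
# Invented left-right "slant" scores, in the spirit of Kitchens and
# Johnson's 177-site scale: negative = left, positive = right, 0 = center.
SITE_SLANT = {
    "dailykos.com": -0.9,
    "salon.com": -0.8,
    "usatoday.com": 0.0,
    "foxnews.com": 0.6,
    "breitbart.com": 0.9,
    "infowars.com": 1.0,
}

def monthly_slant(visits):
    """Average slant of the scored news sites a user visited in one month."""
    scored = [SITE_SLANT[s] for s in visits if s in SITE_SLANT]
    return sum(scored) / len(scored) if scored else 0.0

# A user drifting from mainstream outlets toward fringe ones:
before = monthly_slant(["foxnews.com", "usatoday.com"])   # 0.3
after = monthly_slant(["breitbart.com", "infowars.com"])  # 0.95
```

On this kind of scale, the study’s finding is that active Facebook months correlate with a rightward jump in conservatives’ averages, while liberals’ averages barely move.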
Facebook’s polarizing effect stands in stark contrast to Reddit’s. When conservative users were most active on Reddit, they actually shifted toward news sites the authors judged more moderate than what they typically read. Kitchens and Johnson hypothesize that the most salient differences between Facebook and Reddit lie not in the content itself but in how each platform structures and feeds news and information to users.
“The impacts we’re seeing are by design. Facebook knows what’s going on with its platform,” says Johnson. “If it wanted to change it, it could.”
The authors identified a few major differences between Facebook and other sites. First, Facebook requires reciprocal friendship, which encourages a feed of like-minded people and reduces the chance of seeing opinion-challenging content. Facebook’s algorithms create feedback loops that perpetually show users what it thinks they want to see.
Second, Reddit has more anonymity than Facebook. Because users don’t necessarily have reciprocal bonds, people with different views can gather and share links in the same thread. Reddit’s algorithms prioritize interests, not friendship, and in the course of interactions on nonpartisan topics, the authors say there’s a much higher likelihood users will come across links to sites outside their typical news diet.
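The structural difference the authors describe can be caricatured in code. This is a deliberately simplified, hypothetical sketch — neither platform’s real ranking system is public, and all data and scoring rules here are invented — contrasting a feed that only surfaces posts from a user’s reciprocal friends with one that surfaces posts matching a user’s interests, regardless of who wrote them.

```python
# Hypothetical contrast between a friend-graph feed (Facebook-like)
# and an interest-based feed (Reddit-like). All data is invented.

def friend_feed(posts, friends):
    """Show only posts from mutual friends, highest predicted engagement first."""
    visible = [p for p in posts if p["author"] in friends]
    return sorted(visible, key=lambda p: p["engagement"], reverse=True)

def interest_feed(posts, interests):
    """Show posts on topics the user follows, regardless of author."""
    visible = [p for p in posts if p["topic"] in interests]
    return sorted(visible, key=lambda p: p["upvotes"], reverse=True)

posts = [
    {"author": "ally", "topic": "politics", "engagement": 9, "upvotes": 3},
    {"author": "stranger", "topic": "gardening", "engagement": 2, "upvotes": 8},
]

# The friend feed surfaces only the like-minded friend's post; the
# interest feed can surface a stranger's post on a shared hobby.
friends_see = friend_feed(posts, friends={"ally"})
reddit_like = interest_feed(posts, interests={"gardening"})
```

The feedback loop the authors describe falls out of the first design: if engagement scores are learned from past clicks on friends’ posts, the feed keeps narrowing toward what those like-minded friends share.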