How Facebook can escape the echo chamber

Facebook may have built an influence so large that the company is cracking under the weight of its own News Feed.

Mark Zuckerberg opened an onstage interview at Techonomy16 by discussing the evolution of the News Feed and Facebook’s impact on the election. Post-election, journalists, politicians, and pundits have questioned Facebook’s role in shaping the campaign and its outcome, debating the merits of Facebook’s position of primacy as a source of information.

Zuckerberg defended the News Feed’s progress, arguing that the filter bubble isn’t an issue for Facebook. He suggested the real problem is that people naturally engage with content they like and find agreeable, and dismiss things they disagree with online just as they would in real life.

“You’d be surprised at how many things we dismiss,” he said. “The problem isn’t that the diverse information isn’t there…but that we haven’t gotten people to engage with it in higher proportions.”

What are the tools that could help us escape the echo chamber?

If Facebook won’t change its algorithms for fear of undermining its wildly successful revenue model, and won’t expand its Trending Topics product, then it needs to implement better features to help diversify the content we see.

  • First, Facebook should hire human journalists to curate stories during elections. They should pick the best stories from a variety of sources and perspectives and flag them on Facebook as high quality and worth reading. Also: fact-checking. Google did it. Now it’s Facebook’s turn.
  • Since the personalized News Feed favors what we engage with, and we tend to engage with content we agree with, Facebook should offer an option to turn that ranking off during elections so people can see an algorithm-free, real-time feed.
  • Imagine being able to activate a filter that would show you only what your friends who identify on Facebook as Republican, Democratic, Libertarian and so on were sharing (a rough sketch of such a filter appears after this list).
  • Facebook could create a feature that allows people to declare endorsement for a candidate, and users could then build a feed to see what that pool of friends was posting about as well as the conversation around their posts.
  • Facebook could curate and flag certain content as partisan, and those stories could appear with a link to an Instant Article of the opposing viewpoint or from an opposing news source (however, since not every issue is purely partisan — and not every news source either — this could get tricky).
  • Trending Topics should be expanded and should display more takes on political stories, not just what the highest number of people are talking about.
  • Facebook could use the “Suggested Videos” window that pops up when you watch a video to its completion to surface opposing viewpoints.
  • Facebook could show a post from a candidate on the opposing side whenever a politician posts from their account.
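
To make the algorithm-free view and the affiliation filter above concrete, here is a minimal sketch, in TypeScript, of how such a feature might behave. Everything in it is hypothetical: Facebook exposes no API like this, and the Affiliation, Friend, and Post types and the affiliationFeed function are invented purely for illustration.

    // Hypothetical sketch only: none of these types or functions exist in
    // any real Facebook API; they illustrate the proposed filter idea.
    type Affiliation = "Republican" | "Democratic" | "Libertarian" | "Other";

    interface Friend {
      name: string;
      declaredAffiliation?: Affiliation; // self-reported on their profile
    }

    interface Post {
      author: Friend;
      sharedAt: Date;
      text: string;
    }

    // A purely chronological feed limited to friends who have declared the
    // selected affiliation. Sorting by recency instead of engagement is the
    // point: it removes the feedback loop that favors agreeable content.
    function affiliationFeed(posts: Post[], view: Affiliation): Post[] {
      return posts
        .filter((p) => p.author.declaredAffiliation === view)
        .sort((a, b) => b.sharedAt.getTime() - a.sharedAt.getTime());
    }

    // Example: what are my self-identified Republican friends sharing?
    const sample: Post[] = [
      { author: { name: "A", declaredAffiliation: "Democratic" },
        sharedAt: new Date("2016-11-08T20:00:00Z"), text: "Polls are closing" },
      { author: { name: "B", declaredAffiliation: "Republican" },
        sharedAt: new Date("2016-11-08T21:00:00Z"), text: "Watching returns" },
    ];
    console.log(affiliationFeed(sample, "Republican"));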

Facebook is hiding behind its “we’re a tech company, not a media company” line to excuse the fact that it hasn’t figured out news. For such an influential platform, one that preaches social responsibility and prioritizes user experience, it’s irresponsible to give people such a powerful megaphone for personal expression, only to lock them inside an echo chamber.

Despite what Zuckerberg claims, Facebook profoundly affected the way the U.S. consumed the election, just as it has shaped our news experience whether it wants to or not.

I don’t recommend using Facebook as a sole news source. But 44 percent of U.S. adults use Facebook as a source for news, according to a Pew report from earlier this year. Another study found that Facebook usage on election night was up almost 30 percent compared to a typical evening.

It’s safe to say that a solid number of people were banking on Facebook for election updates and live video, and as a stage for their own social commentary.

Is this all a pipe dream?

If Facebook routinely showed users things they found distasteful or viewed as incorrect, its audience wouldn’t want to use it as much. Facebook’s revenue model (you know, that cool $7 billion in Q3 revenue) profits from a strategy of making its 1.79 billion users feel validated, and therefore more likely to engage, via its personalized algorithm.

It wants to keep us in a bubble of comfort where our views are repeated back to us in the News Feed. So yes, Facebook makes money by algorithmically favoring content that affirms our opinions. Why would it want to change? And are people even ready for a fair Feed? With its massive influence, Facebook may have the ability to change this by offering both sides.

What is Facebook currently doing?

Facebook has paid lip service to breaking out of the echo chamber. Its data was used in the Wall Street Journal’s Blue Feed, Red Feed experiment, which juxtaposed a liberal Facebook feed and a conservative Facebook feed sourced from users’ self-identified political views and what they shared.

This year, Facebook published an odd video in a lackluster plea for us to play nice this election season, offering up its search bar as a resource for discovering new viewpoints (search may be the least useful function on the platform). Its Election Hub was a hands-on guide aimed at helping people learn about candidates, policy and ballot propositions.

It also supposedly helped over 2 million people register to vote. But the way users interact with the ‘lean back’ News Feed experience is important too.

Facebook is missing a huge opportunity to use its technology to help us see content through a more bipartisan lens at this politically divided moment in U.S. history, when it could also help temper our proclivity to ignore the other side. As a company that has always prioritized the user experience, Facebook could be doing much more.