It’s true, Facebook isn’t single-handedly responsible for the outcome of the U.S. presidential election. But the company played an important part in the rise of Donald Trump. Many people now think Facebook has to find a way to flag fake news appropriately before the next election in 2020. But the issue is far more urgent than that.
Facebook has to weed out fake news before the Dutch, French, Kenyan, Chilean, and many, many other elections in 2017. These elections could see populist candidates triumph once again.
The company shouldn’t question its responsibility only when there’s an election in the U.S. Given Facebook’s penetration rate around the world, the company is now effectively in charge of news distribution in most countries.
I’m tired of seeing Facebook bury its head in the sand on this issue. Instead of issuing multiple half-hearted statements saying that “everything is fine,” Facebook should be open and transparent about the problem.
It doesn’t matter if Facebook thinks that everything is fine. Millions of people now think that Facebook could do a better job when it comes to news articles in the news feed. So the company should work on it. It seems like a reasonable thing to ask, as this is exactly what technology companies are supposed to do: iterate, improve their products over time and add new features.
But the worst part is that Facebook’s incentives aren’t directly aligned with everyone else’s. Fake news articles spread like a virus. They create a lot of engagement. They boost daily active users, which leads to more ad engagement, which leads to better quarterly earnings.
And yet, it doesn’t matter if Facebook’s quarter is awesome thanks to the presidential election if the company becomes irrelevant in five years. Facebook has to act now to protect its long-term reputation.
More important, I’d say this issue could greatly damage the economy in many countries, which could have indirect effects on the company’s bottom line. Facebook tends to boost so-called anti-establishment candidates. Many of them are just populists who want to replace existing policies with alternatives that exclude immigrants and minorities and demolish the welfare state. Dividing people into groups and pitting them against each other will never lead to systemic economic growth.
And because of Facebook’s algorithm, populist candidates with a vocal fanbase can use Facebook as a megaphone to convert new people. It doesn’t matter if something is true or false. Many people are now looking for articles that will validate their views and share them ad nauseam. That’s why it’s important to fix the megaphone.
“But free speech!” you might say. Facebook isn’t a balanced platform in the first place. You choose your friends on Facebook, and they tend to validate your opinions. After Trump’s election, many of my friends told me something along the lines of “I thought my liberal bubble was bigger than it actually is, and I blame Facebook.” How many people did you unfriend because they shared some nonsensical article about Hillary Clinton or Donald Trump?
Given that most people share articles without even reading them, Facebook should at least flag fake articles so that people know what they’re dealing with.
I’m not saying that Facebook should stop you from posting all sorts of articles. People should be able to share cute kitten videos as well as conspiracy theories. But if people don’t even click on the links before sharing them, Facebook should put a question mark and say “you should think twice before sharing this.” That’s the least the company can do.
This issue isn’t just affecting Facebook. It turns out that Facebook is the biggest player in this space, so the focus is on Facebook right now. But Twitter, Google search results, YouTube and all social networks should take a minute and think about what they’re going to do over the next few months to fix their platforms.
I can’t help but feel a sense of urgency around Facebook’s plague of fake news. If Facebook can’t fix its platform, France will end up with Marine Le Pen. In other words, no, Facebook, it’s not time to issue 18 different statements saying that sure, maybe one day Facebook will consider thinking about someday drafting a new feature that will reduce the reach of fake news. It’s time to act now before all world leaders are demagogues.
It all comes down to a simple question — is Facebook a technology company or a media company? When the company fired its human editors for Trending Topics, Facebook made a clear choice in favor of algorithms.
But by putting the algorithms in charge, Facebook unintentionally made an editorial decision. Facebook doesn’t get to decide if it is a media company or not. It gets to set its own editorial rules and enforce them using a combination of algorithms and humans.