Facebook realizes that “fake news” is a problem, but is still a long way from figuring out how to solve it. At CODE Media today, Facebook VP of partnerships Dan Rose said combatting fake news is “something that’s really important to us,” but acknowledged that the company is “just getting started” and “there’s a lot of work we can do.”
In a world where people’s perceptions of reality are being shaped largely by what appears in their social feeds, some people have argued that companies like Facebook have a responsibility to ensure they’re not being fed a steady diet of “fake news.”
Facebook is trying to reduce the impact of sensational or misleading content showing up in users’ news feeds. Part of that is a reaction to the election of Donald Trump following the viral spread of fake election news via Facebook, which is basically the digital equivalent of shutting the barn door after the horse has bolted.
But part of it is by necessity, as other nations are pressuring Facebook to clamp down on the spread of news that could be disruptive to their own elections.
In Germany, which faces a nationwide election this year, lawmakers have proposed a rule that would fine Facebook 500,000 euros for each piece of fake news the company fails to take down within 24 hours. No surprise, then, that Germany was one of the first places where Facebook rolled out new tools to combat misleading articles being shared.
Rose highlighted the steps the company has already taken to alleviate the problem, including its effort to work with third-party fact checkers to identify misleading headlines and untrue stories. Facebook is also providing better tools that allow users to flag content in their feeds, and it sends those flagged stories to third-party fact checkers.
Part of Facebook’s reluctance to censor what people share on their feeds might stem from the philosophical question of whether the company is, in fact, a media company. On that point, Rose demurred, saying that Facebook is a “new type of platform… where people discover a lot of media content.”