The Facebook bubble just popped. Half the country is still in shock. Reality crashed down, and many were presented with a world that didn't match the one they'd inhabited in the months leading up to the U.S. election.
As it turns out, that was Facebook’s world.
The social media network has become an outsize player in crafting our understanding of the events that take place around us. We’ve known for some time that its echo chamber could be an issue in terms of exposing us to differing viewpoints. But only today are some realizing how powerful its influence has become.
On Facebook, people were told the world was either a disaster, or seeing monumental progress. On Facebook, a Trump victory was likely, or a Clinton win was all but assured. On Facebook, the thoughts in your head turned into news articles you liked, turned into things you could share. On Facebook, everyone and no one could hear you scream.
And the louder we screamed, the more our time on the site increased. As did Facebook’s revenue.
Facebook didn’t just reflect your views back to you. It magnified and distorted them through the lens of sensational and often falsified stories. And it got away with it by throwing up its hands and claiming, “Hey, we’re not a media organization.”
Rather, it pretends to be a neutral platform where people can share whatever they like, within reason. Its teams of moderators police the site for content like pornography, illegal sales of firearms or drugs, and other generally prohibited material. But beyond that, it turns a blind eye to the nature of the content within its walls.
It even went so far as to fire news editors who managed the Trends section, leaving the matter up to an impartial, but entirely fallible, algorithm. This wholesale elimination of human judgement from the site’s news machinery could not have come at a worse time for the election.
The algorithm later trended a number of stories that were “profoundly inaccurate,” according to a report that tracked the occurrences of fake news in this high-profile section of Facebook’s platform.
Facebook showed users a tabloid story claiming 9/11 was an inside job, a false report that Fox News anchor Megyn Kelly was fired, and a debunked story about someone being kicked off a college campus for praying. It even promoted a story about an iPhone that works like Aladdin’s lamp, from a site called “FakingNews.”
Facebook brushed aside these offenses as mistakes, claiming it would do better in the future.
But Facebook’s focus has been on making it easier for publishers to share on its network, not on vetting their content. It invested in technological advances like Instant Articles, which makes news reading less painful with quick-loading pages free of burdensome scripts and ads. And it works to figure out how to keep users on the site longer, so they can click on ever more personalized, targeted ads.
Of course, one way to increase engagement is to make people feel good when they arrive. And Facebook knows how to control your feelings, because it has studied this extensively.
The company in 2014 apologized for a research project in which it manipulated the posts on 689,000 users’ home pages to see if it could make them feel more positive or negative emotions.
Turns out, it can.
People at the time said they were worried that Facebook could use the data it collected to figure out how to feed us a stream of happy thoughts to keep us on its site.
Sound familiar? It should.
The WSJ documented this in the realm of politics with its news visualization “Blue Feed, Red Feed.” In other words, Facebook spoon-feeds us what we want to hear, while minimizing our exposure to the opposing viewpoint.
The results have been beneficial for Facebook, to say the least. The network had 1.79 billion monthly active users as of September 2016. In its most recent quarter, it pulled in another $7 billion in revenue; $2.379 billion of that was profit, up 16 percent from the $2.05 billion it earned the prior quarter, and up 160 percent year over year.
The problem of the Facebook bubble matters not just because we’ve been duped by the algorithms, but also because of the significant role Facebook plays in the dissemination of news.
Today, a majority (62 percent) of U.S. adults get news on social media, which includes Facebook and other sites, a Pew Research study from May 2016 reported.
And Facebook is the largest social networking site, reaching 67 percent of U.S. adults.
Two-thirds of Facebook users (66 percent) get their news on the site — a figure that amounts to 44 percent of the general population (66 percent of the 67 percent of U.S. adults Facebook reaches), according to data from Pew Research. That’s up from 30 percent in 2014.
To make matters worse, social media is a poor platform for getting people to understand opposing views. Another Pew study found that only 20 percent of users modified their stance on a social or political issue because of what they saw on social media. A smaller 17 percent said they changed their views on a political candidate because of this.
When Pew then examined those changes in more detail, it found that social media had often pointed people in a more negative direction. That is, people who changed their minds on Clinton were more than three times as likely to have gone negative on her, and people who changed their minds on Trump were nearly five times as likely to have gone negative on him.
In addition, 82 percent of social media users said they never changed their minds on a candidate, and 79 percent never changed their minds on a social or political issue, because of social media. So when it came to convincing anyone of anything, Facebook wasn’t the place to do it.
We’ve known this about Facebook for some time, but many never felt it quite as profoundly as today. A personalized feed that tells you what you want to hear is great… until it’s not.
Last night and into this morning, people began to realize their sources had bad information; their data was wrong and the polls were off. And, most importantly, they discovered their crazy uncle wasn’t an outlier — he represented half of a very disturbed, very angry nation.