Too much bad press and too many confused users. That’s what Facebook got from silently tinkering with the news feed. BiltonGate was the last straw. Signs now point to Facebook starting to notify the public of news feed algorithm changes the way Google does for PageRank. Combined with education, Facebook could use transparency to try to convince people it tweaks the feed to help users, not sell more ads.
Facebook walks a tightrope. It has to balance the interests of marketers, content creators, app developers, its bottom line, and most of all, the user experience. Over time, its power, ubiquity, wealth and privacy issues have led people to jump to the worst conclusion at the first sight of change — that Facebook just wants to gobble up more private data and make more money.
That’s only half true. It definitely wants more data and money, but Facebook is a fundamentally mission-driven company. After reporting on Facebook for three years, interviewing dozens of employees, and knowing some from the San Francisco scene, I am heartily convinced it actually wants to make the world more open and connected.
Doing that means gobbling up private data and making money. The data goes to personalizing the service, and the money goes to hiring people to build it. But the real kicker is that the only way it can accomplish its mission and make a fortune is to generally prioritize the user experience. If it allows advertisers and viral app developers to overpower Facebook, a few years from now there won’t be a user base to advertise or sell apps to.
So when Facebook changes the news feed, it’s typically to the benefit of the end user. The problem is that these changes often come at the expense of Page admins and professional content creators like The New York Times’ Nick Bilton. Facebook alters its algorithm to show people news feed stories they’re more likely to Like, which can mean fewer stories from journalists and business pages searching for eyeballs.
Facebook’s done a poor job of communicating why it makes these changes, and it makes them silently. Both lead people to assume greed is responsible. Facebook had a huge issue with this over the summer. It started more intensely penalizing Pages for being marked as spam, and generally reduced the reach of Pages whose posts didn’t get many Likes per fan. Around the same time it introduced Promoted Posts, which let Pages pay to reach more people. People immediately accused Facebook of extorting Pages by reducing their reach unless they buy these Promoted Post ads. One blog post by Dangerous Minds got 162,000 Likes by calling Facebook “the biggest bait’n’switch in history.”
In reality, Facebook was going to beat back spammy Pages anyway, and the Promoted Posts feature is actually just an easier way to buy Sponsored Stories, which Facebook has had for a long time. Even if the algorithm change and ads were related, it’s not to Facebook’s benefit to make the feed worse on purpose. Otherwise people will read it less and, boom, any profits from Promoted Posts are offset by less usage.
The issue reared its head again last weekend when Bilton wrote a screed suggesting Facebook was showing his posts to fewer of his Twitter-style Facebook Subscribe followers than a year ago because it was showing more ads. Facebook admitted some public publishers are getting less engagement now, but claimed this was anecdotal, that users didn’t want to see Bilton’s posts, and that most public figures were getting more views now. But in the end Facebook came off sounding defensive, and, shall we say, unbecoming of one of the world’s most powerful companies.
If Facebook had been more proactive about communicating its spam fighting and the changes to public-post reach, it wouldn’t have gotten itself into these messes. I sense Facebook has now realized this, and suspect it will soon begin announcing at least when it makes major news feed algorithm changes. As I’ve written about terms of service changes, “you always fear what you don’t understand,” so Facebook’s goal will be to increase understanding.
It’s worked for Google. Every few months it publishes a post like “Search quality highlights: 65 changes for August and September,” outlining dozens of specific changes to PageRank so webmasters know to expect their Google search rank and traffic to fluctuate.
For example, one change listed in August was “nearby. [project 'User Context'] We improved the precision and coverage of our system to help you find more relevant local web results. Now we’re better able to identify web results that are local to the user, and rank them appropriately.” This lets non-local site admins know their traffic may drop because Google is giving users a better experience by directing them to relevant local results.
Facebook makes minor tweaks constantly, and big changes to the feed occasionally, like the new content-specific news feeds it’s expected to announce tomorrow. It could go with a live-updated changelog plus more in-depth posts about the big changes. Alternatively, it could release scheduled dumps of changes like Google does, but also call out the major modifications. It would need to combine this with general education about why it makes the changes in the first place. Ideally, the tone would be something like Tron’s “I fight for the user!”, though without framing marketers, publishers, and developers as self-serving spammers.
Dissuading conspiracy theorists will be no easy task for Facebook. Many just inherently think corporations are evil. A transparency push will only succeed if Facebook can convey that for it to win, and for businesses to win, its top priority must be making itself a place where people love spending time.