Is Facebook having a crisis of confidence over all the bad news its algorithms are generating?

Is Facebook having —

  • A) An existential crisis
  • B) An attack of conscience
  • C) A mid-life crisis
  • D) None of the above?

Answers in the comments, please. Additional suggestions also welcome.

I ask because Facebook is surveying users to ask whether they think it cares about them.

Yes, it is literally using the word “cares”.

The survey, which Facebook says is being pushed to “a small group of people”, includes questions probing for users’ strength of feeling about Facebook (positive or negative), before asking them to elaborate on why they feel that way.

One possible response here reads: “It does not protect my personal information.”

Another: “It does not give me control over what I see on Facebook and who sees what I post.”

A third: “It has too many advertisements.”

Other questions ask respondents to select from a list which items they think best show Facebook “cares” about them, and vice versa, with pre-filled responses here including privacy controls and data security; user support/offensive content reporting; News Feed; and messages/chat, to name a few.

One amusingly meta question asks users to qualify where exactly they are directing their negative and/or positive feelings: Facebook as a company, as an app/site, or Facebook as a collection of other humans (“Facebook users”).

It’s a teeny glimpse into the quasi-psychological burden of being the go-to user-generated content platform cum social communication network for one billion+ people all over the planet.

I’ve embedded a gallery of screenshots of some of the survey questions below, if you’re curious to see the full lists and what else Facebook is asking (and how it asks it). The screenshots cover at least the path you’re sent down if you respond negatively to its initial question about whether it cares.

[Gallery: screenshots of Facebook’s survey questions]

At the end of the survey Facebook says it will use the feedback to “help improve Facebook”.

And improve it sure needs to.

Multiple items in its list of possible responses immediately bring to mind a raft of recent Facebook-related controversies: high-level moves to further undermine user privacy; baffling content censorship decisions (yet tardiness in combating the spread of hate speech); an ad-infested News Feed that’s been guilty of algorithmically encouraging clickbait (and worse); and the large-scale shuttering of mobile web access to Facebook messages to try to drive more downloads of Facebook’s Messenger app (at the expense of letting users access messages how they please, something TC columnist Jon Evans memorably dubbed “malevolent design”), to name just a very few.

Point is, when you’re a one billion+ user content-sharing platform that’s principally engineered to algorithmically rank information to generate the most ad views from your users, while simultaneously leveraging your massive user base to consume other media distribution channels, intent on becoming the primary means for all media distribution globally, well, you’re not going to be able to step away from the political hot seat. No matter how many times you try to claim you’re just a tech tool, not a media company.

So really it’s hard to know which reputation-damaging controversy Facebook is fretting about most here, if indeed concern about negative PR is its motivation for running this survey. (I’ve asked Facebook for comment on its intentions with the poll and will update this post with any response.)

But here’s another possibility: Facebook is hoping to quantify whether Zuck’s personal brand of positive PR can beat back the tide of negative Facebook news.

Notably, one question in the survey makes specific reference to the past week — and asks whether a user’s opinion of Facebook has changed for the better, stayed the same or dialed down over this period.

Again, there’s no mention of which specific event or events it’s fishing for a response to. So what might Facebook be thinking about?

Let’s see…

Last Wednesday there was the Chan Zuckerberg Initiative’s announcement of a $3BN investment it said it intends to make over the next 10 years, with the headline-grabbing goal of ‘curing, preventing or managing all human disease’.

So perhaps Facebook is hoping its founder’s philanthropic side-line, via the limited liability company he and his wife announced as the vehicle for 99% of their Facebook stock at the end of last year, can be used to pop a halo atop Facebook’s overall reputation: warming users’ hearts with the thought that the platform giant is not just destroying their privacy and micro-managing the world’s attention with woefully inept algorithms designed to sell ads rather than understand cultural context; it’s also trying to cure cancer!

In other words, hello $3BN worth of positive PR.

But how far one positive PR story, however ambitiously framed and generously self-funded, can stretch to ‘reputation wash’ a tech giant whose command over information and attention spans has scaled so gigantic that it has, in all likelihood, the democracy-crushing ability to swing votes and sway elections, as well as the amply demonstrated capacity to spread misinformation and hate speech and even spark or inflame large-scale violence, remains to be seen.

A content platform giant that, in addition, continues to take a serious beasting for the terribly poor editorial decisions it makes to promote, demote or disappear entirely the information and human stories that flow across its servers. And for the knock-on effect these decisions have on the populations and societies consuming increasing quantities of news via Facebook’s massive media portal.

But hey, just so long as you don’t call Facebook a media company!

(Oh, and hey look! The Chan Zuckerberg Initiative is currently hiring a chief spokesperson to ramp up the quantity of good news stories spinning out of the LLC, and eke more value out of the cash mountain of tax-sheltered Facebook profits that funds it. “Experience in crisis communications” is one of the listed responsibilities for the role…)

Turns out a week is a very long time in Facebook negative headlines. Here are just a few examples from a quick search of Facebook-related news this past week…

Terence Crutcher’s shooting was absent from Facebook Trends

Facebook VR founder secretly backed pro-Trump group

Furious Judge Lashes Out at Facebook for Letting Rookie Lawyer Handle Terrorism Case

Facebook post on Islam leads to Jordanian atheist writer Nahed Hattar’s murder

Facebook apologizes for disabling Palestinian journalists’ accounts

WHO DECIDES WHEN A PROTEST BECOMES A FACEBOOK “DISASTER”?

Ole Miss condemns racially charged Facebook post after student protest

Steve Clevenger Facebook post likened Black Lives Matter movement to KKK

Facebook posts leads to huge fight with multiple arrests

Alleged Charlotte Protest Killer Flashed Guns on Facebook

Facebook is expanding its campaign to combat hate speech

I Let Facebook’s Algorithms Run My Life For Weeks

Facebook’s Bad Year Just Got a Lot Worse

And just in case you’re thinking this week might be an outlier on the Facebook bad news front, the week before wasn’t exactly free of PR headaches for the tech giant either…

e.g.

Facebook to roll out tech for combating fake stories in its Trending topics

Europe’s top court to weigh in on Facebook privacy ‘class action’

Facebook employees say deleting ‘napalm girl’ photo was a mistake

etc, etc.

Safe to say, there is an increasingly shrill tone to Zuckerberg’s protestations that the global content platform he commands, which pumps a daily digest of news, information and entertainment into the screens of its 1BN+ users, is not a media company.

“We are a tech company, not a media company,” he said recently, responding to a journalist’s question about whether Facebook intends to become a news editor.

“We build the tools, we do not produce any content,” he added.

Well, either he’s never heard the phrase “the medium is the message”, or he’s intentionally dodging the bullet and watching as it slams into the body politic.

Facebook is very obviously not just a tech tool, and to claim otherwise is disingenuous and downright dangerous — as others have eloquently argued. The mega platform that Facebook now commands is not just dabbling with data, it’s shaping people’s lives and lived experience.

“Move fast and break things” may not be the official company motto anymore, but Facebook needs to do a lot more than run the odd concerned-user survey if it wants to demonstrate it has outgrown its previously stated cavalier attitude to the wider repercussions of its actions.

If Facebook really wants to show it cares about the people subject to the power of its platform then it needs to stop trying to engineer around the one thing it can’t shirk: editorial responsibility. And recognize that it needs human editors, not just algorithms and moderators.

Here’s another phrase Facebook should really ink onto its whiteboards: the personal is political.

Otherwise Zuck can carry on funneling a portion of his vast personal wealth into feel-good philanthropy projects in the hope that a halo glow of enough positive headlines will outweigh the constant stream of negative news underlining quite how much power Facebook is irresponsibly wielding.

But does he really think curing cancer is easier than accepting editorial responsibility?

One last phrase for Facebook to chew over: Power without responsibility.