Facebook’s self-styled ‘oversight’ board selects first cases, most dealing with hate speech

Image Credits: Jakub Porzycki/NurPhoto / Getty Images

A Facebook-funded body that the tech giant set up to distance itself from tricky and potentially reputation-damaging content moderation decisions has announced the first bundle of cases it will consider.

In a press release on its website, the Facebook Oversight Board (FOB) says it sifted through more than 20,000 submissions before settling on six cases — one of which was referred to it directly by Facebook.

The six cases it’s chosen to start with are:

Facebook submission: 2020-006-FB-FBR

A case from France, where a user posted a video and accompanying text to a COVID-19 Facebook group. The post relates to claims that the French agency regulating health products was "purportedly refusing authorisation for use of hydroxychloroquine and azithromycin against COVID-19, but authorising promotional mail for remdesivir", with the user criticizing the lack of a health strategy in France and stating that "[Didier] Raoult's cure" is being used elsewhere to save lives. Facebook says it removed the content for violating its policy on violence and incitement. The video in question garnered at least 50,000 views and 1,000 shares.

The FOB says Facebook indicated in its referral that this case “presents an example of the challenges faced when addressing the risk of offline harm that can be caused by misinformation about the COVID-19 pandemic”.

User submissions:

Out of the five user submissions that the FOB selected, the majority (three cases) are related to hate speech takedowns.

One case apiece relates to Facebook's nudity and adult content policy, and to its policy on dangerous individuals and organizations.

See below for the Board's descriptions of the five user-submitted cases:

Public comments on the cases can be submitted via the FOB’s website — but only for seven days (closing at 8:00 Eastern Standard Time on Tuesday, December 8, 2020).

The FOB says it “expects” to decide on each case — and “for Facebook to have acted on this decision” — within 90 days. So the first ‘results’ from the FOB, which only began reviewing cases in October, are almost certainly not going to land before 2021.

Panels made up of five FOB members — including at least one from the region "implicated in the content" — will be responsible for deciding whether the specific pieces of content in question should stay down or be put back up.

Facebook's outsourcing of a fantastically tiny subset of content moderation considerations to its so-called 'Oversight Board' has attracted plenty of criticism (including inspiring a mirrored unofficial entity that dubs itself the Real Oversight Board) — and no little cynicism.


Not least because it's entirely funded by Facebook; structured as Facebook intended; and with members chosen via a system devised by Facebook.

If it’s radical change you’re looking for, the FOB is not it.

Nor does the entity have any power to change Facebook policy — it can only issue recommendations (which Facebook can choose to entirely ignore).

Its remit does not extend to being able to investigate how Facebook’s attention-seeking business model influences the types of content being amplified or depressed by its algorithms, either.

And the narrow focus on content takedowns — rather than content that's already allowed on the social network — skews its purview, as we've pointed out before.

So you won't find the board asking tough questions about why hate groups continue to flourish and recruit on Facebook, for example, or robustly interrogating how much succour its algorithmic amplification has gifted the anti-vaxx movement. By design, the FOB is focused on symptoms, not the nation-sized platform ill of Facebook itself. Outsourcing a fantastically tiny subset of content moderation decisions can't signify anything else.

With this Facebook-commissioned pantomime of accountability, the tech giant will be hoping to generate a helpful pipeline of distracting publicity, focused on specific and 'nuanced' content decisions, that deflects plainer but harder-hitting questions about the exploitative and abusive nature of Facebook's business itself, and the lawfulness of its mass surveillance of Internet users, as lawmakers around the world grapple with how to rein in tech giants.

The company wants the FOB to reframe discussion about the culture wars (and worse) that Facebook’s business model fuels as a societal problem — pushing a self-serving ‘fix’ for algorithmically fuelled societal division in the form of a few hand-picked professionals opining on individual pieces of content, leaving it free to continue defining the shape of the attention economy on a global scale. 

