Facebook content moderators demand safer working conditions

A group of more than 200 Facebook content moderators, as well as some full-time employees, are demanding the tech company “stop needlessly risking moderators’ lives,” they wrote in an open letter to Facebook and to Accenture and Covalen, the contractors that manage content moderators for the company. This comes after The Intercept reported that some Facebook content moderators — who deal with material like sexual abuse and graphic violence — were required to come back into the office during the pandemic. Shortly after they returned, a Facebook content moderator reportedly tested positive for COVID-19.

“After months of allowing content moderators to work from home, faced with intense pressure to keep Facebook free of hate and disinformation, you have forced us back to the office,” the group wrote. “Moderators who secure a doctors’ note about a personal COVID risk have been excused from attending in person. Moderators with vulnerable relatives, who might die were they to contract COVID from us, have not.”

Moderators are now demanding that Facebook allow anyone who is high-risk for a severe case of COVID-19, or who lives with someone who is, to work from home indefinitely. More broadly, they want Facebook to maximize the amount of moderation work that can be done from home.

“You have previously said content moderation cannot be performed remotely for security reasons,” they wrote. “If that is so, it is time to fundamentally change the way that the work is organized. There is a pervasive and needlessly secretive culture at Facebook. Some content, such as content that is criminal, may need to be moderated in Facebook offices. The rest should be done at home.”

They also want Facebook to offer hazard pay, provide healthcare and psychiatric care, and employ moderators directly rather than outsource them.

“We appreciate the valuable work content reviewers do and we prioritize their health and safety,” a Facebook spokesperson told TechCrunch in a statement. “While we believe in having an open internal dialogue, these discussions need to be honest. The majority of these 15,000 global content reviewers have been working from home and will continue to do so for the duration of the pandemic. All of them have access to health care and confidential wellbeing resources from their first day of employment, and Facebook has exceeded health guidance on keeping facilities safe for any in-office work.”

Update 11/19: Facebook has since said it’s “not able to route some of the most sensitive and graphic content to outsourced reviewers at home,” VP of Integrity Guy Rosen explained on a press call. “This is really sensitive content. This is not something you want people reviewing from home with their family around.”

In the letter, moderators argue that Facebook’s algorithms are nowhere near where they need to be in order to successfully moderate content. They argue they’re “the heart” of Facebook.

“Without our work, Facebook is unusable,” the moderators wrote. “Its empire collapses. Your algorithms cannot spot satire. They cannot sift journalism from disinformation. They cannot respond quickly enough to self-harm or child abuse. We can.”

The group represents content moderators throughout the U.S. and Europe and has support from legal advocacy firm Foxglove, which said in a tweet that this is the “biggest joint international effort of Facebook content moderators yet.”

This post has been updated to reflect that full-time Facebook employees are also demanding these changes in solidarity with content moderators.