A year ago, The Wall Street Journal revealed that Facebook operated a two-tiered content moderation system. Normal users were subject to the platform’s stated rules, while VIP users were secretly flagged into a special program internally called “cross-check.”
That list included everyone from Brazilian soccer star Neymar and former President Donald Trump to conservative commentator Candace Owens and the company’s founder, Mark Zuckerberg. Per the WSJ, the system was designed to minimize instances in which the company might take action against a VIP’s content in the normal course of moderation and kick off a firestorm of bad press in the process.
“If Facebook’s systems conclude that one of those accounts might have broken its rules, they don’t remove the content—at least not right away, the documents indicate,” the WSJ reported. “They route the complaint into a separate system, staffed by better-trained, full-time employees, for additional layers of review.”
Cross-check came to light in mid-September of last year, and by the end of the month the company was asking the Oversight Board, Meta’s semi-independent policy-making council, to review the system and suggest ways to fix it. “Specifically, we will ask the board for guidance on the criteria we use to determine what is prioritized for a secondary review via cross-check, as well as how we manage the program,” Meta VP of Global Affairs Nick Clegg wrote.
The Oversight Board is now back with its recommendations, calling for “significant improvements” to the cross-check program.
“For years, cross-check allowed content from a select group of politicians, business partners, celebrities, and others to remain on Facebook and Instagram for several days when it would have otherwise been removed quickly,” the board wrote in a blog post, noting that some content flagged under cross-check remained up for seven months before the company made a decision about whether to remove it.
The Oversight Board offered 32 recommended changes to that process, including a few steps that would make a previously secret program much more transparent. The board called on the company to publish “clear criteria” describing which accounts are eligible for cross-check’s extra review process, to visibly mark accounts that are in the program, and to allow people who might meet the requirements to apply for the special account status.
The board also requested that Meta prioritize “users who are likely to produce expression that is important for human rights” like journalists and civil rights groups in the cross-check system rather than making those calls based on its business interests. “While the number of followers can indicate public interest in a user’s expression, a user’s celebrity or follower count should not be the sole criterion for receiving additional protection,” the board wrote. “If users included due to their commercial importance frequently post violating content, they should no longer benefit from special protection.”
The full set of recommendations, published on the Oversight Board’s blog, calls on Meta to dramatically realign its content moderation priorities for high-profile users. How much of this the company will actually implement remains to be seen, but this whole process certainly looks like a well-oiled machine compared to the policy-setting going on over at Twitter these days.
In a blog update Tuesday, Clegg said Meta will review the recommendations and respond to the suggested changes within 90 days.