Meta’s Oversight Board extends its scope to Threads

The Oversight Board, Meta’s external advisory group, announced today that it is expanding its scope to cover Threads, alongside Facebook and Instagram, in scrutinizing Meta’s content moderation decisions.

This means that if Threads users are dissatisfied with Meta’s decisions on issues like content or account takedowns, they can appeal to the Oversight Board.

“The Board’s expansion to Threads builds on our foundation of helping Meta solve the most difficult questions around content moderation. Having independent accountability early on for a new app such as Threads is vitally important,” Oversight Board co-chair Helle Thorning-Schmidt said in a statement.

Mark Zuckerberg first formally floated the idea of an independent oversight board in 2018. Facebook proposed the board’s bylaws in January 2020 and announced its first set of members that May. In October 2020, the board said it would start reviewing cases, and in 2021 the oversight body expanded its scope to review Meta’s decisions to keep certain content up.

The Oversight Board has ruled on some important cases over the years. Most notably, it criticized Facebook for “indefinitely” banning former President Donald Trump: while the board agreed that Trump had broken the platform’s rules, it said the guidelines contain no provision for an indefinite ban.

Earlier this year, the board called on Meta to reform its “incoherent” rules about fake videos.

Content moderation on Threads

Since Threads launched in July last year, users have repeatedly questioned its moderation practices. In October, The Washington Post reported that the platform was blocking search terms like “Covid” and “vaccines,” along with “gore,” “nude,” “sex” and “porn.” The same month, Instagram head Adam Mosseri said the block was temporary; however, the company hasn’t lifted it as of today.

Earlier this month, users began seeing fact-check labels on some Threads posts. The company clarified that this is because it matches existing fact checks from Meta’s other properties to posts on Threads. Last year, Meta said it intended to introduce a separate fact-checking program for Threads, but it hasn’t finalized which fact-checkers will be part of it.

Mosseri has stood firm on the decision not to recommend political content or “amplify news” on the platform. However, Meta said last week that the social network’s newly rolled out trending topics feature could include political content as long as it doesn’t break company policy.

Oversight Board co-chair Catalina Botero Marino pointed out that with elections coming up this year in countries like the U.S. and India, advances in AI, and conflicts around the world, content moderation has become harder.

“With conflicts raging in Europe, the Middle East and elsewhere, billions of people heading to the polls in global elections and growing abuse towards marginalized groups online, there is no time to lose. Advances in artificial intelligence can make content moderation even more challenging,” Marino said.

“The Board is dedicated to finding concrete solutions that protect freedom of expression while reducing harm, and we look forward to setting standards that will improve the online experience for millions of Threads users.”

The Oversight Board’s process

The Oversight Board hasn’t changed its process for appealing Meta’s decisions on Threads. A user has to appeal to Meta first; if they are unhappy with the company’s verdict, they then have 15 days from receiving it to appeal to the board.

The board can take up to 90 days from the date of appeal to review a decision. Both the board and Meta have been criticized in the past for slow responses, but those timelines remain unchanged for Threads at the moment.

Notably, the board can issue both recommendations and decisions. While recommendations are not binding, Meta is obliged to follow the board’s rulings.