Facebook plans customizable filters for nudity and violence

Facebook wants to give you the power to define what is and isn’t objectionable, and to let your choice influence the local defaults for those who don’t set their own. You’ll eventually be able to select how much nudity, violence, graphic content and profanity you’re comfortable seeing.

Mark Zuckerberg revealed this massive shift in Facebook’s Community Standards policy today in his 5,000-word humanitarian manifesto; you can read our highlights and analysis of it here.

Currently, Facebook relies on a one-size-fits-most set of standards for what’s allowed on the network; the only exception is that it abides by local censorship laws. That’s led to trouble for Facebook, as newsworthy historical photos that include nudity, and citizen-journalism accounts of police violence, have been wrongly removed, then restored after media backlash or executive review.

Zuckerberg explains the forthcoming policy, writing:

“The idea is to give everyone in the community options for how they would like to set the content policy for themselves. Where is your line on nudity? On violence? On graphic content? On profanity? What you decide will be your personal settings. We will periodically ask you these questions to increase participation and so you don’t need to dig around to find them. For those who don’t make a decision, the default will be whatever the majority of people in your region selected, like a referendum. Of course you will always be free to update your personal settings anytime.

With a broader range of controls, content will only be taken down if it is more objectionable than the most permissive options allow.”

This approach allows Facebook to give vocal, engaged users choice, while establishing reasonable localized norms, without ever forcing specific policies on anyone or requiring all users to configure complicated settings.
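
To make the mechanics concrete, here is a minimal sketch of how such per-user policy resolution could work, based only on the quoted description. Everything below is an assumption for illustration: the category names, the 0–3 tolerance scale, and the function names are ours, not Facebook’s.

```python
from collections import Counter

# Hypothetical tolerance scale per category: 0 = hide everything sensitive,
# higher values = more permissive. Purely illustrative, not Facebook's API.
CATEGORIES = ("nudity", "violence", "graphic", "profanity")
MAX_LEVEL = 3  # the most permissive option on this made-up scale

def regional_default(settings_in_region):
    """Majority choice per category among a region's users ("like a referendum")."""
    return {
        cat: Counter(s[cat] for s in settings_in_region).most_common(1)[0][0]
        for cat in CATEGORIES
    }

def effective_settings(user_settings, default):
    """A user's own choices win; otherwise fall back to the regional default."""
    return user_settings if user_settings is not None else default

def visible_to(post_scores, settings):
    """Show a post to a user only if it stays within their tolerance everywhere."""
    return all(post_scores[cat] <= settings[cat] for cat in CATEGORIES)

def taken_down(post_scores):
    """Remove a post outright only if it exceeds the most permissive option."""
    return any(post_scores[cat] > MAX_LEVEL for cat in CATEGORIES)

# Example: three users set thresholds; a fourth inherits the regional majority.
region = [
    {"nudity": 0, "violence": 1, "graphic": 1, "profanity": 2},
    {"nudity": 0, "violence": 2, "graphic": 1, "profanity": 2},
    {"nudity": 1, "violence": 2, "graphic": 1, "profanity": 3},
]
default = regional_default(region)  # nudity 0, violence 2, graphic 1, profanity 2
post = {"nudity": 0, "violence": 2, "graphic": 1, "profanity": 1}

print(taken_down(post))                                     # False: within bounds
print(visible_to(post, effective_settings(None, default)))  # True for this default
```

The property Zuckerberg describes lives in the last function: a post is removed platform-wide only when it exceeds the most permissive setting; everything below that line is a per-viewer visibility decision.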

To classify potentially objectionable content, Facebook will lean more heavily on artificial intelligence, which already delivers 30 percent of all content flags to its human reviewers. Over time, Zuckerberg hopes Facebook’s AI will learn to make nuanced distinctions, such as between terrorist propaganda and a news report about a terrorist attack.
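
That ratio, AI supplying a share of flags while humans make the final call, suggests a pipeline where model confidence triggers review rather than removal. A toy sketch under assumed names (the classifier, threshold and queue are all hypothetical):

```python
import random
from dataclasses import dataclass

@dataclass
class Flag:
    content_id: int
    source: str  # "ai" or "user_report"

FLAG_THRESHOLD = 0.8  # assumed confidence above which the model flags content

def toy_model_score(content_id: int) -> float:
    """Stand-in for a real classifier; returns a deterministic pseudo-score."""
    random.seed(content_id)
    return random.random()

def route_to_review(content_ids, user_reports):
    """Merge AI flags with user reports into one human review queue."""
    queue = [Flag(cid, "ai") for cid in content_ids
             if toy_model_score(cid) >= FLAG_THRESHOLD]
    queue += [Flag(cid, "user_report") for cid in user_reports]
    return queue

queue = route_to_review(content_ids=range(100), user_reports=[3, 14, 15])
ai_share = sum(f.source == "ai" for f in queue) / len(queue)
print(f"{ai_share:.0%} of flags in this toy queue came from the AI")
```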

There are still plenty of questions about how this system will work. For example, what happens with teens? Do they get stricter defaults or the same controls, and can parents select their kids’ settings? We also don’t know when this will launch, though Zuckerberg implied it would all take time.

This new system of governance could make Facebook’s policies feel less overt, as they should align with local norms. It might also be a boon to certain content creators, such as photographers or painters who make nude art, videographers who capture action or war, or unfiltered pundits with niche views.

Personalized and localized site governance might prove more democratic than treating Facebook as one giant country. Its 2012 experiment with letting people vote on policies failed and was scrapped: for the majority decision to be binding, 30 percent of its users had to vote on long, complicated documents of changes. With roughly a billion users at the time, that quorum worked out to about 300 million votes, but the final vote drew just 619,000. Now users who don’t “vote” on their settings receive the local defaults, “like a referendum” in a U.S. state.

[Photo: Mark Zuckerberg talking about his letter to the community at Facebook’s internal quarterly all-company meeting]

Zuckerberg also outlined several other product development plans. Facebook hopes to add more suggestions for local Groups to tie users more deeply into their communities. It will also give Group leaders more tools, akin to what Facebook provides Page owners. Zuckerberg didn’t provide specifics, but those features might include analytics about what content is engaging, the ability to set more types of admins and moderators, or the option to add outside app functionality.

On safety and information, Facebook wants to expand AI detection of bullying and self-harm, and potentially let people report mental health issues, disease or crime. And to fight polarization and sensationalism, not just objectively fake news, it wants to present users with a range of sources across the political spectrum on a given topic. That could come through showing Related Articles on links that draw on sources from other parts of the spectrum.

The central theme of these changes is Facebook empowering users to define their own experience. It wants to see the world move toward a supportive, safe, informed, civically engaged and inclusive global community. But it still sees itself as just a tool, with the direction of progress defined by those who wield it.