Twitter expands safety policy, bans posting images of people without their consent

Twitter updated its private information safety policy this morning to ban sharing images or videos of private individuals without their consent. The platform already prohibited users from sharing others’ personal information without permission, such as their address or location, identity documents, non-public contact information, financial information, or medical data. This update makes those anti-harassment and anti-doxxing policies more robust.

This doesn’t mean that Twitter will require consent from every individual in a photo or video before it’s posted. But if a person depicted asks for the media to be taken down, Twitter will remove it.

“When we are notified by individuals depicted, or by an authorized representative, that they did not consent to having their private image or video shared, we will remove it,” Twitter wrote in its update. “This policy is not applicable to media featuring public figures or individuals when media and accompanying Tweet text are shared in the public interest or add value to public discourse.”

But in the case of public figures, Twitter clarified that it may remove content intended for harassment in line with its existing policies against abusive behavior, which also ban sharing non-consensual nude images. The company also says that when deciding whether to remove content about public figures, it will assess whether this information is already available in other public media, like TV and newspapers.

Still, many Twitter users took to the platform to voice concerns about the new policy. They wondered, for example, whether it meant you couldn’t post a photo of a crowd at a football game without getting consent from every individual, and worried that the policy could be used to silence users. Hours later, Twitter Safety added to the announcement thread to address those concerns.

“Let’s unpack what this means,” the account wrote. “This policy update will help curb the misuse of media to harass, intimidate, and reveal the identities of private individuals, which disproportionately impacts women, activists, dissidents, and members of minority communities.”

Twitter went on to explain that images and videos showing people participating in large events wouldn’t violate the policy. When media is reported by someone in the image, or by an authorized representative of that person, Twitter will consider whether the tweet “adds value to the public discourse” before deciding whether to remove the post. Still, critics responded with concerns about how Twitter decides what content adds value.

“Context matters. Our existing private information policy includes many exceptions in order to enable robust reporting on newsworthy events and conversations that are in the public interest,” Twitter Safety wrote.

Anyway, in case you didn’t use the internet yesterday, Jack Dorsey stepped down as CEO of Twitter. But there’s no indication that this policy change is related to his departure.

Updated 11/30/21, 1:55 p.m. EST with follow-up tweets from Twitter Safety.