We’re now 61 days away from the U.S. presidential election, and Facebook is once more ramping up its efforts to level the playing field and attempt to keep its platform from being manipulated to influence how people vote.
CEO Mark Zuckerberg today announced a series of new measures, including news that Facebook will block new political and issue ads in the final week of the campaign — although campaigns can still run ads encouraging people to vote, and they can continue running older political ads. Other announcements today detailed more work to counter misinformation and stronger rules against voter suppression, including misleading references to COVID-19 at the polls.
The news today is significant not just because it shows Facebook continuing to take more proactive measures around the election, but also because it marks the company as definitively past the point of presenting itself as an innocent bystander to forces that would have been in play even if Facebook didn’t exist.
“This election is not going to be business as usual,” Zuckerberg wrote in his post announcing the measures. “We all have a responsibility to protect our democracy. That means helping people register and vote, clearing up confusion about how this election will work, and taking steps to reduce the chances of violence and unrest.”
Other measures will include placing its Voter Information Center — a hub for voting information, with deadlines and guides on how to vote by mail and other related details that it announced in August — at the top of Facebook and Instagram “almost every day until the election.” (Originally, the hub was going to be accessible — and somewhat hidden — in the menu; now it’s being moved into a more prominent slot.)
Zuckerberg said the block on new political ads is being put in place because Facebook doesn’t believe there would be enough time to contest any new claims made in those ads — another notable admission.
But while blocking those last-minute political ads is an important move, it’s not a complete block of all political ads. Facebook said that political ads posted more than a week before the election can still stay up, and targeting for those ads can still be adjusted. In other words, they can essentially be re-run, or run as new campaigns.
Zuckerberg’s explanation is that the older ads have had time to be researched. “Those ads will already be published transparently in our Ads Library so anyone, including fact-checkers and journalists, can scrutinize them,” he noted.
But others are already coming out with criticism against the measures, saying they do not go far enough.
“This is a first step, but it doesn’t address the myriad of issues that could actually influence election outcomes,” said Lisa Kaplan, founder and CEO of Alethea Group, a consultancy specializing in disinformation, in a statement. “The fact that you can alter advertising takes the teeth out of the proposed policy changes, and political advertising is only one piece of the disinformation puzzle. What’s more pressing is addressing that individuals have been fed a steady misinformation and disinformation diet for months, and one week of allegedly limiting advertising cannot be expected to make much of a difference. While this is a nice gesture — it’s unlikely to make an impact and leaves voters vulnerable to disinformation.”
The company said that its efforts so far have driven 24 million clicks to voter registration sites, but how those translate into actual registrations is not clear. The company has set a goal of helping 4 million people register to vote, and Zuckerberg himself has donated $300 million to organizations working on that effort.
Other efforts announced today include a number of moves to try to combat misinformation — one of the key ways that Facebook has been leveraged in past elections to influence voting.
Specifically, Facebook said it is extending the window — originally 72 hours — during which it will try to identify and remove false claims about polling conditions, given that many people may vote early this time around.
And given that a lot of misinformation is also shared through direct channels off Facebook itself, the company is also going to limit forwarding on Messenger to stem how content goes viral there. “You’ll still be able to share information about the election, but we’ll limit the number of chats you can forward a message to at one time,” Zuckerberg noted. This will, of course, cut both ways (those trying to spread accurate information might also get dinged), but it follows directly from how Facebook has altered forwarding on WhatsApp around elections in other countries, such as India.
One of the other issues highlighted previously is how the high percentage of people voting by mail might be exploited by candidates who take strong early leads in the in-person count: the worry is that early results get called as victories before the remaining votes are tallied, which could, for example, dissuade people from going to polling stations and voting. Facebook now says it will add labels to posts from candidates and campaigns that try to declare victory before the official calls (but it won’t remove those posts). It is working with Reuters and the National Election Pool to determine more accurate results, it said.
Another big theme in misinformation has been COVID-19 and how scare tactics around it are used to dissuade people from voting. Facebook said it will “remove posts with claims that people will get COVID-19 if they take part in voting,” and will attach links to more accurate information. The rule will also cover ads carrying this message.
Misinformation also spreads on Facebook through false details about polling stations or how voting works — not just attempts to discourage people from going to the polls, but intentional efforts to give specific groups of voters the wrong information about how to vote, such as telling them it’s okay to send in their ballots past the deadline.
All of these policies will work in tandem with how Facebook deals with a completely different threat — one that comes not from candidates and their campaigns but from other actors intent on destabilizing democratic processes, or simply influencing how they go.
Just this week, Facebook took down a network of 13 accounts and two pages sending out misleading claims about political candidates. The company says it is investing more in its security to continue fighting this, but it’s a huge problem, stretching back years to the previous U.S. presidential election, and apparently not going away anytime soon. While the threats were originally identified as coming from countries like Russia, Zuckerberg admitted that “We’re increasingly seeing attempts to undermine the legitimacy of our elections from within our own borders.”
It’s going to be a long 61 days….