Europe seizes on social media’s purging of Trump to bang the drum for regulation

Big tech’s decision to pull the plug on President Donald Trump’s presence on their platforms, following his supporters’ attack on the US Capitol last week, has been seized on in Europe as proof — if proof were needed — that laws have not kept pace with tech market power, and that platform giants must face consequences over the content they amplify and monetize.

Writing in Politico, the European Commission’s internal market commissioner, Thierry Breton, dubs the January 6 strike at the heart of the US political establishment social media’s ‘9/11’ moment — aka, the day the whole world woke up to the real-world impact of unchecked online hate and lies.

Since then Trump has been booted from a number of digital services, and the conservative social media app Parler has also been ejected from the App Store and Google Play over a failure to moderate violent threats, after Trump supporters flocked to the app in the wake of Facebook’s and Twitter’s crackdown.

At the time of writing, Parler is also poised to be booted by its hosting provider, AWS, while Stripe has reportedly pulled the plug on Trump’s ability to use its payment tools to fleece supporters. (Although when this reporter asked in November whether Trump was breaching its terms of service by using its payment tools for his ‘election defense fund’, Stripe ignored TechCrunch’s emails…)

“If there was anyone out there who still doubted that online platforms have become systemic actors in our societies and democracies, last week’s events on Capitol Hill is their answer. What happens online doesn’t just stay online: It has — and even exacerbates — consequences ‘in real life’ too,” Breton writes.

“Last week’s insurrection marked the culminating point of years of hate speech, incitement to violence, disinformation and destabilization strategies that were allowed to spread without restraint over well-known social networks. The unrest in Washington is proof that a powerful yet unregulated digital space — reminiscent of the Wild West — has a profound impact on the very foundations of our modern democracies.”

The European Commission proposed a major update to the rules for digital services and platform giants in December, when it laid out the Digital Services Act (DSA) and the Digital Markets Act — saying it’s time to level the regulatory playing field by ensuring that content and activity that’s illegal offline is similarly sanctioned online.

The Commission’s proposal also seeks to address the market power of tech giants with proposals for additional oversight and extra rules for the largest platforms that have the potential to cause the greatest societal harm.

Unsurprisingly, then, Breton has seized on the chaotic scenes in Washington to push this already-formed tech policy plan — with his eye on a domestic audience of European governments and elected members of the European Parliament whose support is needed to pass the legislation and reboot the region’s digital rules.

“The fact that a CEO can pull the plug on POTUS’s loudspeaker without any checks and balances is perplexing. It is not only confirmation of the power of these platforms, but it also displays deep weaknesses in the way our society is organized in the digital space,” he warns.

“These last few days have made it more obvious than ever that we cannot just stand by idly and rely on these platforms’ good will or artful interpretation of the law. We need to set the rules of the game and organize the digital space with clear rights, obligations and safeguards. We need to restore trust in the digital space. It is a matter of survival for our democracies in the 21st century.”

The DSA will force social media to clean up its act on content and avoid the risk of arbitrary decision-making by giving platforms “clear obligations and responsibilities to comply with these laws, granting public authorities more enforcement powers and ensuring that all users’ fundamental rights are safeguarded”, Breton goes on to argue.

The commissioner also addresses US lawmakers directly — calling for Europe and the US to join forces on Internet regulation and engage in talks aimed at establishing what he describes as “globally coherent principles”, suggesting the DSA as a starting point for discussions. So he’s not wasting the opportunity of #MAGA-induced chaos to push a geopolitical agenda for EU tech policy too.

Last month the Commission signalled a desire to work with the incoming Biden administration on a common approach to tech governance, saying it hoped US counterparts would work with it to shape global standards for technologies like AI and to force big tech to be more responsible, among other areas. And recent events in Washington do seem to be playing into the Commission’s hands — although it remains to be seen how Biden’s team will approach regulating big tech.

“The DSA, which has been carefully designed to answer all of the above considerations at the level of our Continent, can help pave the way for a new global approach to online platforms — one that serves the general interest of our societies. By setting a standard and clarifying the rules, it has the potential to become a paramount democratic reform serving generations to come,” Breton concludes.

Twitter’s decision to (finally) pull the plug on Trump also caught the eye of UK minister Matt Hancock, the former secretary of state for the digital brief (now the health secretary). Speaking to the BBC this weekend, he suggested the unilateral decision “raises questions” about how big tech is regulated — and that “consequences” would follow.

“The scenes, clearly encouraged by President Trump — the scenes at the Capitol — were terrible — and I was very sad to see that because American democracy is such a proud thing. But there’s something else that has changed, which is that social media platforms are making editorial decisions now. That’s clear because they’re choosing who should and shouldn’t have a voice on their platform,” he told The Andrew Marr Show.

The BBC reports that Hancock also told Sky News Twitter’s ban on Trump means social media platforms are taking editorial decisions — which he said “raises questions about their editorial judgements and the way that they’re regulated”.

Hancock’s remarks are noteworthy because back in 2018, during his time as digital minister, he said the government would legislate to introduce a statutory code of conduct on social media platforms forcing them to act against online abuse.

More than two years later, the UK’s safety-focused plan to regulate the Internet has yet to be put before parliament — but late last year ministers committed to introducing an Online Safety Bill this year.

Under the plan, the UK’s media regulator, Ofcom, will gain new powers to oversee tech platforms — including the ability to levy fines of up to 10% of a company’s annual turnover for non-compliance with a safety-focused duty of care.

The proposal covers a wide range of digital services, not just social media. Larger platforms are also slated to have the greatest responsibility for moderating content and activity. And — at least in its current form — the proposed law is intended to apply not just to content that’s illegal under UK law but also the fuzzier category of ‘harmful’ content.

That’s something the European Commission proposal has steered clear of — with more subjective issues like disinformation set to be tackled via a beefed-up (but still voluntary) code of practice, instead of being baked into digital services legislation. So online speech looks set to be one area of looming regulatory divergence in Europe, with the UK now outside the bloc.

Last year, the government said larger social media platforms — such as Facebook, TikTok, Instagram and Twitter — are likely to “need to assess the risk of legal content or activity on their services with ‘a reasonably foreseeable risk of causing significant physical or psychological harm to adults’” under the forthcoming Online Safety Bill.

“They will then need to make clear what type of ‘legal but harmful’ content is acceptable on their platforms in their terms and conditions and enforce this transparently and consistently,” it added, suggesting the UK will in fact legislate to force platforms to make ‘editorial’ decisions.

The “consequences” Hancock suggests are coming for tech platforms thus look rather akin to the ‘editorial’ decisions they have been making in recent days.

The uncomfortable distinction he seems to be drawing, though, is between tech platforms with massive unilateral power to silence the US president at a stroke, at a point of their own choosing, and tech platforms made to comply with a pre-defined, rules-based order set by legislators and regulators.

In additional high-level responses on Monday, the German chancellor, Angela Merkel, described Twitter’s shuttering of Trump’s account as “problematic”, per Bloomberg, which quotes her chief spokesman, Steffen Seibert, making the comment at a regular news conference in Berlin. Rights like the freedom of speech “can be interfered with, but by law and within the framework defined by the legislature — not according to a corporate decision”, the spokesman added.

Similar views were espoused by the French finance minister, Bruno Le Maire, who said it should be for states and the justice system to regulate big tech, not for the “digital oligarchy” to regulate itself.

He also described regulation of big tech as “necessary”.

This report was updated with additional remarks by the German chancellor and France’s finance minister.