Europe names 19 platforms that must report algorithmic risks under DSA

The European Union has confirmed the names of the 19 platforms that will face the strictest level of regulation under its recently rebooted and expanded e-commerce rules, aka the Digital Services Act (DSA).

The list is a mix of familiar digital services, from social media apps to search engines and app stores, with no real surprises. The lion’s share of the regulated platforms are developed by U.S.-based, for-profit firms, with a few international (mostly European) players in the mix, and one non-profit: the online encyclopedia Wikipedia.

The full list of 19 platforms which are being designated in this first wave is as follows:

  • Alibaba AliExpress
  • Amazon Store
  • Apple AppStore
  • Bing
  • Booking.com
  • Facebook
  • Google Play
  • Google Maps
  • Google Search
  • Google Shopping
  • Instagram
  • LinkedIn
  • Pinterest
  • Snapchat
  • TikTok
  • Twitter
  • Wikipedia
  • YouTube
  • Zalando

The listed platforms have been confirmed as meeting the criteria to be designated as either very large online platforms (VLOPs) or very large online search engines (VLOSEs), in the regulation’s jargon. (Actually, just two VLOSEs are being designated at this point, Bing and Google Search; the other 17 are all VLOPs.)

A VLOP or VLOSE designation carries extra requirements to assess and mitigate systemic risks attached to the use of algorithms and AIs, meaning the platforms must be proactive about analyzing and reporting potential issues related to the operation of technologies like content ranking tools and recommender systems.

The EU’s idea is to use mandatory algorithmic transparency requirements to drive accountability, meaning regulated platforms won’t be able to turn a blind eye to AI-amplified harms: The law also requires they put in place “reasonable, proportionate and effective mitigation measures” for identified risks, with their reporting and mitigation plans subject to independent audit and oversight by the European Commission (with the support of a newly opened European Centre for Algorithmic Transparency). Penalties for non-compliance can scale up to 6% of global annual turnover.

Risks regulated VLOPs/VLOSEs must consider include AI-driven or algorithmic harms to fundamental rights like freedom of expression; civic discourse and electoral processes; public security and public health; gender-based violence; child safety; and mental health.

The DSA also imposes some limits on microtargeted advertising, including a prohibition on processing sensitive data for ad targeting and on using children’s information for ads, while marketplaces face requirements to check the identity of sellers.

Users of VLOPs/VLOSEs must also be provided with clear information on why they are recommended certain content, and have the right to opt out of recommender systems based on profiling, applying pressure to ad-funded business models, such as Meta’s, which hinge on tracking and profiling web users.

Elsewhere, the regulation demands platforms have clear T&Cs — and enforce them diligently and non-arbitrarily.

The principal criterion for being regulated as a VLOP/VLOSE is having more than 45 million monthly active users in the EU. Platforms were required to report their user numbers to the EU back in February so the Commission could begin the designation process.

To say that some of the named platforms may be rather unprepared for this special regime of DSA compliance is an understatement: The Commission already warned Twitter last November of the “huge work” it faces to comply with the bloc’s rules.

Since then, Twitter’s erratic owner, Elon Musk, has only continued to channel abject chaos. He has, for example, dismantled the platform’s legacy verification system, which had granted Blue Checks to notable accounts, and replaced that long-standing authenticity signal with a scammers’ paradise: Anyone willing to pay Twitter $8 a month can now buy a faux Blue Check that mimics the look of the old system but does not involve any meaningful verification check.

Musk has also demonstrated a penchant for making arbitrary and even spiteful decisions, banning users at whim and removing some legacy verifications sooner than others (apparently just for the lols), while also paying for a handful of celebrities to retain their Blue Checks, creating the (false) impression that they have paid Musk for the badge. None of this seems likely to sit well with DSA requirements for plain and fair dealing.

Another requirement, that VLOPs provide a mechanism for users to flag illegal content and act on such notifications expeditiously, similarly looks tricky for Twitter under Musk, given the platform is already facing enforcement action from the German government for failing to take down illegal hate speech under the country’s hate speech takedown regime.

The pan-EU DSA also requires VLOPs to support researchers by, in the first instance, providing access to publicly available data, and later on (via delegated act) establishing a special mechanism for vetted researchers so they can conduct research into systemic risks in the EU. And, yet again, Musk has raced in the opposite direction, drastically reducing third-party researchers’ access to platform data by cranking up the cost of using Twitter’s APIs.

It’s almost as if the regulation was drafted with Musk in mind. But of course its drafting long predates the billionaire shit-poster’s tenure at Twitter.

While the DSA certainly has its critics, the prospect of a broader backlash against the EU’s approach to updating its digital rulebook seems rather less likely than it might have been had Musk not taken his accelerated wrecking ball to Twitter over the past half year or so.

In other words, the prospect of responsible, non-arbitrary management being a requirement imposed on major platforms suddenly starts to look kind of necessary, even prescient.

There is still a grace period before these larger platforms are expected to be compliant with the DSA, but it’s now just a matter of months: Firms that have been designated as VLOPs/VLOSEs have four months to comply with their obligations under the regulation, including publishing their first risk assessments. So the 19 listed services (run by a dozen companies plus one non-profit) face a busy summer ahead if they’re to meet the August 25 compliance deadline.

Twitter certainly has its work cut out if it’s to avoid a world of regulatory pain landing on it in the EU, coupled with the existential risk, were Musk to opt to keep flouting the rules, of a regional shutdown being imposed. (Albeit, no political body is going to want to be the one ordering a Twitter shutdown, so there would likely be an extended showdown of financial enforcements prior to such a terminal step.)

VLOPs/VLOSEs must also comply with the general provisions of the DSA, which will apply more broadly to digital services and smaller platforms from early next year, placing obligations on how they must respond to reports of illegal content, as well as in areas like notice and action mechanisms and complaint handling, in addition to transparency obligations and plenty more.

More VLOPs incoming?

While the first wave of VLOPs/VLOSEs doesn’t yet number 20, more could be coming: The Commission has suggested (via Politico) there could be a second batch of VLOP/VLOSE designations in the next few weeks, as it said it will be conducting checks on some other platforms that told it they do not meet the criteria, in order to confirm whether that’s the case.

One type of large digital service that’s currently missing from the list of designations is adult content platforms.

Asked about this during a briefing with journalists, a Commission official confirmed it is engaging with a number of adult content services but declined to specify which platforms it’s seeking more information from at this stage as regards their self-declared user numbers.

They added that the idea is to better understand the methodologies being used for calculating regional usage, saying adult sites that have declared figures to the Commission are largely claiming usage below the 45 million monthly actives threshold.

The EU’s executive declined to provide details of any other platforms it’s looking at for possible designation. However, earlier today, Politico reported remarks by internal market commissioner Thierry Breton which suggest Pornhub is one adult platform being looked at in Brussels as a possible addition to the VLOPs list, along with Airbnb, Spotify and Telegram.

While bounded messaging apps are not obviously in scope of the platform component of the regulation — hence (presumably) the lack of a designation for Meta-owned WhatsApp — a Commission official said the DSA recognizes that messenger services may be architected in such a way that they could be considered platforms, i.e. if content can reach a potentially unlimited number of users.

Telegram’s app, for instance, includes a social media-style channels feature which lets users broadcast content to large numbers of subscribers. However, the company has so far reported a regional user figure to the EU (around 33 million) that falls below the VLOP threshold. And a Commission official told TechCrunch it is in the process of following up to understand how the platform arrived at this figure.

While it remains to be seen how many extra VLOPs may be named in the coming weeks, where designations have been made, the Commission said it will be engaging with all the platforms to assess their readiness for compliance in the coming months.

It added that it will be paying particular attention to child safety, asking companies to prioritize sharing information about how they are preparing to ensure they take steps to protect minors online.

Here it also remains to be seen what that might mean in practical terms. But it could, for example, encourage designated platforms to apply tougher age assurance measures — or even adopt hard-stop age verification technologies — in a bid to shrink their compliance risks in a stated priority area for EU regulators.

In additional remarks, the Commission confirmed that no platforms are falling into a non-responsive/non-cooperative category as yet, with officials saying all have been engaging with its overtures so far. But it will be interesting to see whether early cooperation keeps up (or not). And, indeed, whether the Commission gets the level of detail it requires to smoothly and effectively carry out the intended oversight of platform power.

There are some early signs that self-reporting will throw curveballs for regulators: E-commerce marketplace Zalando, for instance, reported two sets of usage numbers to the Commission, one of which was below the designation threshold and which it apparently argued for (with the Commission taking the opposite view and confirming the e-commerce marketplace as a VLOP).

Commission officials also noted that Apple recently updated its self-reported numbers in a way that breaks down usage for its App Store per device. Again, though, the EU’s executive is treating it as a single platform for the purposes of the DSA, arguing that Apple’s content moderation policies are essentially the same regardless of where its App Store is being accessed.

During the briefing, the Commission reiterated another detail we reported on earlier: That OpenAI’s ChatGPT is not itself currently considered a platform under the DSA definition, and will instead be more directly regulated under the bloc’s incoming AI Act (so likely not for a couple of years). However, the EU says generative AI technologies could end up being regulated indirectly under the DSA, such as if a regulated search engine (like Bing, which is now confirmed as a VLOSE) incorporates the tech into its service.

That would then trigger requirements on Bing’s owner, Microsoft, to ensure it is robustly assessing the systemic risks of integrating generative AI, such as the embedded tendency for these chatbots to hallucinate when they lack the information to provide a correct answer, with the risk that they end up amplifying disinformation in search results, say. So while the regulation was clearly not drafted with the latest AI hype in mind, it could apply some guardrails on major platforms’ use of the tech from as soon as this fall.

What about when we might see the first DSA enforcements landing on rule-breaking VLOPs/VLOSEs? (Twitter would be the obvious early bet for inviting a world of pain.)

There’s no official word from the Commission on when (or where) orders and/or penalties might start to land, beyond the technical compliance deadline of late August/September. Although officials point out that the designation step itself unlocks what they couch as wide-ranging investigatory powers, so EU regulators are sending an early warning shot that they are already armed with tools to start digging into issues of particular concern for the 19 listed platforms.

The bloc is also keen to point out that certain algorithmic transparency requirements will, ultimately, apply to all digital services that get regulated under the DSA, not just the larger subset of VLOPs/VLOSEs, from early next year. Although oversight of those smaller services will fall to Member State level bodies, rather than the Commission itself.

Asked if it’s getting much international inbound over the DSA, as regulators in other jurisdictions consider how to tackle platform power, a Commission official said there has already been plentiful interest, so much so that a dedicated team has been set up to respond to international enquiries.

At the same time, the Commission said it’s keen to avoid what it referred to as “zombie DSAs” popping up elsewhere, in countries that lack a foundational fundamental rights and/or human rights framework to build on.

This report was updated with additional detail following a Commission technical briefing on the VLOP/VLOSE designations.

We also updated with a correction after the Commission initially misstated Amazon Marketplace as designated; in fact, it’s Amazon Store that’s been named a VLOP.