Meta’s planned Twitter killer, Threads, isn’t yet publicly available but it already looks like a privacy nightmare.
Mandatory privacy disclosures required on iOS show the app may collect highly sensitive information about users in order to profile their digital activity — including health and financial data, precise location, browsing history, contacts, search history and other sensitive information.
Given that Meta, the app's developer and the company formerly known as Facebook, makes its money from tracking and profiling web users to sell their attention via its behavioral advertising microtargeting tools, this is hardly surprising. But it does raise questions over whether Threads will be able to launch in the European Union, where the legal basis Meta had claimed for processing Facebook users' personal data (performance of a contract) was found unlawful at the start of this year.
Meta has since switched to a claim of legitimate interest for this data-for-ads processing. But, earlier this week, the bloc's top court piled more regional woe on Meta via a judgement on a German case referral, in which the Court said this legal basis is not appropriate for running Meta's behavioral ads either and that consent needs to be sought. Under current EU law, sensitive information such as health data also requires an even higher standard of explicit consent to be legally processed under the General Data Protection Regulation. So Meta would need to ask for and obtain specific permission to process sensitive data such as health info.
Additionally, incoming EU regulations ban use of sensitive data for ads entirely and may require explicit consent for tech giants to combine data for ad profiling (see: the Digital Services Act and Digital Markets Act). So there’s even more regional legal uncertainty looming on the horizon for Meta’s people farming business. (Designated gatekeepers must be compliant with the DMA by next spring; while so-called very large online platforms need to meet obligations under the DSA by August 25.)
Currently, the adtech giant does not even offer users a general, up-front choice to deny its tracking and profiling, let alone explicitly ask if it can share data on your health conditions so advertisers can try to sell you diet pills or whatever. And with even harder limits on surveillance ads coming down the pipe in the EU an app that proposes to track everything to maximize its appeal to advertisers will be a tough sell to regional regulators.
Plus — as if that wasn't enough — Meta was recently hit with an order to stop sending EU users' data to the US for processing and fined almost $1.3BN for breaching the GDPR's requirements on data exports. That order is specific to Facebook but, in principle, the same requirement could be applied to other Meta services that don't adequately protect Europeans' data over the pond (such as by using a zero-knowledge architecture with end-to-end encryption). And, clearly, Threads isn't going to offer users that kind of privacy.
Bringing Meta's surveillance ads business into compliance with EU law is going to require a sea change in how it operates — one which does not appear to be its plan with Threads, given it's presenting more of the same data-grabbing attention farming that's earned Mark Zuckerberg's empire such a toxic rep it had to undergo an expensive corporate rebrand to Meta in recent years.
Whether the rebranding has worked to detoxify Meta's corporate image looks debatable given it's opting to attach Threads to Instagram's brand, rather than explicitly calling it a Meta app (the developer listed on the App Store is "Instagram Inc" and the text description describes the app as "Instagram's text-based conversation app"). Though that choice might have more to do with Meta seeing it as the best strategy for quickly building up a Threads user base: if it can push Instagram's large and engaged community to insta-adopt what it's framing as a sister "text" app, the latter can hit the ground running.
One thing is clear: Threads won’t be doing any running in the EU yet. And possibly never. At least not unless Meta radically reforms its approach to user choice over tracking.
Yesterday the Irish Independent reported the app won't launch in the EU, quoting Meta's lead regional data protection supervisor, the Irish DPC, as saying it had been in contact with Meta about the service and that it wouldn't launch "at this point".
While today the Guardian — citing sources inside Meta — has reported the company delayed an EU launch of Threads over legal uncertainty around data use attached to the aforementioned DMA’s limits on sharing user data across different platforms.
A Meta spokesman did not respond to our questions about whether it plans to launch Threads in the EU or not.
But the DPC clarified to TechCrunch that it has not prevented Meta launching Threads, based on its role enforcing compliance with the GDPR, saying the company has "no plans to launch in the EU yet". So it seems there has not been any active regulatory intervention to block a launch at this stage. Rather Meta appears concerned over the legal risk it could rack up if it goes ahead with a launch when it's set to be subject to the DMA in a few months' time. (Earlier this week the company informed the EU it believes the incoming ex ante antitrust regime does apply to its business — but compliance isn't required until six months after the official EU gatekeeper designations.)
The new regulation will be enforced centrally by the European Commission, rather than by Member State level authorities such as the Irish DPC. So expectations are for a change of gear in the bloc toward enforcement against digital giants — and that paradigm shift also cranks up the legal uncertainty for Meta inside the EU.
Notably Threads is due to launch in the U.K. on Thursday — where there’s a different regulatory picture since the market no longer falls under EU law following the Brexit referendum vote to leave the bloc.
The U.K.'s current data protection regime is still derived from the GDPR so, technically speaking, the same legal requirements around processing personal data do also apply there. However the country's data protection watchdog, the ICO, has been infamously inactive on systemic breaches by the surveillance advertising industry. So Meta may be comfortable with the level of legal risk its business faces in Brexit Britain. And while the U.K. government recently revived a shelved plan to enact its own ex ante antitrust reform targeted at digital giants, it's likely years away from having comparable legislation to the EU's DMA on its own statute books.
The U.K. government has also signalled a plan to water down domestic data protection standards, under a post-Brexit data reform bill, which also looks set to erode the independence of the ICO and may make the watchdog even more toothless than it is already when it comes to tackling data protection abuses.
In the EU, meanwhile, Meta was fined over $410 million in January over its lack of a valid legal basis under the GDPR to run behavioral ads on Facebook and Instagram — just the latest in a string of chunky penalties it's been hit with for breaching the GDPR. Whereas the last time the ICO fined Meta was in the wake of the Cambridge Analytica scandal, when the company was still called Facebook.
Under the DMA, centrally enforced penalties can scale up to 10% of global annual turnover — which is considerably higher than the theoretical maximum DPAs can sanction data controllers for breaches of the GDPR (which tops out at just 4%).
In the event, fines on tech giants found to have breached the EU’s data protection regulation have remained a fraction of the maximum, including in the case of Meta.