The European Union’s lead data protection supervisor has recommended that a ban on targeted advertising based on tracking internet users’ digital activity be included in a major reform of digital services rules which aims to increase operators’ accountability, among other key goals.
The European Data Protection Supervisor (EDPS), Wojciech Wiewiórowski, made the call for a ban on surveillance-based targeted ads in reference to the Commission’s Digital Services Act (DSA) — following a request for consultation from EU lawmakers.
The DSA legislative proposal was introduced in December, alongside the Digital Markets Act (DMA) — kicking off the EU’s (often lengthy) co-legislative process, which involves debate and negotiations in the European Parliament and Council on amendments before any final text can be agreed for approval. This means battle lines are being drawn to try to influence the final shape of the biggest overhaul to pan-EU digital rules for decades — with everything to play for.
The intervention by Europe’s lead data protection supervisor calling for a ban on targeted ads is a powerful pre-emptive push against attempts to water down legislative protections for consumer interests.
The Commission had not gone so far in its proposal — but big tech lobbyists are certainly pushing in the opposite direction, so the EDPS taking a strong line here looks important.
In his opinion on the DSA the EDPS writes that “additional safeguards” are needed to supplement risk mitigation measures proposed by the Commission — arguing that “certain activities in the context of online platforms present increasing risks not only for the rights of individuals, but for society as a whole”.
Online advertising, recommender systems and content moderation are the areas the EDPS is particularly concerned about.
“Given the multitude of risks associated with online targeted advertising, the EDPS urges the co-legislators to consider additional rules going beyond transparency,” he goes on. “Such measures should include a phase-out leading to a prohibition of targeted advertising on the basis of pervasive tracking, as well as restrictions in relation to the categories of data that can be processed for targeting purposes and the categories of data that may be disclosed to advertisers or third parties to enable or facilitate targeted advertising.”
It’s the latest regional salvo aimed at mass-surveillance-based targeted ads after the European Parliament called for tighter rules back in October — when it suggested EU lawmakers should consider a phased-in ban.
Again, though, the EDPS is going a bit further here in actually calling for one. (Facebook’s Nick Clegg will be clutching his pearls.)
More recently, the CEO of European publishing giant Axel Springer, a longtime co-conspirator of adtech interests, went public with a (rather protectionist-flavored) rant about U.S.-based data-mining tech platforms turning citizens into “the marionettes of capitalist monopolies” — calling for EU lawmakers to extend regional privacy rules by prohibiting platforms from storing personal data and using it for commercial gain at all.
Apple CEO Tim Cook also took to the virtual stage of a (usually) Brussels-based conference last month to urge Europe to double down on enforcement of its flagship General Data Protection Regulation (GDPR).
In the speech, Cook warned that the adtech “data complex” is fuelling a social catastrophe by driving the spread of disinformation as it works to profit off of mass manipulation. He went on to urge lawmakers on both sides of the pond to “send a universal, humanistic response to those who claim a right to users’ private information about what should not and will not be tolerated”. So it’s not just European companies (and institutions) calling for pro-privacy reform of adtech.
The iPhone maker is preparing to introduce stricter limits on tracking on its smartphones by making apps ask users for permission to track, instead of just grabbing their data — a move that’s naturally raised the hackles of the adtech sector, which relies on mass surveillance to power “relevant” ads.
Hence the adtech industry has resorted to crying “antitrust” as a tactic to push competition regulators to block platform-level moves against its consentless surveillance. And on that front it’s notable that the EDPS’ opinion on the DMA, which proposes extra rules for intermediating platforms with the most market power, reiterates the vital links between competition, consumer protection and data protection law — saying these three are “inextricably linked policy areas in the context of the online platform economy”; and that there “should be a relationship of complementarity, not a relationship where one area replaces or enters into friction with another”.
Wiewiórowski also takes aim at recommender systems in his DSA opinion — saying these should not be based on profiling by default to ensure compliance with regional data protection rules (where privacy by design and default is supposed to be the legal default).
Here too he calls for additional measures to beef up the Commission’s legislative proposal — with the aim of “further promot[ing] transparency and user control”.
This is necessary because such systems have “significant impact”, the EDPS argues.
The role of content recommendation engines in driving internet users toward hateful and extremist points of view has long been a subject of public scrutiny. Back in 2017, for example, U.K. parliamentarians grilled a number of tech companies on the topic — raising concerns that AI-driven tools, engineered to maximize platform profit by increasing user engagement, risked automating radicalization, causing damage not just to the individuals who become hooked on the hateful views the algorithms feed them but also cascading knock-on harms for all of us, as societal cohesion is eaten away in the name of keeping the eyeballs busy.
Yet years on, little information is available on how such algorithmic recommender systems work, because the private companies that operate and profit off these AIs shield their workings as proprietary business secrets.
The Commission’s DSA proposal takes aim at this sort of secrecy as a bar to accountability — with its push for transparency obligations. The proposed obligations (in the initial draft) include requirements for platforms to provide “meaningful” criteria used to target ads; and explain the “main parameters” of their recommender algorithms; as well as requirements to foreground user controls (including at least one “nonprofiling” option).
However, the EDPS wants regional lawmakers to go further in the service of protecting individuals from exploitation (and society as a whole from the toxic byproducts that flow from an industry based on harvesting personal data to manipulate people).
On content moderation, Wiewiórowski’s opinion stresses that this should “take place in accordance with the rule of law”, though the Commission draft has favored leaving it to platforms to interpret the law.
“Given the already endemic monitoring of individuals’ behaviour, particularly in the context of online platforms, the DSA should delineate when efforts to combat ‘illegal content’ legitimise the use of automated means to detect, identify and address illegal content,” he writes, in what looks like a tacit recognition of recent CJEU jurisprudence in this area.
“Profiling for purposes of content moderation should be prohibited unless the provider can demonstrate that such measures are strictly necessary to address the systemic risks explicitly identified by the DSA,” he adds.
The EDPS has also suggested minimum interoperability requirements for very large platforms and for those designated as “gatekeepers” under the DMA, and urges lawmakers to work to promote the development of technical standards to help with this at the European level.
On the DMA, he also urges amendments to ensure the proposal “complements the GDPR effectively”, as he puts it, calling for “increasing protection for the fundamental rights and freedoms of the persons concerned, and avoiding frictions with current data protection rules”.
Among the EDPS’ specific recommendations are: That the DMA makes it clear that gatekeeper platforms must provide users with easier and more accessible consent management; clarification to the scope of data portability envisaged in the draft; and rewording of a provision that requires gatekeepers to provide other businesses with access to aggregated user data — again with an eye on ensuring “full consistency with the GDPR”.
The opinion also raises the issue of the need for “effective anonymisation” — with the EDPS calling for “re-identification tests when sharing query, click and view data in relation to free and paid search generated by end users on online search engines of the gatekeeper”.
ePrivacy reform emerges from stasis
Wiewiórowski’s contributions to shaping incoming platform regulations come on the same day that the European Council has finally reached agreement on its negotiating position for a long-delayed EU reform effort around existing ePrivacy rules.
In a press release announcing the development, the Commission writes that Member States agreed on a negotiating mandate for revised rules on the protection of privacy and confidentiality in the use of electronic communications services.
“These updated ‘ePrivacy’ rules will define cases in which service providers are allowed to process electronic communications data or have access to data stored on end-users’ devices,” it writes, adding: “Today’s agreement allows the Portuguese presidency to start talks with the European Parliament on the final text.”
Reform of the ePrivacy directive has been stalled for years as conflicting interests locked horns — putting paid to the (prior) Commission’s hopes that the whole effort could be done and dusted in 2018. (The original ePrivacy reform proposal came out in January 2017; four years later the Council has finally settled on its negotiating mandate.)
The fact that the GDPR was passed first appears to have upped the stakes for data-hungry ePrivacy lobbyists — in both the adtech and telco space (the latter having a keen interest in removing existing regulatory barriers on comms data so that telcos can exploit the vast troves of user data which internet giants running rival messaging and VoIP services have long been able to mine).
There’s a concerted effort to try to use ePrivacy to undo consumer protections baked into GDPR — including attempts to water down protections provided for sensitive personal data. So the stage is set for an ugly rights battle as negotiations kick off with the European Parliament.
Metadata and cookie consent rules are also bound up with ePrivacy, so there’s all sorts of messy and contested issues on the table here.
Digital rights advocacy group Access Now summed up the ePrivacy development by slamming the Council for “hugely” missing the mark.
“The reform is supposed to strengthen privacy rights in the EU [but] States poked so many holes into the proposal that it now looks like French Gruyère,” said Estelle Massé, senior policy analyst at Access Now, in a statement. “The text adopted today is below par when compared to the Parliament’s text and previous versions of government positions. We lost forward-looking provisions for the protection of privacy while several surveillance measures have been added.”
The group said it will be pushing to restore requirements for service providers to protect online users’ privacy by default and for the establishment of clear rules against online tracking beyond cookies, among other policy preferences.
The Council, meanwhile, appears to be advocating for a highly diluted (and so probably useless) flavor of “do not track” — by suggesting users should be able to give consent to the use of “certain types of cookies by whitelisting one or several providers in their browser settings”, per the Commission.
“Software providers will be encouraged to make it easy for users to set up and amend whitelists on their browsers and withdraw consent at any moment,” it adds in its press release.
Clearly the devil will be in the detail of the Council’s position there. (The European Parliament has, by contrast, previously clearly endorsed a “legally binding and enforceable” Do Not Track mechanism for ePrivacy so, again, the stage is set for clashes.)
Encryption is another likely bone of ePrivacy contention.
As security and privacy researcher Dr. Lukasz Olejnik noted back in mid-2017, the European Parliament strongly backed end-to-end encryption as a means of protecting the confidentiality of comms data — saying then that Member States should not impose any obligations on service providers to weaken strong encryption.
So it’s notable that the Council does not have much to say about e2e encryption — at least in the PR version of its public position. (A line in the release that runs: “As a main rule, electronic communications data will be confidential. Any interference, including listening to, monitoring and processing of data by anyone other than the end-user will be prohibited, except when permitted by the ePrivacy regulation” is hardly reassuring, either.)
It certainly looks like a worrying omission given recent efforts at the Council level to advocate for “lawful” access to encrypted data. Digital and human rights groups will be buckling up for a fight.