EU lawmakers under pressure to fully disclose dealings with child safety tech maker, Thorn

More trouble for European Union lawmakers in a controversial area of tech policymaking — namely the bloc’s proposed legislation to apply surveillance technologies, such as client-side scanning, to digital messaging to try to detect child sexual abuse material (CSAM).
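
For context on the technique at the heart of the proposal: client-side scanning generally means checking content on a user’s device against a database of fingerprints of known illegal material before a message is end-to-end encrypted. The following is a minimal, purely illustrative Python sketch of that matching step; the hash set and function names are hypothetical, and real deployments use proprietary perceptual hashing (which tolerates resizing and re-encoding) rather than the exact cryptographic hash shown here.

```python
import hashlib

# Hypothetical on-device database of fingerprints of known illegal images,
# distributed to clients by a scanning provider. Real systems use perceptual
# hashes, not SHA-256; this only illustrates the matching flow.
KNOWN_HASHES: set[str] = {
    # placeholder: SHA-256 of b"foo", standing in for a real fingerprint
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def matches_known_material(attachment: bytes) -> bool:
    """Check an outgoing attachment against the local hash list."""
    return hashlib.sha256(attachment).hexdigest() in KNOWN_HASHES

def send_message(attachment: bytes) -> None:
    # The scan runs *before* end-to-end encryption is applied, which is why
    # critics describe the approach as embedding surveillance into the client.
    if matches_known_material(attachment):
        print("Match found: content flagged for reporting.")
    else:
        print("No match: message encrypted and sent as normal.")

send_message(b"example attachment bytes")
```

Note that nothing in such a pipeline technically limits the hash list to CSAM, which is central to the scope-creep objections critics raise later in this piece.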

This week the EU’s ombudsman published details of a finding of maladministration she made in December over the Commission’s decision not to release fuller information pertaining to its communications with a child safety tech maker. Last year the Commission released some documents relating to its exchanges with the company in question but denied access to others.

The recommendation follows a June 2022 complaint to the ombudsman, made by a journalist who had requested public access to documents sent to the Commission by Thorn, a US entity that sells AI technologies it claims can detect and remove CSAM.

In her recommendation, the EU’s ombudsman, Emily O’Reilly, urges the Commission to “reconsider its decision with a view to giving significantly increased, if not full, public access to the documents at issue”.

“In light of the related ongoing legislative procedure and the resulting time-sensitivity of this case, the ombudsman urged the Commission to implement her recommendation swiftly,” she adds.

Back in May 2022, the Commission presented its original proposal for a legal framework that could obligate digital services to use automated technologies to detect and report existing or new CSAM, and to identify and report grooming activity targeting kids on their platforms. But the file remains under active negotiation by the EU’s co-legislators, the European Parliament and the Council, a factor the ombudsman flags as an important consideration for applying transparency to drive accountability around EU lawmaking.

Disclosure of the documents at issue “will enable the public to participate more effectively in a decision-making process that will very likely directly affect citizens’ day-to-day life by limiting their right to privacy,” she suggests. “Secondly, transparency will allow the public to scrutinise who and what informed the legislative proposal in question. Stakeholders who actively provide input should not be allowed to do so behind closed doors.”

Critics have suggested the Commission’s controversial message-scanning proposal has been unduly influenced by lobbyists promoting proprietary child safety tech who stand to benefit commercially from laws mandating automated CSAM checks.

Last fall, a seminar organised by the European Data Protection Supervisor also heard a wide range of concerns that the Commission proposal is likely to be both ineffective as a tool to fight child sexual abuse and a major risk to fundamental freedoms in a democratic society.

Since then, parliamentarians have backed a revised approach to combating CSAM that would remove the requirement for messaging platforms to scan end-to-end encrypted messages, among other limits. But EU legislation is a three-way affair, requiring buy-in from the Commission and Council too. So it remains to be seen where the CSAM file will land.

Asked on Monday about the ombudsman’s recommendation that the Commission release more of its exchanges with Thorn, the EU’s executive took until today (Wednesday) to send us a brief reply (see below). Its response suggests it plans to take its time chewing over the ombudsman’s finding of maladministration: it makes a point of flagging a generous deadline for responding to her recommendations, more than two months away. That doesn’t suggest a swift resolution is incoming; it smacks more of a can being kicked down the road.

Here’s the statement, attributed to European Commission home affairs spokesperson Anitta Hipper:

The Commission will provide access to documents as appropriate and within our legal framework. Specifically, as regards the Ombudsman recommendation, the Commission will carefully consider the recommendation of the Ombudsman. A reply is due by March 19.

The legislative proposal has already caused the Commission another internal controversy. Last year it landed in hot water over microtargeted ads its home affairs division was spotted running on the social network X to promote the legislation, leading to a number of data protection complaints as the data used for targeting appeared to include sensitive personal information.

In November, privacy rights group noyb filed a complaint over the ads against the Commission with the EU’s privacy oversight body, the European Data Protection Supervisor.

An internal investigation the Commission opened in the wake of reporting on the episode has, meanwhile, yet to produce any public results. Each time we’ve asked the Commission about the probe it has said it does not have an update.

However, the existence of the internal investigation has had one tangible result: the EU ombudsman declined to open an investigation into the microtargeting following a complaint filed by MEP Patrick Breyer last October. In her response to the MEP, O’Reilly pointed to the Commission’s ongoing probe as grounds not to investigate at that point, writing: “I note that the Commission has explained in the media that internal investigations are ongoing. Therefore, for the moment, I do not find sufficient grounds to open an inquiry.”

At the same time, she did agree to open an investigation into the transfer of two staffers from Europol, a pan-EU law enforcement coordinating agency, to Thorn — following another complaint by Breyer over a potential conflict of interest.

“I have decided to open an inquiry to investigate how Europol dealt with the moves of two former staff members to positions related to combatting online child sexual abuse,” she wrote. “As a first step, I have decided that it is necessary to inspect certain documents held by Europol related to these post-service activities. I expect to receive these documents by January 15, 2024.”

It remains to be seen what the ombudsman’s inquiry into the former Europol staffers’ moves to Thorn will conclude. (But there is, perhaps, no small irony that additional controversy around the Commission’s message-scanning proposal is being stoked by access to, or the lack of it, ‘private’ comms passing between EU institutions and industry lobbyists. Possibly there’s a message in there for policymakers, if only they could read it.)

We reached out to Thorn but it did not respond to a request for comment about the ombudsman’s inquiry.

An investigative report published by BalkanInsight last fall, which looked into Thorn’s lobbying and drew on comms between the Commission and Thorn that its journalists were able to obtain, questioned how much influence commercial child safety tech makers, which stand to cash in on laws mandating message scanning, have acquired over EU policymaking.

“After seven months of communication concerning access to documents and the intervention of the European ombudsman, in early September the Commission finally released a series of email exchanges between Johansson’s Directorate-General for Migration and Home Affairs and Thorn,” its journalists reported. “The emails reveal a continuous and close working relationship between the two sides in the months following the roll out of the CSAM proposal, with the Commission repeatedly facilitating Thorn’s access to crucial decision-making venues attended by ministers and representatives of EU member states.”

The EU commissioner spearheading the CSAM-scanning proposal, home affairs commissioner Ylva Johansson, has repeatedly rejected claims that she allowed industry lobbyists to influence it.

Follow-up reporting by BalkanInsight last year, citing minutes released under freedom of information rules, found that Europol officials had pushed, in a meeting with Commission staff, for unfiltered access to data that would be obtained under the CSAM-scanning proposal, and for the scanning systems to be used to detect other types of crime, not just child sexual abuse.

Critics of the controversial EU CSAM-scanning proposal have long warned that once surveillance tech is embedded into private messaging infrastructure there will be pressure from law enforcement agencies to expand the scope of what’s being scanned for.