Poland opens privacy probe of ChatGPT following GDPR complaint

OpenAI is facing another investigation into whether its generative AI chatbot, ChatGPT, complies with European Union privacy laws.

Last month a complaint was filed against ChatGPT and OpenAI in Poland, accusing the company of a string of breaches of the EU’s General Data Protection Regulation (GDPR). Yesterday the Polish authority took the unusual step of making a public announcement to confirm it has opened an investigation.

“The Office for Personal Data Protection [UODO] is investigating a complaint about ChatGPT, in which the complainant accuses the tool’s creator, OpenAI, of, among other things, processing data in an unlawful, unreliable manner, and the rules under which this is done are opaque,” the UODO wrote in a press release [translated from Polish to English using DeepL].

The authority said it’s anticipating a “difficult” investigation — noting OpenAI is located outside the EU and flagging the novelty of the generative AI chatbot technology whose compliance it will be examining.

“The case concerns the violation of many provisions of the protection of personal data, so we will ask OpenAI to answer a number of questions in order to thoroughly conduct the administrative proceedings,” said Jan Nowak, president of the UODO, in a statement.

Deputy president, Jakub Groszkowski, added a warning to the authority’s press release — writing that new technologies do not operate outside the legal framework and must respect the GDPR. He said the complaint contains allegations that raise doubts about OpenAI’s systemic approach to European data protection principles, adding that the authority would “clarify these doubts, in particular against the background of the fundamental principle of privacy by design contained in the GDPR”.

The complaint, which was filed by local privacy and security researcher Lukasz Olejnik, accuses OpenAI of a string of breaches of the pan-EU regulation — spanning lawful basis, transparency, fairness, data access rights, and privacy by design.

It focuses on OpenAI’s response to Olejnik’s request to correct incorrect personal data in a biography ChatGPT generated about him, something OpenAI told him it was unable to do. He also accuses the AI giant of failing to properly respond to his subject access request, and of providing evasive, misleading and internally contradictory answers when he sought to exercise his legal rights to data access.

The tech underlying ChatGPT is a so-called large language model (LLM): a type of generative AI model that’s trained on masses of natural language data so it can respond in a human-like manner. Given the general-purpose utility of the tool, it has evidently been trained on all sorts of information so it can respond to different questions and requests, which in many cases means it has been fed data about living people.

OpenAI’s scraping of the public Internet for training data, without people’s knowledge or consent, is one of the big factors that has landed ChatGPT in regulatory hot water in the EU. Its apparent inability to articulate exactly how it processes personal data, or to correct mistakes when its AI “hallucinates” and produces false information about named individuals, is another.

The bloc regulates how personal data is processed, requiring that a processor have a lawful basis to collect and use people’s information. Processors must also meet transparency and fairness requirements. A suite of data access rights is also afforded to people in the EU, meaning individuals have (among other things) the right to ask for incorrect data about them to be rectified.

Olejnik’s complaint tests OpenAI’s GDPR compliance across a number of those dimensions. So any enforcement could be significant in shaping how generative AI develops.

Reacting to the UODO’s confirmation it’s investigating the ChatGPT complaint, Olejnik told TechCrunch: “Focusing on privacy by design/data protection by design is absolutely critical and I expected this to be the main aspect. So this sounds reasonable. It would concern the design and deployment aspects of LLM systems.”

He previously described the experience of trying to get answers from OpenAI about its processing of his information as feeling like Josef K, in Kafka’s book “The Trial.” “If this may be the Josef K. moment for AI/LLM, let’s hope that it may shed light on the processes involved,” he added now.

The relative speed with which the Polish authority is moving in response to the complaint, as well as its openness about the investigation, does look notable.

It adds to the growing regulatory issues OpenAI is facing in the European Union. The Polish investigation follows an intervention by Italy’s DPA earlier this year, which led to a temporary suspension of ChatGPT in the country. The Garante’s scrutiny continues and is also looking into GDPR compliance concerns attached to factors like lawful basis and data access rights.

Elsewhere, Spain’s DPA has opened a probe, while a taskforce set up via the European Data Protection Board earlier this year is looking at how data protection authorities should respond to the AI chatbot technology, with the goal of finding some consensus among the bloc’s privacy watchdogs on how to regulate such novel tech.

The taskforce does not supplant investigations by individual authorities. But, in the future, it may lead to some harmonization in how DPAs approach regulating cutting edge AI. That said, divergence is also possible if there are strong and varied views among DPAs. And it remains to be seen what further enforcement actions the bloc’s watchdogs could take on tools like ChatGPT. (Or, indeed, how quickly they may act.)

In the UODO’s press release — which nods to the existence of the taskforce — its president says the authority is taking the ChatGPT investigation “very seriously”. He also notes the complaint’s allegations are not the first doubts vis-a-vis ChatGPT’s compliance with European data protection and privacy rules.

Discussing the authority’s openness and pace, Maciej Gawronski of law firm GP Partners, which is representing Olejnik for the complaint, told TechCrunch: “UODO is becoming more and more vocal about privacy, data protection, technology and human rights. So, I think, our complaint creates an opportunity for [it] to work on reconciling digital and societal progress with individual agency and human rights.

“Mind that Poland is a very advanced country regarding IT. I would expect UODO to be very reasonable in their approach and proceedings. Of course, as long as OpenAI remains open for discussion.”

Asked if he’s expecting a quick decision on the complaint, Gawronski added: “The authority is monitoring technology advancements pretty closely. I am at UODO’s conference on new technologies at the moment. UODO has already been approached re AI by various actors. However, I do not expect a fast decision. Nor is it my intention to conclude the proceedings prematurely. I would prefer to have an honest and insightful discussion with OpenAI on what, when, how, and how much, regarding ChatGPT’s GDPR compliance, and in particular how to satisfy rights of the data subject.”

OpenAI was contacted for comment on the Polish DPA’s investigation but did not respond.

The AI giant is not sitting still in response to an increasingly complex regulatory picture in the EU. It recently announced it is opening an office in Dublin, Ireland, likely with an eye on streamlining its regulatory position on data protection by funnelling any GDPR complaints via Ireland.

However, for now, the US company does not have a “main establishment” in any EU Member State (including Ireland) for GDPR purposes, since decisions affecting local users continue to be taken at its US HQ in California, and the Dublin office is so far just a tiny satellite. This means data protection authorities across the bloc remain competent to investigate concerns about ChatGPT that arise on their patch, so more investigations could follow.

Complaints that predate any future change in OpenAI’s main establishment status could also still be filed anywhere in the EU.