UK fines Clearview just under $10M for privacy breaches

The U.K.’s data protection watchdog has confirmed a penalty for the controversial facial recognition company, Clearview AI — announcing a fine of just over £7.5 million today for a string of breaches of local privacy laws.

The watchdog has also issued an enforcement notice, ordering Clearview to stop obtaining and using the personal data of U.K. residents that is publicly available on the internet; and telling it to delete the information of U.K. residents from its systems.

The U.S. company has amassed a database of more than 20 billion facial images by scraping photos off the public internet, including from social media services, which it uses to power an AI-based identity-matching service it sells to entities such as law enforcement. The problem is that Clearview has never asked individuals for permission to use their selfies this way. And it has been found in breach of privacy laws in many countries.

In a statement accompanying today’s enforcement, the U.K.’s information commissioner, John Edwards, said:

Clearview AI Inc has collected multiple images of people all over the world, including in the U.K., from a variety of websites and social media platforms, creating a database with more than 20 billion images. The company not only enables identification of those people, but effectively monitors their behaviour and offers it as a commercial service. That is unacceptable. That is why we have acted to protect people in the U.K. by both fining the company and issuing an enforcement notice.

People expect that their personal information will be respected, regardless of where in the world their data is being used. That is why global companies need international enforcement. Working with colleagues around the world helped us take this action and protect people from such intrusive activity.

This international cooperation is essential to protect people’s privacy rights in 2022. That means working with regulators in other countries, as we did in this case with our Australian colleagues. And it means working with regulators in Europe, which is why I am meeting them in Brussels this week so we can collaborate to tackle global privacy harms.

“Given the high number of UK internet and social media users, Clearview AI’s database is likely to include a substantial amount of data from UK residents, which has been gathered without their knowledge,” the U.K. watchdog also wrote in a press release.

“Although Clearview AI no longer offers its services to U.K. organisations, the company has customers in other countries, so the company is still using personal data of UK residents,” it added.

The Information Commissioner’s Office (ICO) warned Clearview it might issue a financial penalty last fall, when it also ordered the U.S.-based company to stop processing U.K. citizens’ data and delete any data it held.

It confirmed those preliminary findings in today’s formal enforcement — finding Clearview in breach of a string of legal requirements.

Specifically, the ICO said Clearview failed to have a lawful basis for collecting people's information; failed to use individuals' information in a way that is fair and transparent, given that people are not made aware of, and would not reasonably expect, their personal data to be used for the purpose Clearview uses it for; failed to have a process in place to prevent the data being retained indefinitely; and failed to meet the higher data protection standards required for biometric data (so-called 'special category data' under the EU General Data Protection Regulation and the U.K. GDPR). In a further breach, Clearview asked for additional personal information, including photos, when members of the public asked whether they were on its database, thereby impeding their data access rights. "This may have acted as a disincentive to individuals who wish to object to their data being collected and used," the ICO noted.

One thing to note is that the fine is considerably lower than the £17M+ penalty the ICO announced in its provisional order against Clearview last fall. We asked the regulator about the reduction, and it told us it may consider representations made by a company following a notice of intent to fine before deciding whether to issue a final monetary penalty notice.

The ICO also pointed to its Regulatory Action Policy — which it uses to determine the level of any financial penalties it levies.

The exact amount Clearview is fined may prove irrelevant if it refuses to pay.

International regulators have limited means to enforce privacy orders against foreign entities if they choose not to cooperate and lack a local representative an order can be enforced against.

Still, such sanctions do at least put limits on Clearview’s ability to expand internationally — as any local offices would be directly answerable to regulators in those markets.

Clearview was contacted for comment on the U.K. sanction. In a statement attributed to Lee Wolosky, partner at U.S. law firm Jenner & Block, the company said:

“While we appreciate the ICO’s desire to reduce their monetary penalty on Clearview AI, we nevertheless stand by our position that the decision to impose any fine is incorrect as a matter of law. Clearview AI is not subject to the ICO’s jurisdiction, and Clearview AI does no business in the U.K. at this time.”

Clearview also re-issued earlier remarks attributed to its CEO, Hoan Ton-That, expressing disappointment that “the UK Information Commissioner has misinterpreted my technology and intentions”.

The U.K. penalty is by no means the first international sanction for Clearview. The U.K. investigation was a joint procedure with Australia's privacy watchdog, which last year also ordered the company to stop processing citizens' data and delete any information it held. France and Canada have also sanctioned the company, while Italy's data protection regulator fined Clearview €20M in March.

On home turf, Clearview agreed earlier this month to settle a 2020 lawsuit brought by the American Civil Liberties Union, which had accused it of breaching an Illinois law (the Biometric Information Privacy Act, or BIPA) that bans use of individuals' biometric data without consent.

The terms of the settlement appear to ban Clearview from selling or giving access to its facial recognition database to private companies and individuals nationwide in the U.S., although an exception for government contractors was included (subject to a five-year ban on providing it to contractors within Illinois itself).

The settlement also requires Clearview to maintain an opt-out system allowing Illinois residents to block their likeness from its facial search results, and to end a controversial practice of providing free trials to police officers unless those individuals have approval from their departments to test the software.

However, Clearview spun the settlement as a win, suggesting it would respond by selling its algorithm to private companies in the U.S. instead of monetizing access to its database of scraped selfies.

This report was updated with information from the ICO and statements from Clearview.