Wikipedia founder Jimmy Wales has spoken out against a controversial ruling by the European Court of Justice that requires Google to consider information removal requests from individuals whose data its search engine has indexed.
In comments emailed to TechCrunch, Wales described the ruling as censorship of knowledge — pure and simple. “In the case of truthful, non-defamatory information obtained legally, I think there is no possibility of any defensible ‘right’ to censor what other people are saying,” he said.
He also warned the ruling may make it more difficult to make “real progress on privacy issues.”
“We have a typical situation where incompetent politicians have written well-meaning but incoherent legislation without due consideration for human rights and technical matters,” Wales added.
The ruling by the ECJ last month required Google to quickly put in place a process for fielding requests from people who believe that information it has indexed about them is outdated or irrelevant and should be removed. The ruling enforces European data protection legislation dating back to 1995.
Google’s webform where people can submit requests for the removal of information went up at the end of May. As of last Monday, the company said it had received more than 41,000 requests, with some 12,000 filed on day one. There is evidently plenty of appetite for this kind of removal mechanism.
Google said it will be weighing each individual’s right to be forgotten against the public’s right to know whatever it is they want deleted before taking any action on a request.
In an interview with the FT last month, Google’s Larry Page revealed a breakdown of initial requests for data removal in the two weeks after the ECJ ruling (and prior to the webform being put in place). Google said 31 percent of requests received at that point related to fraud/scam; 20 percent to arrests/convictions for violent or serious crime; 12 percent to child pornography arrests; 5 percent to government or the police; 2 percent to celebrities; and an additional 30 percent were referred to by Google as “other.”
If that proportion has held up, roughly 30 percent of the 41,000 requests — some 12,300 — would fall into the catch-all “other” category, which does not encompass fraud, serious/violent crime, government or celebrities.
In addition to launching a formal process to comply with the ECJ ruling, Google has announced an advisory committee, drawing in outside expertise including free-speech advocates, lawyers and ethics professors to work with it on weighing the issues. Wales has agreed to be a member of this committee.
Other non-Googlers on the committee named so far are: Frank La Rue (UN Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression); Peggy Valcke (Director, University of Leuven law school); Jose Luis Piñar (former director of the Spanish data protection authority, now an academic); and Luciano Floridi (information ethics philosopher at the Oxford Internet Institute).
TechCrunch asked Wales what the committee’s role would be, and more broadly for his views on the right to be forgotten. See below for our Q&A.
Back in 2005, Wales admitted to editing his own Wikipedia bio — a practice frowned upon by the site. And, it must be said, a little eyebrow-raising given his outspoken views on the imperative of free speech online. Wales passed up the opportunity to explain how his own views about personal revisionist history might have evolved since 2005. But he did agree to answer our other questions about the committee’s role and his personal views on the ECJ ruling.
He has previously dubbed the ruling “ridiculous” and “very bizarre.” In comments to TechCrunch he went further, calling it “a deep injustice and terrible danger in European law.”
TC: When did Google reach out to ask you to join its advisory committee (or did you reach out to Google)?
Wales: They called me last Wednesday [May 28] I believe it was.
TC: What will your role — and the role of the committee — be specifically? Has the committee started work already?
Wales: There has been some misreporting about this under the assumption that we’ll be the ones deciding what to censor in Google. That’s wrong.
We will not be making decisions on individual requests.
The remit of the committee is to hold public hearings and issue recommendations — not just to Google but to legislators and the public. We have a typical situation where incompetent politicians have written well-meaning but incoherent legislation without due consideration for human rights and technical matters.
TC: What has Google said to you/the committee about how it intends to weigh up requests at this point?
Wales: We haven’t even had our first meeting — we have not begun that dialog yet.
TC: Will more members of the committee be added over time, or is the current line-up intended as the full complement?
Wales: Best to ask Google — I don’t know!
TC: What’s your response to criticism of the balance of views represented on the committee? Has Google too closely selected individuals whose views align with its own? Could there be more balance?
Wales: I only agreed to join on the grounds that we will have a balance of views. Speaking only for myself, though, I don’t think the view that this decision is correct is in any way consistent with any serious person’s understanding of freedom of expression. So unless we are willing to seek out authoritarian dictators to join the panel, it is extremely unlikely that any member of the committee will say that this ruling is correct.
TC: What is your view on the European Court of Justice’s “right to be forgotten” ruling, and — more broadly — on the individual’s right to privacy online? Should online users have any right to request that specific data about them be removed from the public domain?
Wales: I think the decision will have no impact on people’s right to privacy, because I don’t regard truthful information in court records published by court order in a newspaper to be private information. If anything, the decision is likely to simply muddle the interesting philosophical questions and make it more difficult to make real progress on privacy issues.
In the case of truthful, non-defamatory information obtained legally, I think there is no possibility of any defensible “right” to censor what other people are saying.
It is important to avoid language like “data” because we aren’t talking about “data” — we are talking about the suppression of knowledge.
TC: Editing information is a core part of Wikipedia. Why, in your view, is the ability to erase data justified and important there but problematic on an indexer/aggregator like Google? Wikipedia can, after all, also act as an information aggregator like Google, and people do use Google to find something out in the same manner that they might consult Wikipedia…
Wales: I don’t view these two cases as different philosophically at all! You do not have a right to use the law to prevent Wikipedia editors from writing truthful information, nor do you have a right to use the law to prevent Google from publishing truthful information. Wikipedia can and should work hard to do a good job, just as Google can and should work hard to do a good job.
This is key to understanding the issue: no one is (yet) proposing censoring true information directly from Wikipedia. But they are proposing censoring links to true information directly from Google.
TC: What do you think of the initial webform process Google has created to comply with the right to be forgotten ruling? Critics suggest it is not complying with the spirit of “right to be forgotten” by flagging up the fact that information is being removed. And that it is making the user do all the legwork by requiring they submit all the URLs they wish removed, when Google’s own algorithms could actually shortcut that. Is Google being fair?
Wales: Those two objections are ridiculous. There is no “right to be forgotten” — there is apparently a “right” in Europe to censor some information that you don’t like.
And the idea that Google’s own algorithms could see into someone’s mind to determine what they think is “irrelevant or outdated” is magical thinking. Google is not magical. If you don’t tell them how you want to censor them, how are they to guess?
TC: In your view, what more generally could Google be doing to balance its business processes against considerations about the individual privacy of users?
Wales: I’m not very interested in that aspect of things in this context. I’m only interested here in setting right a deep injustice and terrible danger in European law.
TC: The European Commission is currently reforming its data protection legislation. What would the ideal outcome of that process be, in your view?
Wales: A part of the outcome should be the very strong implementation of a right to free speech in Europe — essentially the language of the First Amendment in the U.S.