The Wikimedia Foundation, the not-for-profit organization behind Wikipedia, has strongly condemned the recent right to be forgotten (rtbf) ruling in Europe, warning that the requirement to allow private individuals to request the de-indexing of links from search results associated with their name will have “critical repercussions” for its online crowdsourced encyclopedia.
The Foundation also stated its intention to oppose what it dubbed the “censorship of truthful information” stemming from the European Court of Justice ruling — on the grounds that it threatens the organization’s mission to provide ‘free access to the sum of all human knowledge’. It said it will therefore be posting notices about indefinite removals of links to Wikipedia articles when it is made aware of them.
Speaking at a press conference in London this morning, at which the Foundation was also launching its first Transparency Report, Wikipedia founder Jimmy Wales, Wikimedia CEO Lila Tretikov and the Foundation’s General Counsel Geoff Brigham (pictured above left) lined up to condemn the rtbf as compromising human rights and freedom of expression.
“What drives Wikipedians all over the world are commitments to our core values, including transparency, privacy and freedom of expression,” said Wales. “The Wikimedia projects, including Wikipedia, are founded on the belief that everyone everywhere should be able to have access to the sum of all knowledge. However, this is only possible if people can contribute and participate in these projects without reservation — this means their right to create content, including controversial content, should be protected.
“People should feel secure that their curiosity and contributions are not subject to unreasonable government requests for their account histories, they should feel confident that the knowledge they receive is complete, truthful and uncensored.”
The Wikimedia Transparency Report reveals the organization received 56 requests for user data (i.e. details about contributors or the identity of individuals making particular edits to Wikimedia content) — from governments, corporations and individuals — between July 2012 and June 2014. It granted 14% of those requests, although it also noted that it does not necessarily hold much information on contributors — with no requirement that contributors provide a name or email address, for instance.
The organization also received 304 requests for content alteration and takedown over the same period, granting none of them. However, it also received 58 takedown notices under the Digital Millennium Copyright Act, granting 41% of those.
The Foundation also revealed that in the last week it has received five notices from Google that the search engine has de-indexed a Wikipedia article in a search for a private individual’s name, under the European rtbf ruling. The notices cover more than 50 links directing readers to Wikipedia sites.
As part of its implementation of the ruling, Google has been informing publishers when it is de-indexing their content under the rtbf ruling — leading to a situation where news organisations have written fresh stories relating to de-indexed content, putting it back into the public domain (and having the opposite effect to the obscurity being sought under the rtbf ruling). It’s not clear whether other search engines are informing publishers in the same way — although it’s worth noting Google has massively dominant search market share in Europe, with circa 90% of the market.
Brigham did not go into details of the reasons underlying the rtbf requests the Wikimedia Foundation has been notified about by Google — noting that it is not party to those details — but he said the five instances cover “a variety of different kinds of articles”, including an article “about an individual who is allegedly serving time in prison” and an “article about a criminal gang in Italy”.
“Demands that we erase content can be a direct threat to our mission,” added Tretikov. “Our Transparency Report explains how we fight, how we defend against that. We oppose censorship… The removal of links from search results [under the rtbf ruling]… has compromised the public’s right to information and freedom of expression. Links, including those to Wikipedia itself, may now be quietly, silently deleted with no transparency, no notice, no judicial review and no appeals process.”
“Some search engines are giving proper notice, and some are not,” she added. “We find this type of compelled censorship unacceptable. But we find the lack of disclosure unforgivable. This is not a tenable future. We cannot build the sum of all human knowledge without the world’s truthful sources, based on pre-edited histories. The ability of editors and contributors to find, deliver and improve the content of Wikimedia projects heavily depends on having search engines that provide the most accurate, relevant and truthful data.”
At the press conference, Wales was asked whether his role on Google’s advisory council on the rtbf signified a conflict of interests. He claimed it does not.
“I’m on an independent advisory board. Google has put together a group of experts, including some former privacy regulators, academics, me. We’re all volunteers, so we don’t work for Google and are not beholden to Google. We’re going to be advising Google and we’re also going to be advising the European Parliament on some suggested changes to the law,” he said.
“There is no conflict. When Google asked me to join the panel they did so because I had already come out personally very strongly against this decision and I think it’s really important that when we have an advisory group coming together we need to hear from privacy regulators and advocates on that side, we need to hear from publishers, so we have a person on the panel from a newspaper, and we need to hear from the Internet community — from people like me who are speaking on behalf of a community like the Wikipedia community.”
The Foundation said its stance is generally supportive of privacy but not as conceived by the rtbf — with Tretikov arguing that the law in its current form is not “implementable”.
“There are some very difficult questions which are coming up around privacy, around information that’s leaked, around harassment. These kinds of things are serious social problems — this approach to trying to deal with those doesn’t strike me as particularly fruitful,” added Wales.
“We’re talking about the censorship of links… to a legally published newspaper article about court proceedings [referring to the original Spanish case which triggered the ECJ ruling]. This is not harassment, this is not somebody stole your credit card data, this is not that sort of thing. However we may deal with those other issues, and I do think there are very difficult and challenging problems there, we shouldn’t start trying to deal with those by censoring Wikipedia and newspapers.”
Asked about a specific example where a private individual’s views have been misrepresented online, and are now inextricably associated with a Google search for their name, Wales suggested the solution for that individual is to fight “bad speech with more speech”.
“If people have accused him of things that are untrue that’s very different… that’s libel. And I do think people should have straightforward routes to deal with libel in society. That’s a whole tricky issue of course, but it’s a very different issue,” he said. “In terms of practical recommendations, I think what people should do is the best answer to bad speech is more speech. I think he should post a page detailing what his views actually are and how he’s been misrepresented there. I think that’s generally good advice.”
Wales went on to add that, in his view, it is the publishers of inaccurate information themselves that have a “moral requirement” to deal with problems associated with the dissemination of that misinformation.
On the question of the hierarchy of information created by search engines when they deliver their ordered sets of results — which foregrounds certain information and therefore has the potential to present a skewed view of a private individual — Wales said that’s an “editorial problem” for search engines to rectify, not something for the law to tackle.
“If Google is returning search results that are outdated when there’s more current information I think that’s an editorial problem for Google that they should fix and they work very hard to do that,” he said.
The Foundation was also challenged on Wikipedia’s practice of temporarily locking certain articles to prevent edits when a topic is attracting intense attention — as is currently the case with a Wikipedia article about the 2014 Gaza-Israel conflict, for instance. Wales said the Foundation does not like doing this, but that a temporary block on edits is sometimes required.
“Entries like that are locked temporarily, from time to time, when there’s excessive vandalism or an argument has broken out that’s become too emotional and too personal,” he said. “In cases like this we really don’t like locking anything — we think that Wikipedia should be an open, public space for thoughtful dialogue, for the gathering of reliable sources. Wikipedia should not be a battleground where people are trying to see who can win or lose a certain conflict.”