Leaked Google Data Makes Company More Transparent Than It Wants To Be

Data related to Google’s “Right to be forgotten” (RTBF) removal requests was just extracted from the source code of the company’s very own transparency report.

The Guardian reports that “less than 5% of nearly 220,000 individual requests made to Google to selectively remove links to online information concern criminals, politicians and high-profile public figures.”

What this means is that the vast majority of requests (95%) are indeed coming from regular members of the public looking to remove pieces of personal data from the web, rather than from criminals and corrupt politicians, as a common narrative has suggested. In several countries (France and Germany among them), the share of requests related to private personal information hovered above 98%.

The released data also shows the percentage breakdown across the different categories of requests. For instance, while the “granted” rate for private personal requests stands at 48%, the rate of approval is much lower (18%) for requests regarding serious crimes. In general, Google grants about half of all the requests that it processes.

After releasing its transparency report last October detailing basic information on the number of RTBF requests received and granted, Google faced backlash for not revealing more details on the nature of the requests being approved and rejected.

In May, a group of 80 internet scholars penned an open letter to Google urging the company to release more data:

Beyond anecdote, we know very little about what kind and quantity of information is being delisted from search results, what sources are being delisted and on what scale, what kinds of requests fail and in what proportion, and what are Google’s guidelines in striking the balance between individual privacy and freedom of expression interests.

As more countries continue to adopt RTBF laws, transparency from Google will only grow more essential.

The data collected by The Guardian has since been removed from the source code of Google’s transparency report. Google was not immediately available for comment.

Update:

In a statement, a Google spokesperson told TechCrunch:

“We’ve always aimed to be as transparent as possible about our right to be forgotten decisions.  The data the Guardian found in our Transparency Report’s source code does of course come from Google, but it was part of a test to figure out how we could best categorise requests.  We discontinued that test in March because the data was not reliable enough for publication.  We are however currently working on ways to improve our transparency reporting.”