The U.K.’s Information Commissioner’s Office has criticized the draft Investigatory Powers bill, warning about the risks of requiring communications service providers to weaken encryption, and asserting that no clear case has been made for why the state should require data on all its citizens to be retained for a full year.
The IP bill is the government’s attempt to update and extend the surveillance capabilities of the security and intelligence agencies — replacing the long-in-the-tooth patchwork of legislation currently used to authorize intercepts with a clearer legal framework. The government is aiming to have a new law passed by the end of this year, when the emergency surveillance legislation DRIPA expires.
Giving oral evidence last week to the joint select committee currently examining the bill, information commissioner Christopher Graham was asked whether the bill gets the balance right between privacy and security. “It’s very difficult to judge whether the bill gets the balance right,” he said. “Because the one thing we don’t have in the voluminous material that has been put before you is any real evidence, as opposed to the occasional anecdote, for the utility of the information that’s sought.
“The bill proposes that data can be required to be retained for 12 months but there’s no particular explanation of why 12 months — rather than six months or 18 months — is desirable because there is no indication of the use that such information has been put to over many months and years in the normal way of dealing with serious crime and terrorism.”
Parliament needs to recognize the various data protection rights afforded to individuals, he continued, and be wary of “signing off a blank cheque” with regard to the security services’ appetite for information — arguing instead that there should be a system of ongoing proportionality reviews, once the legislation has passed, to ensure data protection obligations continue to be met.
He went so far as to suggest that a rolling sunset clause or yearly renewal requirement be embedded within the legislation to enforce proportionality — and avoid the risks associated with data retention overreach. “Parliament renewed the Prevention of Terrorism Act year by year. I can’t see why we shouldn’t have a similar arrangement for something so fundamental as this bill,” he argued.
“Data protection is a fundamental right, under the charter of fundamental rights of the European Union, so I don’t think it’s a question of just signing off a blank cheque,” he added. “It is asserted that this information is very important for the detection of crime, and the prevention of terrorism. I think it would be sensible and wise for parliament to review, from time to time, how it’s working in practice. What use is being made of this great mass of data that will be required to be retained by communications service providers?”
Graham warned specifically of the “huge risk” of vast caches of retained information being exploited by “bad actors”, or otherwise leaking out because of the security challenges created by an ongoing requirement to store so much personal data.
Asked by the committee what sort of sanctions could be put in place to mitigate the risk of misuse of retained data by “rogue” individuals, such as within police forces or other organizations storing the data, Graham suggested parliament could enact a more deterrent-based penalty — such as a prison sentence, rather than the fine-only regime afforded by the current legislation in this area.
But he again emphasized that retaining too much data in itself generates risk — so the best form of mitigation is to retain less data in the first place. “It merely underlines the point that when you require communications service providers to retain a massive collection of data for a year then it creates a risk. It’s there. People may do stupid things with it,” said Graham.
“[It’s] a whole pile of stuff which can get lost, inappropriately accessed from the criminal point of view and so on — and it’s because that risk is created by the legislation then you’ve got to have some very powerful safeguards to make sure the legislation is regularly reviewed, that it is being used for what it’s meant to be used for.”
On encryption, in its written evidence to the committee, the ICO also warns that “notices requiring the removal of electronic protection should not be permitted to lead to the removal or weakening of encryption”, given the risk to “the security of personal data generally”.
It specifically flags up clause 189 in the draft bill, noting that this permits the Secretary of State to impose obligations “relating to the removal of electronic protection applied by a relevant operator to any communications or data”.
“This could be a far reaching measure with detrimental consequences to the security of data and safeguards which are essential to the public’s continued confidence in the handling and use of their personal information,” the ICO writes, adding: “The practical application of such a requirement is unclear in the draft bill and the accompanying Guide to Powers and Safeguards does not provide specific details to enable the full extent of the provision to be assessed.”
Last month Apple also raised concerns about the IP bill’s implications for encryption, writing in its own submission to the committee that: “The best minds in the world cannot rewrite the laws of mathematics. Any process that weakens the mathematical models that protect user data will by extension weaken the protection. And recent history is littered with cases of attackers successfully implementing exploits that nearly all experts either remained unaware of or viewed as merely theoretical.”
Five other Internet companies — Google, Microsoft, Twitter, Facebook and Yahoo — have also raised concerns about the implications of the proposed legislation for encryption, calling in their own joint written submission for more clarity in the language used in the bill.
“We reject any proposals that would require companies to deliberately weaken the security of their products via backdoors, forced decryption, or any other means. We therefore have concerns that the Bill includes ‘obligations relating to the removal of electronic protection applied by a relevant operator to any communication or data’ and that these are explicitly intended to apply extraterritorially with limited protections for overseas providers,” they write.
“We appreciate the statements in the Bill and by the Home Secretary that the Bill is not intended to weaken the use of encryption, and suggest that the Bill expressly state that nothing in the Bill should be construed to require a company to weaken or defeat its security measures.”
The government has been accused of putting out mixed messages when it comes to its legislative intentions around encryption, with the Prime Minister last year appearing to suggest he wanted to ban encryption, before apparently backpedaling. The Home Secretary also subsequently appeared to make comments in support of encryption, yet the wording of the legislation remains vague enough that concerns about its implications for encrypted services persist.
For example, many speakers at an event held to discuss various aspects of the IP bill last week expressed similar worries about vague language in the draft legislation leaving too much “open to interpretation”.
The joint select committee is continuing to take evidence from witnesses, and will hear from Home Secretary Theresa May tomorrow. It’s expected to file a report with recommendations by the middle of next month — suggestions that will doubtless feed into the coming months of debate as MPs and peers in the Commons and the Lords chew over the bill’s detail and try to achieve that sought-for balance between security and privacy.