Security

Should tech giants slam the encryption door on the government?

Comment

Image Credits: Bryan Thomas / Getty Images

Reuters reported yesterday, citing six sources familiar with the matter, that the FBI pressured Apple into dropping a feature that would allow users to encrypt iPhone backups stored in Apple’s cloud.

The decision to abandon plans to end-to-end encrypt iCloud-stored backups was reportedly made about two years ago. The feature, if rolled out, would have locked out anyone other than the device owner — including Apple — from accessing a user’s data. In doing so, it would have made it more difficult for law enforcement and federal investigators, warrant in hand, to access a user’s device data stored on Apple’s servers.

Reuters said it “could not determine exactly” why the decision to drop the feature was made, but one source said “legal killed it,” referring to the company’s lawyers. One of the reasons that Apple’s lawyers gave, per the report, was a fear that the government would use the move as “an excuse for new legislation against encryption.”

It’s the latest in a back and forth between Apple and the FBI since a high-profile legal battle four years ago, which saw the FBI use a little-known 200-year-old law to demand the company create a backdoor to access the iPhone belonging to the San Bernardino shooter. The FBI’s case against Apple never made it to court, after the bureau found hackers who were able to break into the device, leaving in legal limbo the question of whether the government can compel a company to backdoor its own products.

The case has prompted debate — again — over whether companies should build technologies that lock law enforcement out of data, even when investigators have a warrant.

TechCrunch managing editor Danny Crichton says companies shouldn’t make it impossible for law enforcement to access their customers’ data with a warrant. Security editor Zack Whittaker disagrees, saying it’s entirely within their rights to protect customer data.


Zack: Tech companies are within their rights — both legally and morally — to protect their customers’ data from any and all adversaries, using any legal methods at their disposal.

Apple is a great example of a company that doesn’t just sell products or services, but one that tries to sell you trust — trust in a device’s ability to keep your data private. Without that trust, companies cannot profit. Companies have found end-to-end encryption is one of the best, most efficient and most practical ways of ensuring that their customers’ data is secured from anyone, including the tech companies themselves, so that nobody other than the owner can access it. That means even if hackers break into Apple’s servers and steal a user’s data, all they have is an indecipherable cache of data.
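Apple’s actual designs rely on vetted ciphers such as AES and careful key management, but the core property described here can be sketched with a toy one-time pad in Python (every name below is illustrative, not Apple’s implementation): so long as the randomly generated key never leaves the owner’s device, the ciphertext the server stores is gibberish to everyone else, including the company holding it.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the matching key byte.
    Applying the same operation twice with the same key
    recovers the original bytes."""
    return bytes(d ^ k for d, k in zip(data, key))

# The "device" generates a random key that never leaves it.
backup = b"private photos and messages"
key = secrets.token_bytes(len(backup))

ciphertext = xor_cipher(backup, key)  # this is all the server ever sees

assert ciphertext != backup                    # unreadable without the key
assert xor_cipher(ciphertext, key) == backup   # only the key holder decrypts
```

Real end-to-end systems add authentication, integrity checks and key recovery on top of this; the toy’s only point is that possession of the server-side ciphertext alone reveals nothing.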

But the leaks of the last decade, which revealed the government’s vast surveillance access to customer data, prompted tech companies to start seeing the government as an adversary — one that will use any and all means to acquire the data it wants. Companies are taking the utilitarian approach of giving their customers as much security as they can. That is how you build trust — by putting it directly in the hands of the customer.


Danny: Zack is right that trust is critical between technology companies and users — certainly the plight of Facebook over the past few years bears that out. But there also has to be two-way trust between people and their government, a goal thwarted by end-to-end encryption.

No one wants the government poking its head into our private data willy-nilly, scanning our interior lives seeking out future crimes à la “Minority Report.” But as citizens, we also want to empower our government with certain tools to make us safer — including mechanisms such as the use of search warrants to legally violate a citizen’s privacy with the authorization of the judiciary to investigate and prosecute suspected crimes.

In the past, the physical nature of most data made such checks and balances easy to enforce. You could store your private written notebooks in a physical safe, and if a warrant was issued by an appropriate judge, the police could track down that safe and drill it open if necessary to access the contents inside. Police had no way to scan all the private safes in the country, and so users had privacy with their data, while the police had reasonable access to seize that data when certain circumstances authorized them to do so.

Today, end-to-end encryption completely undermines this necessary judicial process. A warrant may be issued for data stored on, say, iCloud, but without a suspect’s cooperation, the police and authorities may have no recourse to seize data they are legally allowed to acquire as part of their investigation. And it’s not just law enforcement — the evidentiary discovery process at the start of any trial could similarly be undermined. A judiciary without access to evidence will be neither fair nor just.

I don’t like the sound or idea of a backdoor any more than Zack does, not least because the technical mechanisms of a backdoor seem ripe for hacking and other nefarious activities. However, completely closing off legitimate access to law enforcement could make entire categories of crime almost impossible to prosecute. We have to find a way to get the best of both worlds.


Zack: Yes, I want the government to be able to find, investigate and prosecute criminals. But not at the expense of our privacy or by violating our rights.

The burden to prosecute an individual is on the government, and the Fourth Amendment is clear. Police need a warrant, based on probable cause, to search and seize your property. But a warrant is only an authority to access and obtain information pursuant to a crime. It’s not a golden key that says the data has to be in a readable format.

If it’s really as difficult for the feds to gain access to encrypted phones as they say it is, the government needs to show us evidence that stands up to scrutiny. So far it has shown it can’t act in good faith on this issue, nor can it be trusted. The government has for years vastly inflated the number of encrypted devices it said it couldn’t access. It has also claimed it needs device makers, like Apple, to help unlock devices when it has long had the means and technologies capable of breaking into encrypted devices. And it has refused to say how many investigations are actively harmed by encrypted devices that can’t be unlocked, effectively giving watchdogs no tangible way to measure how big a problem the feds claim it is.

But above all else, the government has repeatedly failed to rebut a core criticism from security engineers and cryptography experts: that a “backdoor” designed only for law enforcement access would inevitably be misused, lost or stolen, and exploited by nefarious actors like hackers.

Encryption is already out there; there’s no way the encryption genie will ever float its way back into the bottle. If the government doesn’t like the law, it has to come up with a convincing argument to change it.


Danny: I go back to both of our comments around trust — ultimately, we want to design systems built on that foundation. That means knowing that our data is not being used for ulterior, pecuniary interests by tech companies, that our data isn’t being ingested into a massive government tracking database for broad-based population surveillance and that we ultimately have reasonable control over our own privacy.

I agree with you that a warrant simply says that the authorities have access to what’s “there.” In my physical safe example, if a suspect has written their notes in a coded language and stored them in the safe and the police drill it open and extract the papers, they are no more likely to read those notes than they are the encrypted binary files coming out of an end-to-end encrypted iCloud.

That said, technology does allow scaling up that “coded language” to everyone, all the time. Few people consistently encoded their notes 30 years ago, but now your phone could potentially do that on your behalf, every single time. Every single investigation — again, with a reasonable search warrant — could potentially be a multi-step process just to get basic information that we otherwise would want law enforcement to know in the normal and expected course of their duties.

What I’m calling for, then, is a deeper and more pragmatic conversation about how to protect the core of our system of justice. How do we ensure privacy from unlawful search and seizure, while also allowing police access to data (and the meaning of that data, i.e. unencrypted data) stored on servers under a legal warrant? Are there technological solutions, short of a literal backdoor prone to malicious hacking, that could balance these two competing interests? In my mind, we can’t have, and ultimately don’t want, a system where fair justice is impossible to obtain.

Now, as an aside on the comments about data: the reality is that all justice-related data is complicated. I agree these data points would be nice to have and would help make the argument, but at the same time, the U.S. has a decentralized justice system with thousands of overlapping jurisdictions. This is a country that can barely count the number of murders, let alone other crimes, let alone the evidentiary standards related to smartphones involved in crimes. We are just never going to have this data, and so in my view, insisting that we wait until we do is unfair.


Zack: The view from the security side is that there’s no flexibility. The technological solutions you have in mind have been considered for decades — even longer. The idea that the government can dip into your data whenever it wants is no different from a backdoor. Even key escrow, where a third party holds onto the encryption keys for safekeeping, is no different from a backdoor. There is no such thing as a secure backdoor. Something has to give: either the government stands down, or ordinary privacy-minded folk give up their rights.

The government says it needs to catch pedophiles and serious criminals, like terrorists and murderers. But there’s no evidence to show that pedophiles, criminals and terrorists use encryption any more than the average person.

We have as much right to be safe in our own homes, towns and cities as we do to privacy. But it’s not a trade-off. Everyone shouldn’t have to give up privacy because of a few bad people.

Encryption is vital to our individual security and our collective national security. Encryption can’t be banned or outlawed. Like the many who have debated these same points before us, we may just have to agree to disagree.
