CCPA won’t be enough to fix tech’s data entitlement problem

When the California Consumer Privacy Act (CCPA) rolled out on January 1st, many companies were still scrambling to become compliant with the data privacy regulation, which is estimated to cost businesses $55 billion. But even checking all of the compliance boxes isn’t enough to safeguard consumer data. The past few years of rampant breaches and data misuse have shown how quickly personal details can fall into the wrong hands. They’ve also shown how often simple user error enabled by poor data practices leads to big consequences.

The way to solve this issue isn’t solely through legislation — it’s companies taking a hard look at their behavior and processes. Laws like CCPA and GDPR help set the groundwork for change, but they don’t address the broader issue: businesses feel entitled to people’s data even when it’s not part of their core product offering and have encoded that entitlement into their processes.

Legislated and top-down calls for accountability won’t fix the problem on their own. To protect consumers, companies need to architect internal systems around data custodianship rather than data ownership. Doing so will establish processes that not only hit compliance benchmarks but make responsible data handling the default action.

Privacy compliance over true procedural change is a cop-out

The prevailing philosophy in Silicon Valley is one of data ownership, which shapes how consumers’ personal information is used. The consequences have been widely reported, from the revelations surrounding Cambridge Analytica to Uber’s 57-million-user data breach. Tech companies are losing the trust of customers, partners and governments around the world. In fact, Americans’ perception of tech companies has steadily dropped since 2015. More must be done to win it back.

Companies that rely on regulations like CCPA and GDPR to guide their data policies essentially ask someone else to draw the line for them, so they can come as close to it as possible, which leads to a “check-the-box” approach to compliance rather than a core philosophy that prioritizes the privacy expectations of their customers. If tech and security leaders instead build data policies with privacy in mind from the start, meeting government regulations becomes a byproduct rather than a scramble that consumes valuable resources.

How to take the entitlement out of data handling

Responsible, secure data handling is achievable for every company. The most important step is for businesses to go beyond the bare minimum when reevaluating their data access processes. What’s been most helpful for the companies I’ve worked with is organizing these practices around a simple idea: You can’t lose what you don’t have.

In practice, this idea is known as the Principle of Least Privilege (POLP), whereby companies give employees only the data access they need to do their jobs effectively. Here’s an example that applies to most customer-facing businesses out there: Say I’m a customer service rep and a person calls me about a problem with their account. If I operate according to the Principle of Least Privilege, the following data access rules would apply:

  1. I would only have access to that specific customer’s account information;
  2. I would only have access to the specific part of their account where the problem is happening;
  3. I would only have access until the problem is solved.
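Translated into code, the three rules above amount to a single access check. This is a minimal sketch in Python; the `SupportCase` fields and `may_access` helper are illustrative assumptions, not any particular product’s API:

```python
from dataclasses import dataclass

@dataclass
class SupportCase:
    """Hypothetical ticket linking one rep to one customer's problem."""
    rep_id: str
    customer_id: str
    account_section: str  # e.g. "billing"
    resolved: bool

def may_access(case: SupportCase, rep_id: str,
               customer_id: str, section: str) -> bool:
    """Least-privilege check mirroring the three rules above."""
    return (
        case.rep_id == rep_id                # the request comes from the assigned rep
        and case.customer_id == customer_id  # rule 1: only that customer's account
        and case.account_section == section  # rule 2: only the affected section
        and not case.resolved                # rule 3: only until the problem is solved
    )

case = SupportCase("rep-7", "cust-42", "billing", resolved=False)
print(may_access(case, "rep-7", "cust-42", "billing"))   # True
print(may_access(case, "rep-7", "cust-42", "shipping"))  # False
case.resolved = True
print(may_access(case, "rep-7", "cust-42", "billing"))   # False
```

The point is that access is denied by default and granted only when every condition holds, so forgetting a rule fails closed rather than open.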

Sounds intuitive, right? Yet, many companies — particularly those operating without the Principle of Least Privilege in place — discovered through the GDPR and CCPA compliance process that their data access controls did not work this way. This is how major breaches happen. An employee downloads an entire database — much more data than they need to perform a specific task — their laptop is compromised, and suddenly hackers can access the entire database.

POLP works because it introduces a bit of friction into the data-request process. The goal here is to make the right decision easy and the wrong decision harder, so everyone is intentional about their data use. How a company achieves this will differ based on their business model and growth stage. One option is to have only a single database with an added layer of infrastructure that grants data access through POLP rules.
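One way that added layer of infrastructure might look, sketched with a hypothetical `PolpGateway` class and a policy callback (both names are assumptions for illustration, not a real product):

```python
class PolpGateway:
    """Minimal access layer in front of a single data store: every read
    passes through a least-privilege policy check before touching data."""

    def __init__(self, db, policy):
        self.db = db          # e.g. a dict of customer_id -> record
        self.policy = policy  # callable(user_id, customer_id) -> bool

    def fetch(self, user_id, customer_id):
        if not self.policy(user_id, customer_id):
            raise PermissionError(f"{user_id} may not read {customer_id}")
        return self.db[customer_id]

# A policy that only lets the assigned rep read the one relevant account.
db = {"cust-1": {"plan": "pro"}, "cust-2": {"plan": "free"}}
gw = PolpGateway(db, policy=lambda u, c: u == "rep-a" and c == "cust-1")
print(gw.fetch("rep-a", "cust-1"))  # {'plan': 'pro'}
```

Because the database is only reachable through the gateway, no employee or script can quietly bypass the POLP rules.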

Alternatively, companies can work these rules into their CRM software. In the example I mentioned, the system would grant data access to a rep only when it recognizes a corresponding customer support case. If an employee tries to access data that is not directly tied to a customer problem, they would encounter an additional login step like two-factor authentication.
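A sketch of that flow follows; the case records and return values are hypothetical, and a real CRM would call into its own ticketing and multi-factor authentication systems rather than these stand-ins:

```python
def request_access(rep_id, customer_id, open_cases, second_factor_ok=False):
    """Grant data access automatically only when an open support case
    links this rep to this customer; otherwise demand a second factor."""
    has_case = any(c["rep_id"] == rep_id and c["customer_id"] == customer_id
                   for c in open_cases)
    if has_case:
        return "granted"
    # No matching case: introduce friction via step-up authentication.
    return "granted-after-2fa" if second_factor_ok else "2fa-required"

cases = [{"rep_id": "rep-7", "customer_id": "cust-42"}]
print(request_access("rep-7", "cust-42", cases))  # granted
print(request_access("rep-7", "cust-99", cases))  # 2fa-required
```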

There’s no one-size-fits-all approach; rather, data access should operate on a spectrum. For one business, it may mean limiting data access to a single business account and the related set of customer information. At another company, an engineer may need access to multiple customers’ information to fix a product issue. When this happens, the data access should be both time-bound and highly visible, so that other employees can see how the data is used. There may also be times when an employee needs to access data in the aggregate to do their job — for example, to run a report. In this case, the data should always be anonymized.
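Both patterns, time-bound visible grants and anonymized aggregates, can be sketched briefly. The field names, audit log, and four-hour default below are illustrative assumptions only:

```python
from datetime import datetime, timedelta, timezone

def make_grant(engineer_id, customer_ids, hours=4):
    """Time-bound, visible grant: it expires automatically and is logged
    where colleagues can see who is using which data."""
    expires = datetime.now(timezone.utc) + timedelta(hours=hours)
    print(f"AUDIT: {engineer_id} granted access to {len(customer_ids)} "
          f"accounts until {expires:%Y-%m-%d %H:%M} UTC")
    return {"engineer": engineer_id, "customers": customer_ids,
            "expires": expires}

def anonymize(record, keep=("plan", "region", "monthly_spend")):
    """For aggregate reporting: strip identifiers, keeping only the
    non-identifying fields a report actually needs."""
    return {k: record[k] for k in keep if k in record}

grant = make_grant("eng-3", ["cust-1", "cust-2"])  # emits an AUDIT line
report_row = anonymize({"customer_id": "cust-1", "email": "a@b.co",
                        "plan": "pro", "region": "EU"})
print(report_row)  # {'plan': 'pro', 'region': 'EU'}
```

An allow-list of fields, rather than a block-list of identifiers, is the safer design choice here: new identifying fields added to a record later are dropped by default instead of leaking into reports.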

Protecting consumer data is a moral obligation, not just a legal one

The power of privacy-focused data processes and a system like the Principle of Least Privilege is that, by design, they guide employees to use data with the customer’s best interest in mind. The Golden Rule should apply: We each must treat consumer data in the way we’d want our own data used. With the right functional procedures in place, infrastructure can make responsible data access intuitive.

No company is entitled to data; every company is entrusted with it. Consumers must be aware of how their data is treated and hold companies accountable. Regulations like CCPA make this easier, but businesses must uphold their end of the bargain.

Trust, not data, is the most valuable currency for businesses today. But current data practices do nothing to earn that trust, and we can’t count on regulation alone to change that. Only practices built with privacy and transparency in mind can bring back customer trust and keep personal data protected.