Better know a CSO: Dropbox head of security Justin Berman

Justin Berman has one of the most important jobs at Dropbox.

As head of security, he oversees the company’s cybersecurity strategy and defenses, and works daily to keep the data of its more than 600 million users private and secure.

No pressure, then.

Berman joined the file storage and workspace giant a year ago during a period of transition for the company. In its early years, Dropbox was hit by a data breach that saw more than 60 million user passwords stolen, at a time when tech giants were entrenched in a “move fast and break things” culture. But things have changed, particularly at Dropbox, which made good on its promise to improve its security and also went far beyond what any Silicon Valley company had done before to better protect security researchers.

In this series, we’ll look at the role of the CSO — the chief security officer — at some of the biggest companies in tech to better understand the role, what it means to keep an organization secure without hindering growth and what advice startups can learn from some of the most experienced security professionals in the industry.

We start with Berman, who discussed in a recent interview what drew him to the company, what it means to be a security chief and what other companies can learn from Dropbox’s groundbreaking security policies.

This interview has been edited for length and clarity.

TechCrunch: You’ve been at Dropbox since June. Before this you were at Zenefits, Flatiron Health and Bridgewater. What brought you to Dropbox?

Justin Berman: First and foremost, I think the people here are amazing. And I think the problems I get to solve here are not the ones that a lot of security leaders find themselves solving. Because the company has had a historical commitment to security, privacy, trust and risk, I’m not coming in and having to build a culture of security from the ground up. That culture already exists. And the question we ask ourselves is how do we use that culture to do the right level of things, as opposed to just doing as much as possible and slowing down the business?

That’s refreshing to hear, because a lot of people say it’s either “security or usability,” and that there’s little middle ground.

I think the first goal — the foundational responsibility — is that you have to be safe enough. And then the second thing you have to do is start thinking about how can I make sure that us being safe enough keeps the business moving fast at the same time? I think there’s a false dichotomy that’s been painted in the world for a long time that security and usability are inherently at odds. Oftentimes the approach taken to security does work against usability. But a more thoughtful approach to security keeps things simple and actually increases the speed at which a company can operate.

Since the 2012 data breach, Dropbox took security and ran with it. It set up a vulnerability disclosure program so that hackers and security researchers can find and submit security flaws, and it pays them for their work. Not only that, Dropbox was the first tech giant to provide “safe harbor” provisions for good-faith security researchers. In other words, it won’t sue (where others have). That helped inspire others like Tesla and Mozilla to follow suit. How has this helped Dropbox?

You get such a diversity of talent and opinions and you get such a diversity of understanding. A variety of eyes means that you’re more likely to catch the right things. Because often the reality is all people have a set of biases. They have things that they like working on, things they’re good at and they have things they’re confident in. So when you hire a couple of consultants or pen-testers… they’re going to find the things that they care about the most and they’re going to try and hit the things that they consider part of the playbook for running vulnerability assessments. But when you have a constantly open bug bounty, all of a sudden what you really have is a ton of people who have their own unique specialties and they’re all testing you and they’ll give you all the feedback that you can handle.

So in other words, having a diverse set of hackers and perspectives will help find the greatest number of issues. Tesla and Mozilla followed in Dropbox’s footsteps. How would you make the case for other companies to follow suit?

I find the price efficiency and the return on investment to be higher. Any security program has a maximum number of dollars that’s going to be given by the organization. I’m going to use a vulnerability disclosure program more often than I’m going to use a point-in-time consulting engagement.

I speak to CSOs (and CISOs — chief information security officers) all the time. By virtue of their job, chief security officers often fall the hardest in the event of a data breach or a security lapse. And security issues are inevitable — it’s a matter of when, not if. What kind of challenges do you face as a security chief?

One of the biggest challenges security leaders have in general is the lack of access to information that they need. It has more to do with the fact that you have a set of adversaries who care about your organization and care to attack you — whether it’s data or money — or whatever their objectives are. But even if you look at the current marketplace for threat intelligence, the best that you’re getting is a slightly data-informed guess. And so I’m constantly trying to find new sources of information to help better inform us so that we’re actually making the right investment.

So how do you measure success as a CSO?

I think it’s important that we redefine the world away from the “stop all breaches” approach to be broader. You have to make sure the organization is safe enough to continue to operate. A good CSO is a good business leader, not just a tech leader. I’m not just building a team of amazing security engineers, I actually have to help shepherd the organization at large. Really, you have to think about multiple metrics. Yes, of course there’s the “don’t get breached” metric. But given all the information we can pull in from the world about who our adversaries are and what they’re trying to do, how likely do we think that a breach is going to happen at Dropbox and how do I reduce that to a level where we can actually balance the risk that’s posed there, versus the risk of say not hiring or whatever else?

You’re constantly updating your view of what your adversaries are. It’s not something that decreases over time, generally. As long as you’re continually updating your views and constantly refining your understanding, then the metric you’re aiming at gets harder and harder over time, and that means the investments that you have to make increase. But you’re constantly modulating that based on things like, “is this risk worth taking” or “is it not worth taking,” as opposed to “are we breached” versus “are we not breached yet?”

Dropbox’s 2012 breach was way before your time. Almost a decade later, Dropbox’s security posture clearly looks a lot better, not least externally through the company’s open-door policy of accepting bug reports and paying researchers for their findings.

What lessons has Dropbox learned from the data breach? And is it still something that’s used as a “lessons learned” in the company’s approach to security today?

I personally, as a CSO, shy away a lot from fear, uncertainty and doubt as a methodology for motivating the organization. Dropbox learned the lesson from that breach by integrating security into its culture, along with privacy and risk management. The thing I think is more interesting is whenever there’s a new large breach, it’s important that the security team retrospect internally about what that means for us. Do we feel resistant, or do we feel we have a problem waiting to happen?