Editor’s note: Arsalan Farooq is CEO and founder of Convirture.
So goes the seemingly endless parade of headlines presenting the ways in which cloud and online computing have betrayed us: NSA snooping inside the cloud; security breaches like Apple’s iCloud; foreign espionage and cyber attacks; OpenSSL/Heartbleed; the Shellshock Bash bug; SSL/POODLE; leaked Dropbox passwords.
While any single incident can be dismissed as a “one-off” problem, collectively, the incidents send a message to IT managers and the general public: The cloud cannot be secured and therefore cannot be trusted. The bad guys just move too fast and always stay one step ahead.
As a result, the public and IT managers alike now increasingly hold organizations accountable for the security of data in the cloud. And in light of NSA surveillance, corporate IT buyers, especially in Europe, are taking a dimmer view of any cloud-computing assets located on American soil or owned by American providers.
Soon, cloud customers could decide they’ve had enough. Better to lock things down as tight as possible inside one’s own data center than to trust your business, personal data or reputation to a nebulous cloud, the porosity of which varies from provider to provider and from week to week.
Looked at through this lens, the cloud could be poised to go down in flames like so many hot tech concepts that preceded it (think SOA, ASP, buying pet food online).
But the numbers tell a different story: According to IDC, spending on public cloud services alone will hit $107 billion in 2017. That’s because, despite all the concerns, the cloud makes sense from the perspective of economics, technology and culture.
From a technology standpoint, the cloud is easy to use, fast to scale and flexible as an “on-demand” solution. As for security, the blame often falls on everything from sloppy implementation to poor password hygiene – problems not unique to cloud computing.
Economically, the cloud gives organizations a way to replace the high capital cost of establishing and maintaining a data center with a subscription model. That’s too compelling to pass up, at least for now.
Culturally, technology has always had an adoption curve that puts it ahead of public acceptance. For example, when browser “cookies” first became widely known in the mid-90s, reports filled the press about their invasion of our privacy.
The U.S. Federal Trade Commission held hearings about them. People wrote to their elected officials. Today, cookies are as much a part of the web experience as the browser itself. We have accepted the incremental loss of privacy as part of having access to the World Wide Web essentially at no dollar cost.
A bigger cultural question worthy of its own discussion is whether people brought up in an all-digital, always-connected world, have the inclination to question the security implications of that world.
If you’re less than 30 years old, you can’t remember a time without the Internet. Not too long ago, only technology professionals interacted with computers. Now everyone does. And even the best technology can’t account for the weakest point of failure in any technological system: the end user. If the underlying technology is not keeping up with today’s demands, how would anyone know that, especially someone for whom “online” has always been the norm?
The same cultural shift is happening with cloud computing. Not so long ago, businesses could not conceive of running IT resources anywhere but in a physical data center that they owned or “rented” from a provider.
Today, tens of thousands of businesses are forgoing the purchase of their own computing, storage and application resources and just “getting” these from the cloud or “placing” them in the cloud. From CIOs to security chiefs and compliance officers to IT staff to business line workers, the “where” and “how” no longer matter.
Only “what” and “what it costs” matter. As part of that cost, providers and consumers need to accept that their data may not be 100 percent secure or private (as we know, nothing is), but that it may be secure and private enough to make economic sense.
Another broader cultural shift affects the adoption of cloud computing: the move to the “pay as you go” economy. In greater numbers, people (especially the sub-30-year-old set) are renting houses instead of buying, sharing cars instead of owning, and subscribing to streaming music rather than buying CDs and downloads. The “use what you need” mentality in the mainstream population will translate into a cloud computing model for startups and more established businesses alike.
That’s why the cloud isn’t going anywhere and in fact will continue to grow as a critical IT infrastructure tool. We may have reached the end of “Cloud 1.0,” because, as we know too well, nothing in technology ever remains the same. But cloud breaches such as the iCloud photo situation or widely publicized storage hacks will only cause technology providers and end-users to develop better systems and increase their vigilance. Behavior will adapt to current technology realities, just as it has since the invention of fire.
The free market can sort all this out. If people feel that Apple’s iCloud is not secure enough, they’ll put their photos somewhere else. Or just keep them on their phones and computers. For enterprises, that same calculus exists, albeit on a much larger scale.
If a startup is going to bet its future and its investors’ capital on Amazon Web Services, then AWS better have terms of service that ensure reliability, security, privacy and compliance. If not, Google, Microsoft, IBM or Oracle will happily step in with their own cloud offerings.
And step in they will. Because no matter who is delivering the cloud, it is here to stay. It’s really too good a concept to go away. When it comes to the cloud, I’d say we’re entering inning No. 2 of a nine-inning ball game.