IBM Goes Deep With Comprehensive Cloud Initiative

IBM has been setting its sights on helping customers manage the hybrid cloud world, and today it announced a series of initiatives it hopes will make it simpler to manage a broad range of resources by treating them as a single entity.

The hybrid cloud is a term used to describe a mixed computing environment that can include any combination of public cloud resources, private clouds and on-premises data centers.

Angel Diaz, VP of cloud architecture and technology at IBM, says the idea is to make it as easy as possible to manage resources across a range of platforms and cloud types. That means between public and private clouds, across data centers and even from cloud to cloud, with the ultimate goal of having it all behave as a single infrastructure, regardless of location.

IBM is attempting to address an issue nearly every company faces as it shifts to the cloud. Most companies today run a mix of computing types in a hybrid approach, and they face challenges pulling data from these disparate sources and sharing it across environments while making it all work together.

IBM has broken these issues down into three main problem areas, Diaz explained. First, it needs to find ways to integrate data with a company's systems of record. Next, it wants to make it easier to access data, regardless of the system or platform, and pull it together wherever it's needed. Finally, with applications residing in different places across clouds and on-premises locations, often around the world, it wants to help deliver the data to the application whenever and wherever it's needed.

That’s a complex set of problems and IBM announced several ways it intends to solve them.

The first point of attack involves container technology. IBM has partnered with Docker to create IBM-defined Dockerized containers for the enterprise. It has designed these containers to work whether applications run on-premises or in the cloud, and to integrate into existing processes such as security, governance and systems of record.

The next piece is called IBM DataWorks, which has been designed to help developers work with disparate data sources: it maps the connections between applications and data locations, then uses that mapping to pull the data together automatically and securely.
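IBM hasn't published what the DataWorks mapping looks like, but the underlying idea, an application declaring where each piece of data lives while a small runtime resolves those names to connectors and merges the results, can be sketched in plain Python. Every name below is hypothetical; none of this is the actual DataWorks API.

```python
# Hypothetical sketch of a data-mapping layer: applications declare which
# logical sources hold each field, and a runtime pulls the data together.
from typing import Any, Callable, Dict

# Registry of named data sources. In a real system these would be
# connectors to databases, SaaS APIs, or on-premises systems of record.
SOURCES: Dict[str, Callable[[str], Dict[str, Any]]] = {}

def register_source(name: str, fetch: Callable[[str], Dict[str, Any]]) -> None:
    """Register a connector under a logical source name."""
    SOURCES[name] = fetch

def pull_record(mapping: Dict[str, str], key: str) -> Dict[str, Any]:
    """Assemble one record from the sources the mapping names.

    mapping: {field_name: source_name} -- the application's declaration
    of where each piece of data lives.
    """
    record: Dict[str, Any] = {}
    for field, source_name in mapping.items():
        source = SOURCES[source_name]           # resolve the logical name
        record[field] = source(key).get(field)  # fetch only what is needed
    return record

# Toy connectors standing in for a cloud CRM and an on-prem database.
register_source("crm",    lambda key: {"name": "Acme Corp", "tier": "gold"})
register_source("onprem", lambda key: {"balance": 1250.0})

# The application's mapping: name and tier come from the CRM,
# balance from the on-premises system of record.
customer = pull_record({"name": "crm", "tier": "crm", "balance": "onprem"},
                       key="cust-42")
print(customer)  # {'name': 'Acme Corp', 'tier': 'gold', 'balance': 1250.0}
```

The point of the indirection is that the application only ever names logical sources; swapping a connector from an on-premises database to a cloud service changes the registry, not the application.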

IBM plans to tie this data piece into Watson to provide access to APIs that use the data in an intelligent fashion, not unlike what Microsoft announced last week with the Azure Machine Learning platform. Diaz tried to draw a distinction between Watson and the Microsoft product, but there are similarities. “What we can do with Watson is not just make sense of the data, but also draw correlations and give [users] probability about what is correct and what isn’t.”

There are other pieces in play here, but one important one is BlueMix Local, which provides a way to set up BlueMix, IBM's platform-as-a-service offering, so that you can pick and choose where application components live. Instead of just the public cloud, you can spread an application across a mix of on-premises and cloud resources, depending on a company's individual requirements.
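IBM didn't detail how that pick-and-choose placement is expressed, but the idea can be illustrated with a hypothetical deployment descriptor: each component declares whether it runs in the public cloud or on-premises, and a planner groups and validates the declarations. This is an illustration only, not BlueMix Local's actual configuration format.

```python
# Hypothetical per-component placement plan: each part of an application
# declares where it should run, e.g. to satisfy data-residency rules.
from typing import Dict, List

PLACEMENTS = {"public-cloud", "on-premises"}

def plan_deployment(components: Dict[str, str]) -> Dict[str, List[str]]:
    """Group components by declared placement, rejecting unknown placements."""
    plan: Dict[str, List[str]] = {p: [] for p in PLACEMENTS}
    for component, placement in components.items():
        if placement not in PLACEMENTS:
            raise ValueError(f"unknown placement {placement!r} for {component}")
        plan[placement].append(component)
    return plan

# Web tier in the public cloud; the customer database stays on-premises
# to meet the company's data-residency requirements.
app = {
    "web-frontend": "public-cloud",
    "api-gateway":  "public-cloud",
    "customer-db":  "on-premises",
}
print(plan_deployment(app))
```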

Diaz says that with BlueMix Local, IBM is attempting to provide “visibility, control and intent of [the] application in [a] seamless way.”

As with any IBM effort, the set of tools is broad and includes a range of partners. The world has changed dramatically since the days when you bought everything from IBM; now even companies and projects that might seem to compete with IBM are working together in this mix.

Diaz says that today, companies have to work together to overcome the growing complexity customers face; nobody can do it alone. “No single vendor can innovate everything and people who believe that are hallucinating,” he said.

IBM is hoping by teaming up with a broad set of partners and offering a deep set of tools, it can help solve a complex set of problems. Time will tell if it works.