CloudPhysics has a vision for the data center: it wants to give sysadmins clarity and insight into the bottlenecks that crop up in a virtualized environment, offer solutions when problems occur, and help prevent them in the first place, all delivered as a cloud service.
And Jafco Ventures is giving the company $15 million in Series C funding to continue that work, joined by prior investors Kleiner Perkins Caufield & Byers and Mayfield Fund. The latest round brings CloudPhysics' total funding to $27.5 million, more than doubling what it had raised previously.
Alongside the funding, the company announced the latest version of the CloudPhysics platform, which lets sysadmins predictively troubleshoot storage problems in a virtualized environment.
John Blumenthal, CloudPhysics CEO and co-founder, says the company has been working on the new release throughout this year and chose storage deliberately: its data showed that storage was a major pain point for customers.
The beauty of the CloudPhysics approach, he says, is that it gives data center employees a single view of a problem. Instead of logging on to multiple systems and trying to piece things together, admins see the problem laid out visually, with an explanation of what's happening and what to do to fix it.
The company combines data from its entire network of customers with the data it collects from each individual customer's data center, building a big-data corpus it can use to make predictions about data center behavior and present them visually (as in the example above).
Blumenthal says this approach generates a social aspect, too. Because of the highly visual nature of the information, it helps what have been traditionally disparate teams, such as storage and networking, see the nature of the problem and how to fix it — even in cases where they don’t really understand what the other team does.
It’s all laid out there for them, and Blumenthal believes that should go a long way toward improving communication and resolving problems in virtualized environments more quickly. He also predicts it will usher in the age of the data center generalist: people will no longer be confined to narrowly defined areas of expertise, because the data will give them the information and instructions they need to proceed without understanding the nuances of each piece.
He says the system can even help sysadmins decide whether, given their workloads and requirements, investing in solid-state drives would deliver enough bang for the buck to justify the cost.
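The article doesn't describe how such a recommendation is computed, but the underlying trade-off can be sketched with a toy model that weighs drive cost against the I/O time a workload would save. Everything below (the function name, the numbers, the break-even rule) is a hypothetical illustration, not CloudPhysics' actual analysis:

```python
# Toy cost/benefit model for an SSD upgrade decision.
# Hypothetical illustration only; not CloudPhysics' methodology.

def ssd_worth_it(iops_demand: float,
                 hdd_latency_ms: float,
                 ssd_latency_ms: float,
                 ssd_cost_usd: float,
                 value_per_saved_hour_usd: float,
                 horizon_days: int = 365) -> bool:
    """Return True if I/O time saved over the horizon outweighs the drive cost."""
    saved_ms_per_io = hdd_latency_ms - ssd_latency_ms
    ios_over_horizon = iops_demand * 86_400 * horizon_days
    saved_hours = ios_over_horizon * saved_ms_per_io / 1000 / 3600
    return saved_hours * value_per_saved_hour_usd > ssd_cost_usd

# Example: a 2,000-IOPS workload, 8 ms HDD latency vs 0.1 ms SSD latency.
print(ssd_worth_it(iops_demand=2_000, hdd_latency_ms=8.0,
                   ssd_latency_ms=0.1, ssd_cost_usd=5_000,
                   value_per_saved_hour_usd=1.0))
```

The point of such a model is that the decision hinges on measured workload data (IOPS, latency) rather than intuition, which is exactly the kind of input a monitoring platform already has.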
Blumenthal says that the data center is simply too complex and generates too much data today for humans to keep up without help from a system like his. “They are all emitting immense amounts of data. There is tons of data that exists. Without a model to understand how these layers come together, which none of the vendors have created, there isn’t a way to get ahead of it.”
To illustrate the scale, CloudPhysics says it collects 140 billion samples a day, 50 trillion to date, and that number will only grow as more customers, partners and third parties use the platform.
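A quick back-of-the-envelope calculation, using only the two figures quoted above, shows what that claim implies: roughly 1.6 million samples arriving every second, with the 50 trillion total corresponding to about a year of collection at the current pace:

```python
# Back-of-the-envelope arithmetic on the figures CloudPhysics cites.
samples_per_day = 140e9   # 140 billion samples collected daily
total_samples = 50e12     # 50 trillion samples collected to date

samples_per_second = samples_per_day / 86_400        # seconds in a day
days_of_collection = total_samples / samples_per_day

print(f"~{samples_per_second / 1e6:.1f} million samples/second")
print(f"~{days_of_collection:.0f} days to reach 50 trillion samples")
```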
Blumenthal says the company is highly influenced by companies like New Relic and AppDynamics, and he likes what they are doing, but he doesn't necessarily see them as competitors because they deliver information to developers higher up in the stack.
People like to compare CloudPhysics to Splunk and Sumo Logic, he says, but he doesn't think that comparison holds either: those companies take unstructured data, index it and search against it, whereas CloudPhysics works with structured data and presents information directly to the user.
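The distinction is worth unpacking: log-analytics tools ingest free-form text that must be parsed and indexed before it can be queried, while a metrics platform receives already-typed records it can evaluate directly. A minimal illustration of the two shapes of data, with all field names and values invented for the example:

```python
# Unstructured: a raw log line; the latency value is buried in free text
# and would have to be parsed out before any query could use it.
log_line = "2014-11-06 12:01:33 WARN datastore-7 latency spike 840ms on lun-12"

# Structured: the same event as typed fields, queryable as-is.
sample = {
    "timestamp": "2014-11-06T12:01:33Z",
    "metric": "datastore.latency_ms",
    "source": "datastore-7",
    "value": 840.0,
}

# With a structured sample, a threshold check is a direct field comparison.
print(sample["value"] > 500)
```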
Blumenthal says the storage piece is just the beginning; the company hopes to eventually offer tools for analyzing the entire data center. The platform is built so that customers, partners and third parties can build their own tools on top of it, and he hopes that will happen over time.