Snowflake and Databricks are putting the data stored in their services to work

Both companies are helping customers build generative AI apps

Snowflake and Databricks are strikingly similar companies. While each positions itself a bit differently, both provide data storage, processing and governance in a cloud context. Both are holding customer conferences this week, and both are looking for ways to help customers build generative AI and other intelligent applications on top of the data stored in these platforms.

If that wasn’t clear before, it became even more apparent this week when Databricks announced it was acquiring MosaicML for a cool $1.3 billion. That’s a lot of money for a startup, even a well-capitalized one like Databricks. The move came weeks after the company released Dolly, an open source LLM, and acquired Okera, an AI governance tool.

Snowflake announced last month that it was buying Neeva, giving it a search tool and some high-end AI engineering talent. The company also bought Streamlit last year, which lets companies build applications from the data stored in Snowflake, and on Wednesday, it announced a new container service and partnership with Nvidia, giving customers a way to build generative AI applications and run them on Nvidia GPUs.

All of these moves (and others) are designed with one thing in mind: to use the data stored in these services as fuel for machine learning models, especially large language models. Both companies want to help customers take advantage of all this data stored on their platforms.

Nvidia’s VP of enterprise computing, Manuvir Das, speaking in the context of Wednesday’s partnership announcement with Snowflake, sees the move toward more practical use of the data as a logical progression for Snowflake.

“The fact that Snowflake is now moving in this next step where they’re saying, OK, not only can you keep your data here and do sort of the obvious data processing things on it, but this is the place where you can build all the applications that drive your company because your data is right here. That’s a very powerful thing,” Das told TechCrunch+.

Similarly, Databricks increasingly sees itself not just as a place to store data and handle the various tasks associated with it, but as the foundation of a whole data stack, with applications built on top.

This week’s MosaicML acquisition was part of this broader strategy to put the data to work in an AI context, said Ray Wang, founder and principal analyst at Constellation Research. That’s something that was hard for Databricks to do, even with Dolly.

“The AI angle is all about making it easy to acquire, manage, train and deploy LLMs with ease,” Wang said.

Both companies are clearly moving hard toward AI through acquisitions, partnerships and product development. But what does that mean from a revenue perspective for the future of these companies, one of which is already public and one that will surely get there eventually?

Enterprise AI demand is not illusory

Databricks and Snowflake are both growing very quickly. The latest information from Databricks indicates that in its most recent fiscal year, it generated more than $1 billion in revenue, growing at more than 60%. Snowflake’s results are similarly impressive, posting $623.6 million in revenue in its most recent quarter, up 48% compared to the year-ago period.

As steep as those numbers are, growth rates tend to decelerate as companies scale. This is easy to understand: Imagine a business with $1 billion in ARR that adds $1 billion in ARR per year. It would grow 100% in the first year, then 50% in the second and just 33% in the third. To maintain year-over-year growth rates, companies must expand how much revenue they add each year, a challenge that only gets harder over time. This is why Microsoft is growing in the single digits, and Salesforce is clinging to a 10% yearly growth rate like its life depends on it.
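To make that deceleration concrete, here is a minimal sketch of the arithmetic above, using the same hypothetical figures (a flat $1 billion of new ARR added every year):

```python
# Minimal illustration of growth-rate deceleration: a business at $1B ARR
# that adds a flat $1B of new ARR each year (hypothetical figures only).
arr = 1.0                # starting ARR, in billions
new_arr_per_year = 1.0   # constant amount of new ARR added annually

for year in range(1, 4):
    growth = new_arr_per_year / arr  # year-over-year growth rate
    arr += new_arr_per_year
    print(f"Year {year}: grew {growth:.0%}, ending ARR ${arr:.0f}B")

# Year 1: grew 100%, ending ARR $2B
# Year 2: grew 50%, ending ARR $3B
# Year 3: grew 33%, ending ARR $4B
```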

For Databricks and Snowflake, then, finding new revenue streams or charging more for their existing services is crucial to maintaining their growth rates; if they don’t find new things to sell, they could be repriced negatively by investors who see less future value in their businesses. Both have managed impressive revenue growth to date, but if a new, large package of products to sell presents itself, it’s a bet worth making.

The question is therefore whether demand for generative AI products in the enterprise is real. To answer that question, we have to talk a little about how MosaicML fits into its potentially new parent company. While Databricks has done work to build and release its own LLM, and has data storage, management and governance tools in place, it did not have a simple way to help customers pre-train models for their own use. Databricks CEO Ali Ghodsi says that MosaicML fills that gap.

In his telling, LLMs straight out of the box aren’t too useful until they are trained on a relevant dataset. That work helps set the weighting of the model’s parameters, which are later fine-tuned for tasks like conversing in English. But without a strong method of pre-training, the Databricks LLM stack was missing a key piece.

Snowflake is also crafting a vertical LLM stack inside of its own platform, which includes Streamlit and Neeva.

“The popularity of Streamlit to build front ends for machine learning models is through the roof,” said Christian Kleinerman, Snowflake’s senior VP of products. But a front end is just that — it needs guts behind it. Thanks to its partnership with Nvidia, Snowflake can offer its customers access to the NeMo framework inside of its own service, helping them build refined machine learning models based on that data.

Snowflake has data and governance sorted; it has NeMo to help its customers build their own models on their own information, and Streamlit to make the end product usable. Neeva provides the ability to search that data, plus a lot of AI smarts from its engineering team.

The issue, in Kleinerman’s view, is that while LLMs “demo very well,” they can also hallucinate. This is where the Neeva buy comes in: “What the Neeva team did that is remarkable is they were able to figure out how to combine traditional information retrieval technologies with LLMs,” combating the issue of large language models getting too creative in their answers.
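As an illustration only, here is a minimal sketch of that general pattern, often called retrieval-augmented generation: rank documents with a classic retrieval method (TF-IDF here as a stand-in), then hand the top hits to the model as grounding context. This is not Neeva’s or Snowflake’s implementation; the corpus, query and call_llm() helper are placeholders.

```python
# Generic sketch of pairing traditional information retrieval with an LLM.
# Not Neeva's or Snowflake's implementation; corpus, query and call_llm()
# are placeholders for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "Q3 revenue grew 48% year over year to $623.6 million.",
    "The container service lets customers run GPU workloads in-platform.",
    "Streamlit is used to build front ends for machine learning models.",
]

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents against the query with TF-IDF and return the top k."""
    vectorizer = TfidfVectorizer()
    matrix = vectorizer.fit_transform(docs + [query])
    scores = cosine_similarity(matrix[-1], matrix[:-1]).flatten()
    return [docs[i] for i in scores.argsort()[::-1][:k]]

def build_prompt(query: str, context: list[str]) -> str:
    """Ask the model to answer from retrieved text instead of from memory."""
    joined = "\n".join(f"- {c}" for c in context)
    return (
        "Answer using only the context below. If the answer is not there, say so.\n"
        f"Context:\n{joined}\n\nQuestion: {query}"
    )

query = "How fast did revenue grow last quarter?"
prompt = build_prompt(query, retrieve(query, corpus))
# answer = call_llm(prompt)  # hypothetical LLM call, provider-specific
print(prompt)
```

The design point is simply that the model is asked to answer from retrieved text rather than from its own memory, which narrows the room for it to get too creative.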

Taken as a group, both companies’ purchases and partnerships will make sense if, by filling in the missing links in their data stacks, they can sell generative AI tools to enterprise customers and generate lots of new income.

So, the question concerning the MosaicML deal and the related purchases by Snowflake is less about their individual purchase prices, or even precisely where they fit into existing product matrices; what we really need to know is how serious enterprise demand is for generative AI products powered by LLMs. That will tell us how well the individual bets are likely to pay out.

It’s too soon to be entirely confident. But Ghodsi provided some useful color, saying that in the past half year, “literally” every conversation he has had with customers has “ended up being about generative AI,” with customers asking questions about whether they can have their own model, train it themselves, afford it and if the end product will be any good.

Ghodsi also underscored the point, saying that generative AI is now a CEO priority, comparing its rise in C-suite importance to that of data itself some years back. From his perch, Ghodsi doesn’t see the enterprise generative AI push slowing or devolving into a series of low-dollar tests at major companies that are used more as earnings-report fodder than as a real tool.

Sometimes the best defense is an aggressive offense

If you are a company that helps customers store, manage and govern their data, you are in a place of potential power. By offering services like AI and ML tooling on top of customer data, Databricks and Snowflake are working to ensure that their data management offerings are not commoditized.

There’s probably something of an example in both Box and Dropbox regarding the need to do more than merely store and index corporate data. Both companies are doing lots of work to move past their storage roots, with a reasonable degree of success. Databricks and Snowflake are trying to make that transition well before their revenue growth slows to similar levels, which means they have a lot more value in their share prices to throw around.

So, to avoid commoditization, a deeper push into generative AI services makes sense, given what Ghodsi is seeing from his own customer base; companies want this, and they want to use their own data to train their own models. Presto: Snowflake and Databricks become the LLM partner their customers need, so those customers don’t have to go shopping elsewhere.

We’ll be able to track the impact of the two companies pushing into the LLM game. With Snowflake, we have regular earnings calls and financial disclosures. With Databricks, well, we have to wait a bit for those, but we’ll get them. For now it is clear that data-focused software companies expect the new AI market to be large and lucrative, and they are putting their time and money into the bet.