Modular secures $100M to build tools to optimize and create AI models

Modular, a startup creating a platform for developing and optimizing AI systems, has raised $100 million in a funding round led by General Catalyst with participation from GV (Google Ventures), SV Angel, Greylock and Factory.

The round brings Modular’s total raised to $130 million. The proceeds will go toward product development, broader hardware support and the growth of Modular’s programming language, Mojo, CEO Chris Lattner says.

“Because we operate in a deeply technical space that requires highly specialized expertise, we intend to use this funding to support the growth of our team,” Lattner said in an email interview with TechCrunch. “This funding will not be primarily spent on AI compute, but rather improving our core products and scaling to meet our incredible customer demand.”

Lattner, an ex-Googler, cofounded Palo Alto-based Modular in 2022 with Tim Davis, a former colleague from the tech giant’s Google Brain research division. Both felt that AI was being held back by an overly complicated and fragmented technical infrastructure, and they started Modular to remove the complexity of building and maintaining AI systems at scale.

Modular provides an engine that aims to improve the inference performance of AI models on CPUs (and, beginning later this year, GPUs) while cutting costs. Modular’s engine, currently in closed preview, is compatible with existing cloud environments, with machine learning frameworks like Google’s TensorFlow and Meta’s PyTorch, and even with other AI accelerator engines; it lets developers import trained models and run them up to 7.5 times faster than on their native frameworks, Lattner claims.

Modular’s other flagship product, Mojo, is a programming language that aims to combine the usability of Python with features like caching, adaptive compilation techniques and metaprogramming. Mojo is currently available in preview to “hundreds” of early adopters, and Modular plans to make it generally available early next month.
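To give a flavor of the pitch, here is a minimal, hypothetical sketch in plain Python of the kind of numeric inner loop that languages like Mojo target; the function is invented for illustration and is not taken from Modular’s materials. The idea Modular is selling is that Python-style code like this could gain optional static typing and compilation without being rewritten in C++.

```python
# Illustrative only: a plain-Python inner loop of the kind Mojo aims to speed up
# by layering optional static types and compilation onto Python-style syntax.
def matmul(a: list[list[float]], b: list[list[float]]) -> list[list[float]]:
    # Naive matrix multiplication: out[i][j] = sum over k of a[i][k] * b[k][j].
    rows, inner, cols = len(a), len(b), len(b[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for k in range(inner):
            aik = a[i][k]
            for j in range(cols):
                out[i][j] += aik * b[k][j]
    return out

if __name__ == "__main__":
    # Expected output: [[19.0, 22.0], [43.0, 50.0]]
    print(matmul([[1.0, 2.0], [3.0, 4.0]], [[5.0, 6.0], [7.0, 8.0]]))
```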

“Our developer platform enables our customers, and the world’s developers, to defragment their AI technology stacks — pushing more innovations into production faster and realizing more value from their investment in AI,” Lattner said. “We’re attacking the complexity that slows AI development today by solving the fragmentation issues that plague the AI stack, starting with where AI software meets AI hardware.”

Ambitious much? Perhaps. But none of what Modular, a company of roughly 70 employees, is proposing is out of the realm of possibility.

Deci, backed by Intel, is among the startups offering tech to make trained AI models more efficient — and performant. Another in that category is OctoML, which automatically optimizes, benchmarks and packages models for an array of different hardware.

In any case, to Lattner’s point, AI demand is fast approaching the limits of sustainability — making any tech to cut down on its compute requirements hugely desirable. The generative AI models in vogue today are 10 to 100 times bigger than older AI models, as a recent piece in The Wall Street Journal points out, and much of the public cloud infrastructure wasn’t built for running these systems — at least not at this scale.

The strain is already showing. Microsoft is facing a shortage of the server hardware needed to run AI so severe that it might lead to service disruptions, the company warned in an earnings report. Meanwhile, the sky-high appetite for AI inferencing hardware (mainly GPUs) has driven GPU provider Nvidia’s market cap to $1 trillion. But Nvidia has become a victim of its own success; the company’s best-performing AI chips are reportedly sold out until 2024.

For these reasons and others, more than half of AI decision makers in top companies report facing barriers to deploying the latest AI tools, according to a 2023 poll from S&P Global.

“The compute power needed for today’s AI programs is massive and unsustainable under the current model,” Lattner said. “We’re already seeing instances where there is not enough compute capacity to meet demand. Costs are skyrocketing and only the big, powerful tech companies have the resources to build these types of solutions. Modular solves this problem, and will allow for AI products and services to be powered in a way that is far more affordable, sustainable and accessible for any enterprise.”

Modular’s Mojo programming language, a “fast superset” of Python. Image Credits: Modular

That’s reasonable. But I’m less convinced that Modular can drive widespread adoption of its new programming language, Mojo, when Python is so entrenched in the machine learning community. According to one survey, as of 2020, 87% of data scientists used Python on a regular basis.

But Lattner argues that Mojo’s benefits will drive its growth.

“One thing that is commonly misunderstood about AI applications is that they are not just a high-performance accelerator problem,” he said. “AI today is an end-to-end data problem, which involves loading and transforming data, pre-processing, post-processing and networking. These auxiliary tasks are usually done in Python and C++, and only Modular’s approach with Mojo can bring all these components together to work in a single unified technology base without sacrificing performance and scalability.”
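To make that concrete, the sketch below is a hypothetical, plain-Python caricature of the “auxiliary tasks” Lattner is describing: request parsing, pre-processing, a stand-in model call and post-processing, each living in its own piece of glue code rather than in one unified stack. The function names and the toy model are invented for illustration and have nothing to do with Modular’s actual APIs.

```python
# Hypothetical sketch of the Python "glue" around a model call: loading,
# pre-processing, inference and post-processing as separate, loosely coupled
# steps, often mixing Python with C++ extensions under the hood in practice.
import json

def preprocess(raw: bytes) -> list[float]:
    # Parse an incoming request and normalize pixel values into model features.
    record = json.loads(raw)
    return [float(x) / 255.0 for x in record["pixels"]]

def run_model(features: list[float]) -> list[float]:
    # Placeholder for a framework call (e.g. a TensorFlow or PyTorch model).
    return [sum(features) / max(len(features), 1)]

def postprocess(scores: list[float]) -> dict:
    # Turn raw scores into an API response.
    return {"score": scores[0], "label": "positive" if scores[0] > 0.5 else "negative"}

if __name__ == "__main__":
    request = json.dumps({"pixels": [0, 128, 255]}).encode()
    print(postprocess(run_model(preprocess(request))))
```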

He might be right. The Modular community has grown to more than 120,000 developers in the four months since the company’s product keynote in early May, Lattner claims, and “leading tech companies” are already using the startup’s infrastructure, with 30,000 on the waitlist.

“The most important enemy of Modular is complexity: complexity in software layers that only work in special cases, software that’s tied to specific hardware and complexity driven by the low-level nature of high-performance accelerators,” he said. “The very thing that makes AI such a powerful and transformative technology is the reason it requires so much effort to reach scale, so much talent invested in building bespoke solutions and so much compute power to deliver consistent results. The Modular engine and Mojo together level the playing field, and this is just the start.”

And — at least from a funding standpoint — what an auspicious start it is.