Startup announces zero emissions model building solution

As companies increasingly move to take advantage of machine learning to run their businesses more efficiently, the fact is that it takes an abundance of energy to build, test and run models in production. One early-stage full-stack MLOps startup is building a greener approach.

Today, the company announced a zero emissions AI cloud solution in conjunction with Finnish cloud infrastructure partner atNorth.

The company says that atNorth is providing a Tier 3, ISO 27001-certified data center running Nvidia A100-powered DGX and HGX systems. The data center offers 80MW of power capacity, which runs entirely on geothermal and hydro energy. What’s more, due to its location near the Arctic, it provides essentially free cooling, making it an energy-efficient place for customers to build machine learning models using the company’s platform.

Company co-founder Max Prasolov says that after researching the problem, they found that computing and telecommunications account for around 9% of total energy consumption worldwide, a figure their research suggests could double in the next decade. They believe that machine learning model building will be a growing part of that, and they decided to team with atNorth to reduce their own carbon footprint.

“We decided just to move all our operations and all our experiments to the zero emission cloud. And the goal is not to be carbon neutral because we [know that we could] buy credits and compensate for [our usage]. The question is how to [achieve] zero emissions? We [realized] we spent so much energy and so much computing power to train our models for clients, and we understand that is definitely the biggest carbon footprint we [are producing],” Prasolov said.

In the process, the company also came up with a way for its software to build models more efficiently, which in turn reduces the amount of energy needed and lets it deliver an even more sustainable offering.

In terms of the product itself, the company is offering a flexible, cloud-native service in which it provides some of the tooling, but leaves enough room for customers to fill in the pieces they think work best for them.

“The way that we approach it is instead of trying to build every single tool that needs to be built from data ingestion to monitoring to explainability to building pipeline engines, etc., we’re all about interoperability. We build what hasn’t been built, and we connect to the universe of Kubernetes-based tools that are already out there,” company co-founder Arthur McCallum explained.

The startup currently has a commercial solution, but it is working on an open source version of the stack that it plans to release shortly, probably before the end of the year. The company’s goal is to provide a cloud-based AI solution for smaller cloud vendors beyond the big three of Amazon, Microsoft and Google, including regional vendors across the world. The company launched in 2019 and came out with the first version of its solution last year. It has raised $2.3 million in seed funding so far, according to the company.