Sponsored Content

How AI helps Domino’s predict when 3 billion pizzas are ready to go

by Zachary Fragoso, Manager, Data Science and AI, Domino’s

Zachary Fragoso is a manager of data science and AI at Domino’s in Ann Arbor, Michigan.

AI is ready for the enterprise. I know because I’m part of a team at Domino’s that’s delivering business results with the technology today.

I won’t claim it’s easy, but I do know success is out there for people ready to do the work needed to master this emerging approach to computing. I hope our experiences can inspire others to experiment with AI in their business, and figure out what works best for them.

Leveraging Docker and our NVIDIA DGX-1 cluster, Domino’s data science team has standardized on one way of building AI models and deploying them as APIs inside well-packaged containers. Having a system to manage a model’s lifecycle greatly eased the transition from development to integration with our production platforms, which are managed by our business partners.
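To give a flavor of what that packaging can look like, here is a minimal sketch of a prediction API of the kind you might wrap in a Docker image. The framework choice, endpoint name, and model file below are illustrative assumptions, not our actual service.

```python
# Minimal sketch of a containerized prediction API (illustrative only).
# The endpoint name, model path, and payload format are assumptions.
from flask import Flask, request, jsonify
import joblib

app = Flask(__name__)
model = joblib.load("model.joblib")  # model artifact baked into the Docker image

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json()
    features = [payload["feature_values"]]      # expects a flat list of numeric features
    prediction = model.predict(features)[0]
    return jsonify({"prediction": float(prediction)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

Because everything the service needs ships inside the image, the same artifact that passes review in development is the one our partners run in production.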

As a result of that work, Domino’s has been able to boost productivity. We train and iterate on AI models faster and get them deployed to inference systems at lightning speed, accelerating time to positive business outcomes.

The big challenge for us was integrating our AI-enabled microservices into our e-commerce and in-store platforms. The teams running these platforms have been keeping one of the largest global e-commerce sites running for decades.

Our data science team, while quite mature, is relatively new. So we had to learn about software development at scale. Our internal partners had to learn about model development cycles. With a little flexibility, both sides succeeded.

Scoring with AI at the Super Bowl

We have a diverse team at Domino’s, but we’re all united on the mission to make the customer experience and store operations better.

AI has helped us achieve this mission: by rewarding customers for ordering pizza, by giving them a better estimate of when their order will be ready, and by improving their phone ordering experience. Routing orders more efficiently even helps our drivers get more tips!

Points for Pie, Domino’s highest-profile AI project to date, launched around last year’s Super Bowl. The concept let customers snap a picture of the pizza they were eating, and in exchange we gave them loyalty points toward a free pizza from Domino’s.

Part of the Domino’s data science team with one of its NVIDIA DGX-1 servers. (Photos courtesy of Domino’s)

The idea generated a lot of excitement within the organization early on, but no one was sure how to effectively recognize purchases and award points.

The data science team joined the conversation because we knew this could be a great AI application. We built a model using a deep learning neural network that classified pizza images.

We trained our model on an NVIDIA DGX system using more than 5,000 images of pizzas. A customer survey sent in response to the pictures helped automate some of the work of labeling this unique dataset. The system was even able to recognize unexpected images, such as plastic dog-toy pizzas.
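For readers who want a concrete starting point, here is a minimal transfer-learning sketch for a pizza / not-pizza image classifier. The architecture, dataset layout, and hyperparameters are assumptions for illustration; they are not the details of the model we shipped.

```python
# Illustrative transfer-learning sketch for a binary pizza / not-pizza classifier.
# Architecture, data layout, and hyperparameters are assumptions, not Domino's actual model.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Expects labeled folders such as data/train/pizza and data/train/not_pizza
train_data = datasets.ImageFolder("data/train", transform=transform)
loader = torch.utils.data.DataLoader(train_data, batch_size=64, shuffle=True)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = models.resnet50(pretrained=True)
model.fc = nn.Linear(model.fc.in_features, 2)  # replace the head with a 2-class output
model = model.to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

for epoch in range(5):
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

Starting from a pretrained backbone is what makes a few thousand labeled pizza photos enough to get a useful classifier.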

The response to this campaign was overwhelmingly positive. We got a lot of press and massive redemptions of coupons, so we knew our customers were embracing the campaign.

Serving up a superior AI system

We acquired two NVIDIA DGX-1 systems to accommodate our growing team. Our server-based approach provides an affordable way to train large, complex models and share knowledge across the team. 

Our team started training models on Windows desktops with single NVIDIA GPUs, but we knew that wouldn’t scale. We quickly realized local development can be more expensive than a server-based approach for large data science teams. What’s more, developing AI models across desktops with different configurations makes collaboration very difficult.

Using a DGX-1, we’ve seen training time for a single model drop from multiple days to multiple hours. The ability to quickly iterate and train multiple models simultaneously is like going from an Easy-Bake Oven to a Domino’s industrial pizza oven!

We adopted a strategy of training and deploying models on premises for two reasons: we feel it gives us tighter control over security and over how we dynamically scale our services.

Baking algorithms for better predictions

Making quick decisions is important when you need to deliver more than 3 billion pizzas a year — fast. So, Domino’s is exploring the use of AI for a host of applications, including more accurately predicting when an order will be ready.

We recently boosted accuracy from 75% to 95% for predictions of an order’s readiness. We used what we call a load-time model that factors in labor variables, order complexity and other operational factors.
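To make the idea concrete, here is a hypothetical sketch of how such a regressor could be framed: predict the minutes until an order is ready from operational signals. The feature names and the choice of gradient-boosted trees are assumptions for illustration only, not our production load-time model.

```python
# Hypothetical sketch of a "load time" regressor: predict minutes until an order is ready.
# Feature names and the gradient-boosted-tree approach are illustrative assumptions.
import pandas as pd
import xgboost as xgb
from sklearn.model_selection import train_test_split

# Each row is one historical order with operational features and the observed prep time.
df = pd.read_csv("orders.csv")
features = ["items_in_order", "staff_on_shift", "orders_in_queue", "hour_of_day", "is_weekend"]
X_train, X_test, y_train, y_test = train_test_split(df[features], df["minutes_to_ready"])

model = xgb.XGBRegressor(
    n_estimators=500,
    max_depth=8,
    tree_method="gpu_hist",   # train on the GPU; use "hist" on CPU-only machines
)
model.fit(X_train, y_train, eval_set=[(X_test, y_test)], verbose=False)
print("Predicted minutes to ready:", model.predict(X_test[:1]))
```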

The improvement has been well received and could be the basis for future ways to advance operator efficiencies and customer experiences, thanks in part to NVIDIA GPUs.

Domino’s does a very good job cataloging data in its stores. But until recently, we lacked the hardware to build a model large enough to handle all the load-time factors. At first, it took three days to train the load-time model, too long to make its use practical.

Once we had our DGX server, we could train an even more complicated model in less than an hour, thanks to a 72x speed-up. That let us iterate on the model’s design very quickly, adding new data and improving the model, which is now in production as version 3.0.

Accelerating predictions and queries

One of the next big steps for our team is tapping a bank of NVIDIA T4 GPUs to accelerate AI inference for all Domino’s tasks that involve real-time predictions.

Model latency is extremely important, so we’re building out an inference stack using T4 GPUs to host our AI models in production. We’ve already seen pretty extreme improvements, with latency down from 50 milliseconds to sub-10 ms.
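As a rough illustration of how we check numbers like that, here is a simple sketch for timing per-request latency of a GPU-hosted model. The model file and input shape are placeholders, and the real serving stack involves more than a timing loop.

```python
# Simple sketch for measuring per-request latency of a GPU-hosted model.
# The model file and batch shape are placeholders, not the production serving stack.
import time
import torch

device = torch.device("cuda")
model = torch.jit.load("model_scripted.pt").to(device).half().eval()  # FP16 on the T4
example = torch.randn(1, 32, device=device, dtype=torch.float16)      # one request's features

with torch.no_grad():
    for _ in range(10):                      # warm-up iterations
        model(example)
    torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(100):
        model(example)
    torch.cuda.synchronize()
    print("avg latency: %.2f ms" % ((time.perf_counter() - start) / 100 * 1000))
```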

Separately, we recently adopted BlazingSQL, open-source software that runs data science queries on GPUs. Migrating work to a new platform takes some doing, but the software eased the transition by supporting the APIs of our prior CPU-based tool while delivering better performance.
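For anyone curious what that looks like in practice, here is a minimal BlazingSQL sketch. The file path, table name, and query are placeholders rather than our actual pipeline.

```python
# Minimal BlazingSQL sketch: register a table and run SQL on the GPU.
# File path, table name, and query are placeholders, not Domino's actual pipeline.
from blazingsql import BlazingContext

bc = BlazingContext()
bc.create_table("orders", "orders.parquet")   # can also take a cuDF DataFrame

# The result comes back as a cuDF DataFrame, so follow-up work stays on the GPU.
gdf = bc.sql("""
    SELECT store_id, AVG(minutes_to_ready) AS avg_ready_time
    FROM orders
    GROUP BY store_id
""")
print(gdf.head())
```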

This new GPU-accelerated data science platform is delivering an average 10x speed-up across all use cases in the part of the AI process that involves building datasets. In the past, some of the data-cleaning and feature-engineering operations might have taken 24 hours. Now we do them in less than an hour.

Here at Domino’s, we’re just getting started with AI. We’re making an impact by translating analytics insights into actions our stores can take that are meaningful for our business.

The most vital element in this work is collaboration. AI reaches its full potential when data science teams form a user group that shares Docker file specs, data connectors, and GPU management code. All of these things help the team put out quality products faster.
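As one small example of the kind of GPU management code worth sharing, here is a sketch of a helper that picks the least-loaded GPU before launching a training job. It uses NVIDIA’s pynvml bindings and is illustrative rather than our actual tooling.

```python
# Sketch of a shared GPU-management helper: pick the GPU with the most free memory.
# Illustrative only, not Domino's actual tooling.
import pynvml

def pick_freest_gpu():
    pynvml.nvmlInit()
    best_index, best_free = 0, -1
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        free_bytes = pynvml.nvmlDeviceGetMemoryInfo(handle).free
        if free_bytes > best_free:
            best_index, best_free = i, free_bytes
    pynvml.nvmlShutdown()
    return best_index

if __name__ == "__main__":
    print("Least-loaded GPU index:", pick_freest_gpu())
```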

Collaboration should extend outside your company. From my experience, the experts at NVIDIA keep up with cutting-edge tools and methods and have a very good grasp of how companies are using their products. 

Hungry for more?

If I’ve managed to whet your appetite and you’d like to learn more, I encourage you to check out the presentation I’ll be giving about Domino’s experience with AI at NVIDIA’s GTC Digital conference. Registration is free, and it gives you access to dozens of live webinars and libraries of on-demand content detailing the experiences of those on the front lines implementing AI and other cutting-edge technologies.

As you see, I’m not alone. There’s a growing community of data scientists who know AI is ready to deliver real business successes. If you’re ready to join us, I’m confident AI can deliver results for your business.