What do you do when you rapidly become one of the most important chip manufacturers in the world and your stock price more than triples in a single year? If you're Nvidia, you throw a massive keynote stuffed with announcements, setting the stage for a suite of products built around your core technology, GPUs, that will keep you at the center of the conversation around artificial intelligence.
Nvidia has become the beneficiary of a major shift in the computational requirements of artificial intelligence and deep learning, which rely on GPUs and other alternative processors to handle a different style of workload. That technology is needed for everything from sifting through massive piles of data to training the machine learning models that power the sensing technology in self-driving cars. As a result, Nvidia has, along with Amazon, become a centerpiece of the technology industry in the span of a year or so, and at CES this year.
All of this can pretty much be summed up in a chart:
At CES, Nvidia, a company typically known for more subdued events (at least according to my coworkers), threw one of the larger press conferences last night. Here are the headlines from the event:
A lot of these announcements are geared toward consumers; the Facebook Live announcement, for example, included a launch date for Mass Effect: Andromeda. But what's more significant is that Nvidia has rapidly become a backbone for the kind of computing power developers are demanding as they work to make their apps smarter, faster and more personalized. Those apps will have to help sensors rapidly detect minute changes in the environment around them, like small shifts in a person's facial expression.
Beyond developers, Nvidia's products are going to be in demand from auto makers, like Audi, as the industry finds itself in a race toward fully autonomous driving. The idea of riders being completely disconnected from the driving experience does seem like it's a ways away (that's what the head of the Toyota Research Institute said earlier in the day), but Nvidia has spent years making this kind of technology its core competency, having started out building cards that give gamers better performance.
All this pent-up demand is more or less reflected in the growth of the company's revenue. Here's what we've seen over the past several quarters (Nvidia runs on an unusual fiscal calendar, so the dates may look a bit off):
Amazon, too, has seen the need to incorporate GPU processing into its cloud services as developers flock to AWS, which has become one of the company's most successful businesses. Amazon recently rolled out an option that lets developers tap just the incremental GPU processing their services need to operate, signaling that even lightweight developers will need this kind of technology as everyone races to build some degree of artificial intelligence into their services.
As large-scale companies continue to accumulate more and more data on their users' behavior, and as applications like autonomous driving become more robust and widespread, there will be an even greater need for next-generation processors that can handle those kinds of computations.
And, indeed, it will be a race. Already there are startups popping up that are attracting intense investor interest. TechCrunch recently learned that Cerebras, a stealth startup with significant backing from Benchmark and others, is working on deep learning processors and raised around $25 million. And there’s no doubt that major manufacturers will be racing to produce the next generation of GPU processors.
Nvidia has spent much of its life being synonymous with gaming. The best computers have always included the best graphics cards to ensure that resource-intensive games run responsively and at smooth frame rates. But now that the technology is in demand beyond that initial use case (and Nvidia seems nowhere near abandoning it), the company is finding itself at the center of attention both at CES and beyond.