Blinded by the speed of change

My grandfather lived through an incredible period of technological change. He saw the invention of the automobile, the airplane and the rocket. He lived through the dawn of the atomic age and the mainframe computer. He didn’t live long enough to see the PC or the impact it would have on my professional life, but he was around for the creation of a lot of the technology that laid the foundation for what’s happening today.

I’ve been at this for a long time myself. I remember working on an early IBM PC. Later, I accessed the text-based internet via a 300-baud modem. I can recall the earliest days of the World Wide Web.

My first cell phone was a Motorola brick phone. My first iPhone was the 3. The bottom line is that I've seen a lot of technological change, and I've never seen anything like what we're seeing these past months, weeks and days.

Consider for a moment that ChatGPT, built on GPT-3.5, took the world by storm in December. Last week, while I was on vacation, OpenAI released GPT-4, which the company unabashedly called "state of the art." This week, we saw the announcement of ChatGPT plug-ins that connect the chatbot to the live internet and to useful tools like Expedia, WolframAlpha and so many others, suddenly accelerating generative AI in new and exciting directions.

All of this is happening with stunning speed. It feels like we're living through an inflection point, much like we saw with the first IBM PC, the internet, the web and the iPhone. But this moment of change is happening so fast, we barely have time to process the latest twist before the next iteration comes flying down the chute.

As with those earlier moments — the advent of personal computing, connected computing and mobile computing — you know that something huge is happening, but it's not yet clear what it will become. We know there is an exciting new technology that can change the way we interact with computers, but we can't yet see how that will play out, any more than we knew in the earliest days of the web or the smartphone how they would transform our lives.

On a panel last week led by Docker CEO Scott Johnston, Ilan Rabinovich, SVP of product and community at Datadog, talked about the similarities between what we’re seeing now and the early days of the internet.

“I feel like we’re like at that moment where the internet was just born, and they’re like, we have a network of interconnected machines, and we’ll share data, but what does that mean? Did we think that Amazon was going to be shipping me same-day boxes based on me clicking a button on a website? Probably not. And so now we have all of these industries that only exist because of the internet,” Rabinovich said.

But while the pace of change in AI feels especially quick today, it's built on years of prior effort. As one person pointed out to me, what we're seeing now is the culmination of decades of research and development finally meeting sufficient compute resources in a single moment, suddenly revealing the true potential of AI.

What's different about this iteration is that it's not just for businesses and scientists; it's being delivered in a way that could have a direct impact on our daily lives, much as the World Wide Web and search engines did in the 1990s and early 2000s.

But the speed at which each new iteration is being rolled out to consumers could be an issue. Unlike the web, which evolved more gradually over years, generative AI is screaming along at a bonkers pace, and as we have seen, this technology is still very much a work in progress, no matter how fast, cool and slick it may feel at the moment.

We often hear we are in "the early days of X," but truer words were never spoken about this technology. Even as this tech is blinding us with science, it's worth keeping in mind that we have lots of questions to answer and many miles to go before generative AI is truly part of our daily routines, and this is especially true in the enterprise.

Speaking on the same Docker panel, Craig McLuckie, who helped develop Kubernetes and is currently an entrepreneur in residence at Accel, said part of the challenge will be making this technology accessible to more people and organizations.

“Training those models, running those models can be expensive, but the thing that we as a community could continue to focus on is really what can we do to reduce the barriers to entry so that it’s not only a very small number of exclusive organizations that have access to these incredibly powerful tools. How can we continue to democratize it both from an operational expertise perspective, but also just from a resource consumption perspective,” he said.

To give you a sense of how fast things are moving, Ali Ghodsi, CEO at Databricks, announced a low-cost, open-source ChatGPT-like model just last week that could help solve at least some of the problems that McLuckie was describing.

May Habib, co-founder and CEO at generative AI startup Writer, says that's precisely what her company is trying to do. It's attacking the writing problem by giving companies a set of tools they can use out of the box and train on internal data.

“The OpenAI API and ChatGPT help people imagine what’s possible, but to actually put it into production, get it accurate, get it using a company’s own information, in the company voice, and in their style and their formatting: They need [to do all the ground] work. And they’re not going to go out and hire a team of 20 [machine learning] engineers, computational linguists and researchers to be able to do that,” she said. Instead, they use a product like Writer to get them going with this technology, and that’s the opportunity for startups right now.

It’s going to take more tools like this one to get us there, especially in business. As Habib said, ChatGPT has opened our eyes to the possibilities of this technology, perhaps faster than we could have ever dreamed, but it’s fair to say that it’s not yet part of many workers’ daily routines.

That could change quickly, though, because the speed of technological advancement we are currently experiencing is truly remarkable, if we could just stop long enough to process it.