How Amazon EC2 grew from a notion into a foundational element of cloud computing

The beta launched 15 years ago this week

Fifteen years ago this week, on August 25, 2006, AWS turned on the very first beta instances of EC2, its cloud-based virtual computers. Today cloud computing, and more specifically infrastructure as a service, is a staple of how businesses use computing, but at that moment it wasn’t a well-known or widely understood concept.

The EC in EC2 stands for Elastic Compute (the full name is Elastic Compute Cloud), and that name was chosen deliberately. The idea was to provide as much compute power as you needed to do a job, then shut it down when you no longer needed it, stretching and shrinking like an elastic band. The EC2 beta was preceded by the beta release of S3 storage about six months earlier, and the two services marked the starting point of AWS’ cloud infrastructure journey.
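To make that elasticity concrete, here is a minimal sketch using boto3, the modern Python SDK for AWS, which of course didn’t exist in 2006 (the original beta was driven by SOAP APIs and command-line tools). The AMI ID and instance type are placeholders, not recommendations.

```python
# Minimal sketch of EC2's "elastic" model: rent compute when you need it,
# hand it back when you're done. Uses the modern boto3 SDK; the AMI ID and
# instance type below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Spin up a machine only when there is work to do...
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}")

# ...then release the capacity once the job is finished.
ec2.terminate_instances(InstanceIds=[instance_id])
```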

It’s hard to overstate what Amazon accomplished with these moves. The company anticipated an entirely different way of computing, and in the process created both a market and a substantial side business. It took vision to recognize what was coming, and the courage to forge ahead and invest the resources necessary to make it happen, something every business could learn from.

The AWS origin story is complex, but at its core it was about bringing the IT power behind Amazon’s own business to others. Amazon at the time was not the company it is today, but it was still substantial, and it still had to deal with massive traffic fluctuations, such as on Black Friday, when its website would be flooded for a brief but intense period. While the goal of an e-commerce site, and indeed every business, is to attract as many customers as possible, keeping the site up under that kind of stress takes some doing, and Amazon was learning how to do it well.

Those lessons, and a desire to bring the company’s internal development processes under control, would eventually lead to what we know today as Amazon Web Services, and that side business would help fuel a whole generation of startups. We spoke to Dave Brown, who is VP of EC2 today and who helped build the first versions of the technology, to find out how this technological shift went down.

Sometimes you get a great notion

The idea behind AWS dates back to around 2000, when the company began looking at creating a set of services to simplify how it produced software internally. Eventually, it developed a set of foundational services that every developer could tap into: compute, storage and database.

But the idea of selling that set of services really began to take shape at an executive offsite at Jeff Bezos’ house in 2003. A 2016 TechCrunch article on the origins of AWS described how it started to come together:

As the team worked, Jassy recalled, they realized they had also become quite good at running infrastructure services like compute, storage and database (due to those previously articulated internal requirements). What’s more, they had become highly skilled at running reliable, scalable, cost-effective data centers out of need. As a low-margin business like Amazon, they had to be as lean and efficient as possible.

They realized that those skills and abilities could translate into a side business, one that would eventually become AWS. It took a while to put these initial ideas into action, but by December 2004, the company had opened an engineering office in South Africa to begin building what would become EC2. As Brown explains it, Amazon was looking to expand outside of Seattle at the time, and Chris Pinkham, a director in those days, hailed from South Africa and wanted to return home.

Brown was hired right around this time and went to work with Pinkham and a team of 14 people in a small Cape Town office on what was then known as the “Amazon Execution Engine.” Brown says Amazon executives saw this as a potentially decent business, but nothing at the scale they are seeing today as AWS approaches a $60 billion annual run rate.

“When I joined, I think there were like 14 of us in the Cape Town office working on EC2, bringing in some of the very early preview customers at the time and telling us what they wanted. [ … ] We often had leaders come to the office and talk about how AWS could be a billion dollar business one day, and that just seemed so far away for us to ever get to that. It just seemed crazy,” he said.

Getting down to business

Brown says that when it came to building a workable solution, they just rolled up their sleeves and got down to it. As they do today, they spoke to customers, tried to give them what they needed and iterated on the design as they went. It took 21 months from the time the office opened to the launch of the beta, and it was an enormous engineering challenge to take the technology available in that period and turn it into flexible compute resources anyone could buy with a credit card.

One way they did that was by emulating what a rack could look like in the data center they were building in Virginia to house the project. This was long before any racks had been ordered or built, so they had to fake it. “In fact, in the early days of building EC2, we were iterating so quickly on various prototypes that we ended up mocking a data center rack with a stack of laptops in the corner of the Cape Town office,” he said.


Dave Brown, VP of EC2 at AWS. Image Credits: Amazon

While VMware had built its commercial hypervisor a few years earlier, the EC2 team decided to go with an open source alternative, the Xen hypervisor, which they found more flexible for their purposes, even though making virtualization perform efficiently was a real challenge at the time. They would eventually move on from Xen years later, but it enabled them to get the product off the ground.

“We actually chose Xen as our hypervisor, and just used that as the underlying technology. There were a number of challenges with virtualization. Many people loved what it could do, but from a performance point of view, there were challenges with getting the same performance that you would get out of bare metal,” he said. Bare metal refers to a single-tenant server running one operating system directly on the hardware. With cloud computing, multiple tenants typically share the same physical server, each running their own operating system inside a separate virtual machine.
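If you are curious what virtualization stack your own instances report today, the DescribeInstances API exposes it. A quick boto3 sketch (as of this writing, the API’s Hypervisor field reports “xen” for both Xen- and Nitro-based instances):

```python
# Small sketch: print what DescribeInstances reports about virtualization
# for each running instance in a region.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

for reservation in ec2.describe_instances()["Reservations"]:
    for instance in reservation["Instances"]:
        print(
            instance["InstanceId"],
            instance.get("Hypervisor"),          # e.g. "xen"
            instance.get("VirtualizationType"),  # "hvm" or "paravirtual"
        )
```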

Expecting to scale

Over the almost two-year period they spent building EC2, they overcame a slew of challenges and got a beta product up and running on August 25, 2006. While it was a big moment, there would be more problems ahead, especially when it came to scaling, which was and remains one of the primary value propositions of cloud computing.

“I always talk about this idea of ‘the illusion of infinite capacity,’ and what we’re shooting for in the cloud is that anytime a customer comes along and says I need a machine, we never say no to them. And that’s incredibly difficult to scale. And so it means that we have to have really good algorithms so they manage our supply chain really well, as well,” Brown said.

That was the goal, but it didn’t always go as planned, and they were often putting out fires in the early days. He tells the story of how they almost ran out of capacity early on and had to make use of a bunch of test machines to keep the operation going. “Peter DeSantis, who was leading our cloud businesses at the time (and today runs worldwide infrastructure for AWS), ran out of his office and asked everyone on the team to shut down any [test] instances we were using. We managed to find about 70 [available servers] and that was good enough to get us through the next few days when the next group of racks landed,” Brown explained.
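Brown’s “never say no” framing boils down to keeping a healthy buffer of free capacity and ordering more hardware long before you run out. The toy sketch below is purely illustrative of that idea, with made-up numbers and thresholds; it is not AWS’s actual capacity-planning logic.

```python
# Toy illustration of the "illusion of infinite capacity": keep enough
# headroom that an allocation request never has to be refused, and trigger
# procurement well before the buffer is exhausted. Hypothetical example only.
from dataclasses import dataclass

@dataclass
class Region:
    total_slots: int            # servers racked and powered on
    used_slots: int = 0         # slots handed out to customers
    buffer_ratio: float = 0.15  # keep at least 15% free (made-up threshold)

    @property
    def free_slots(self) -> int:
        return self.total_slots - self.used_slots

    def needs_more_racks(self) -> bool:
        # Order hardware before free capacity dips below the buffer.
        return self.free_slots < self.total_slots * self.buffer_ratio

    def allocate(self, count: int) -> bool:
        # The whole point is to never hit this refusal branch in practice.
        if count > self.free_slots:
            return False
        self.used_slots += count
        return True

region = Region(total_slots=1000, used_slots=820)
print(region.allocate(50))        # True: the request is served
print(region.needs_more_racks())  # True: time to land the next racks
```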

They would get better and better at managing capacity, of course, but the early days brought a constant stream of problems to solve. It took two years to reach a million instances, a major milestone for the group. Just two months later, they hit 2 million, and a couple of months after that, 3 million. Brown says they stopped counting after that; today AWS delivers 60 million instances every single day.

They got the vision right, and the technology matured along the way. Formidable competitors like Microsoft and Google eventually entered the fray. Brown considers himself fortunate to have been along for the entire 15-year journey, and to be part of a team building technology that has had such a huge impact on computing.

Today, hardly any startup launching runs its own infrastructure unless its solution has a unique requirement that demands it, and the flexibility of cloud infrastructure means these young companies can concentrate on their core competencies and build the business instead of racking and stacking servers and worrying about budgeting for capacity. That is a huge advantage, and one that wasn’t really possible at scale before AWS started this business.

AWS was first to market and has maintained that first-mover advantage all these years. Today the company has around 33% market share, according to Synergy Research, a percentage that has remained remarkably steady over the years, even as the market keeps growing. In its most recent earnings report, Amazon said AWS generated over $14 billion in revenue for the quarter, a far cry from the $1 billion business the company was envisioning when it launched EC2 15 years ago.