TPU

Google announces the Cloud TPU v5p, its most powerful AI accelerator yet

Google today announced the launch of its new Gemini large language model (LLM), and with that, the company also launched its new Cloud TPU v5p, an updated version of its Cloud TPU v5e…

Google Cloud announces the 5th generation of its custom TPUs

At Cloud Next, its annual user conference, Google Cloud today announced the launch of the fifth generation of its tensor processing units (TPUs) for AI training and inferencing…

Google launches a 9 exaflop cluster of Cloud TPU v4 pods into public preview

At its I/O developer conference, Google today announced the public preview of a full cluster of Google Cloud’s new Cloud TPU v4 Pods. Google’s fourth iteration of its Tensor Processing Unit…

Google launches the next generation of its custom AI chips

At its I/O developer conference, Google today announced the next generation of its custom Tensor Processing Unit (TPU) AI chips. This is the fourth generation of these chips…

Google’s newest Cloud TPU Pods feature over 1,000 TPUs

Google today announced that its second- and third-generation Cloud TPU Pods — its scalable cloud-based supercomputers with up to 1,000 of its custom Tensor Processing Units — are now publicly available…

Google is making a fast specialized TPU chip for edge devices and a suite of services to support it

In a pretty substantial move toward owning the entire AI stack, Google today announced that it will be rolling out a version of its Tensor Processing Unit — a custom chip optimized for its machine learning…

Google announces a new generation for its TPU machine learning hardware

As the war over customized AI hardware heats up, Google announced at Google I/O 2018 that it is rolling out the third generation of its silicon, the Tensor Processing Unit 3.0. Google CEO Sundar Pichai…

Google says its custom machine learning chips are often 15-30x faster than GPUs and CPUs

It’s no secret that Google has developed its own custom chips to accelerate its machine learning algorithms. The company first revealed those chips, called Tensor Processing Units (TPUs), at its I/O developer conference…