Internet giant Google unveiled its AI Hypercomputer at its Next ’25 event. The supercomputing system is designed to simplify AI deployments, improve performance, and optimize costs.
Ironwood TPUs, among other components, provide the AI Hypercomputer’s computing power. Google’s seventh-generation TPU (Tensor Processing Unit) is its largest and most powerful to date; according to the internet giant, it delivers ten times the peak performance of its previous high-performance TPU.
42.5 ExaFLOPS
With over 9,000 chips per pod, Ironwood delivers 42.5 ExaFLOPS of computing power per pod. According to Google, this makes the system 24 times more powerful than today’s fastest supercomputer, at least in terms of AI computing power. The tech giant says this should meet the exponentially growing demands of the most advanced thinking models, such as Gemini 2.5.
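The per-chip share of that pod-level figure is easy to back out. A minimal sketch, assuming the full-pod configuration is 9,216 chips (the article says only "over 9,000"; the exact count here is an assumption):

```python
# Back-of-the-envelope: per-chip performance from the reported pod figure.
pod_exaflops = 42.5      # reported pod-level performance (source)
chips_per_pod = 9216     # assumed full-pod chip count ("over 9,000" per the article)

# Convert ExaFLOPS (1e18) to PetaFLOPS (1e15) per chip.
per_chip_pflops = pod_exaflops * 1e18 / chips_per_pod / 1e15
print(round(per_chip_pflops, 2))  # → 4.61
```

Under that assumption, each Ironwood chip would contribute roughly 4.6 PetaFLOPS to the pod total.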
The AI Hypercomputer also comes with extensive GPU options, such as the A4 and A4X VMs powered by Nvidia B200 and GB200 GPUs. Google will also reportedly be the first cloud provider to offer Nvidia’s next-generation Vera Rubin GPUs (named after the American astronomer).
Google Distributed Cloud
Another announcement concerned Google Distributed Cloud (GDC), which brings Google models to on-premises environments. “We are working with Nvidia to make Gemini available on their Blackwell systems, with Dell as a key partner. This allows Gemini to be deployed locally, in both air-gapped and connected environments,” the company said.
The speed of Top500 supercomputers and that of AI supercomputing systems are measured differently, making direct comparison difficult. The Top500 ranks systems on 64-bit floating-point (FP64) performance via the Linpack benchmark. For AI supercomputing, such as Google’s Ironwood TPUs, the figures refer to FP8 (or sometimes even FP4) calculations. These lower-precision formats are far less demanding on the compute infrastructure, which is intended to train AI efficiently, but they are unsuitable for the workloads in which Top500 systems excel, such as weather forecasting, simulations, and scientific research.
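The precision gap behind this caveat can be made concrete. A minimal sketch, using FP16 (available in Python's standard library via `struct`) as a stand-in for the even narrower FP8 formats, next to FP64:

```python
import struct

def roundtrip_fp16(x: float) -> float:
    """Round-trip a value through IEEE half precision (struct format 'e')."""
    return struct.unpack('e', struct.pack('e', x))[0]

# FP16 has an 11-bit significand: integers above 2048 can no longer
# all be represented, so 2049 rounds to the nearest even step, 2048.
print(roundtrip_fp16(2048.0))  # → 2048.0
print(roundtrip_fp16(2049.0))  # → 2048.0

# FP64 (Python's native float) keeps integers exact up to 2**53.
print(float(2**53) == float(2**53 + 1))  # → True: precision runs out only here
```

Lower-precision formats trade exactly this kind of accuracy for throughput, which is acceptable for AI training but not for Linpack-style scientific computation.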