Nvidia had already sold tens of thousands of Blackwell GPUs before the next-generation architecture was even announced.


Nvidia unveiled its latest GPU architecture, Blackwell, at GTC, promising major upgrades for AI inference and offering some hints at the next generation of gaming graphics cards. Alongside the announcement, major tech companies revealed plans for thousands of Blackwell-powered systems.

AWS, Amazon's data center division, announced that it will deploy Nvidia's Grace Blackwell Superchips (two Blackwell GPUs and a Grace CPU on a single board) in EC2. The company has also already agreed to provide 20,736 GB200 Superchips (a total of 41,472 Blackwell GPUs) for Project Ceiba, an AI supercomputer on AWS that Nvidia is using for internal AI R&D. Admittedly, that is a roundabout way for Nvidia to buy its own products, but other examples show just how big the demand for these chips is right now.

Google has said it is jumping on the Blackwell train and will offer Blackwell in its own cloud services, including via the GB200 NVL72 system, each of which combines 72 Blackwell GPUs and 36 Grace CPUs. The number of Blackwell GPUs the well-funded company has contracted for is not yet known, but given its ambition to compete with OpenAI on AI systems, it will probably be quite a few.
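As a quick sanity check on the figures above, the GPU totals follow directly from the board configurations: a GB200 Superchip pairs two Blackwell GPUs with one Grace CPU, and an NVL72 system combines 36 such superchips. A minimal sketch of that arithmetic (the helper name `gpus` is our own, purely illustrative):

```python
# Each GB200 Superchip pairs 2 Blackwell GPUs with 1 Grace CPU.
GPUS_PER_SUPERCHIP = 2

# An NVL72 system combines 36 GB200 Superchips,
# giving 72 Blackwell GPUs and 36 Grace CPUs per system.
SUPERCHIPS_PER_NVL72 = 36

def gpus(superchips: int) -> int:
    """Blackwell GPU count for a given number of GB200 Superchips."""
    return superchips * GPUS_PER_SUPERCHIP

# AWS's Project Ceiba order: 20,736 Superchips.
print(gpus(20_736))                # 41472 GPUs, matching the figure above
print(gpus(SUPERCHIPS_PER_NVL72))  # 72 GPUs in a single NVL72 system
```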

Oracle, known to those of a certain age for Java and more recently for Oracle Cloud, has disclosed the exact number of Blackwell GPUs it will purchase from Nvidia: 40,000 in total. A portion of the order is earmarked for Oracle's OCI Supercluster and OCI Compute services.

Microsoft has been tight-lipped about the exact number of Blackwell chips it will buy, but it is pouring money into OpenAI and into its own AI efforts on Windows (to mixed reactions), so expect big spending here as it works to bring Blackwell to Azure. Details on Blackwell's rollout and availability are still scarce; as far as we know, Nvidia sold large quantities of the architecture as soon as it was announced. The exact release date and even the chip specifications are still up in the air. As is usual with enterprise chips like this, we don't even have the white paper in hand, yet hundreds of thousands of GPUs have already been sold to the highest bidders, including Meta, OpenAI, xAI, and others cited in Nvidia's GTC announcement.

This level of demand for Nvidia's chips is not entirely surprising. As just one example, Meta announced earlier this year that it was aiming for 350,000 H100s by the end of the year. One wonders, however, how Blackwell and the B200 will affect Meta's plans; it will be some time before Nvidia reaches full production of Blackwell, and even longer before it can meet the needs of the exponentially growing AI market.
