Byte-Sized
NVIDIA Blackwell GPUs Ship to Major Cloud Providers
The next generation of AI training and inference hardware begins rolling out to hyperscalers.
2026-02-06
NVIDIA has begun shipping its Blackwell B200 GPUs to major cloud providers, including AWS, Azure, and Google Cloud. According to NVIDIA's own figures, the Blackwell architecture delivers up to 4x the training performance and up to 30x the inference performance of the Hopper-generation H100, while consuming significantly less energy per operation. Lower per-operation costs for training and serving large AI models could make advanced AI capabilities more accessible to smaller organizations.