Another day, another delay. This time it looks like it's Nvidia's Blackwell AI GPUs that may have fallen victim to last-minute delay-itis, as reports now suggest they'll be launching in the first half of 2025 instead.
While engineering samples for the new chips have already been delivered, it seems enterprise customers including Microsoft, Meta and xAI might have to wait for their orders. Two anonymous sources who worked on the chips reportedly first broke news of the pushback, which was later allegedly corroborated by an anonymous Microsoft source.
"People keep asking about Blackwell delays. We already sent out our update 2 weeks ago, July 22nd, to our hyperscaler, semiconductor, and investor clients. Sell side and media are quite late. They still have multiple things wrong too. Volumes, ASP, etc below." https://t.co/frahauIDRt pic.twitter.com/Oap8CSdZBr
Whatever the cause, any significant delay in shipping could potentially shake confidence in Nvidia's ability to deliver its costly AI-processing GPUs en masse and on time.
That being said, Nvidia is unlikely to be worried that any significant delay will result in lost customers. While AMD has its own competing AI GPU, the Instinct MI300, AMD CEO Dr. Lisa Su has recently cautioned that "the overall supply chain is tight and will remain tight through 2025."
Given Nvidia's dominance in the sector and the reported capabilities of the Blackwell series AI GPUs, it's unlikely any major customer will switch over to other hardware providers in the face of a potential delay, especially if the top candidates are in the same short-supply boat.
Once again, it seems, Nvidia holds the cards here, and if it needs to reconfigure to keep up with demand, the rest of the world's top tech companies may just have to stand in line and wait. Or perhaps there really are design issues with Nvidia's latest AI hardware holding things back.
Still, at least it doesn't seem to be affecting Nvidia's gaming GPUs, ey?