Nvidia’s New AI-Targeted Chips To Cost Up To $40,000 Each
Huang disclosed that each unit of the next-generation AI chip will be priced between $30,000 and $40,000, a significant sum reflecting the technology packed within.
Highlighting the effort that went into its development, Huang said Nvidia had to pioneer new technologies to bring Blackwell to fruition. Holding up a Blackwell chip, he revealed that the company had spent roughly $10 billion on research and development.
The anticipated price range places Blackwell in a similar bracket to its predecessor, the H100, built on the Hopper architecture, which sold for $25,000 to $40,000 per chip, according to industry analysts. The Hopper series, introduced in 2022, marked a notable escalation in Nvidia’s AI chip pricing compared to earlier generations.
Nvidia has released a new generation of AI chips roughly every two years, each faster and more energy-efficient than the last, and it uses the momentum of each launch to secure orders for its latest GPUs. Blackwell, the newest addition to the lineup, combines two dies in a single package and has a larger physical footprint than its predecessors, promising even greater performance and efficiency.
Demand for Nvidia’s AI chips has driven the company’s remarkable financial performance, with quarterly sales roughly tripling since the AI boom began in late 2022, catalyzed by the release of OpenAI’s ChatGPT. Many leading AI firms and developers have relied on Nvidia’s H100 to train their models; Meta, for instance, has publicly announced plans to procure hundreds of thousands of H100 GPUs.
Nvidia does not publish list prices for its chips, instead using tiered pricing that depends on factors such as volume and procurement method. The company offers a range of configurations tailored to clients like Meta and Microsoft, and pricing and packaging vary depending on whether chips are bought directly from Nvidia or through third-party vendors such as Dell, HP, or Supermicro, which build AI servers housing multiple GPUs.
In a recent development, Nvidia unveiled three variants of the Blackwell AI accelerator: the B100, B200, and GB200, the last of which pairs two Blackwell GPUs with an Arm-based CPU. The variants differ slightly in memory configuration and are slated for release later this year, further expanding Nvidia’s offerings in AI acceleration.
Read: Nvidia On The Way To Overtake Apple As The Second Most Valuable Entity