Nvidia is introducing a new top-of-the-line chip for AI work, the HGX H200. The new GPU upgrades the wildly in-demand H100 with 1.4x the memory bandwidth and 1.8x the memory capacity, improving its ability to handle intensive generative AI workloads. The big question is whether companies will be able to get their hands on the new chips or whether they'll be as supply constrained as the H100 — and Nvidia doesn't quite have an answer for that.
Nvidia says cloud providers won't need to make any changes as they add H200s into the mix. The cloud arms of Amazon, Google, Microsoft, and Oracle will be among the first to offer the new GPUs next year. Once they launch, the new chips are sure to be expensive. Nvidia doesn't list a price, but CNBC reports that the prior-generation H100s are estimated to sell for anywhere from $25,000 to $40,000 each, with thousands of them needed to operate at the highest levels.