Nvidia launches H200 AI superchip: the first to be paired with 141GB of current HBM3e memory

Last updated 13 months ago


Why it matters: The generative AI race shows no signs of slowing down, and Nvidia is looking to fully capitalize on it with the introduction of a new AI superchip, the H200 Tensor Core GPU. The biggest improvement over its predecessor is the use of HBM3e memory, which allows for greater density and higher memory bandwidth, both important factors in improving the speed of services like ChatGPT and Google Bard.

Nvidia this week introduced a new monster processing unit for AI workloads, the HGX H200. As the name suggests, the new chip is the successor to the wildly popular H100 Tensor Core GPU that debuted in 2022, just as the generative AI hype train started picking up speed.

Team Green announced the new platform during the Supercomputing 2023 conference in Denver, Colorado. Based on the Hopper architecture, the H200 is expected to deliver almost double the inference speed of the H100 on Llama 2, a large language model (LLM) with 70 billion parameters. The H200 also offers around 1.6 times the inference speed when running the GPT-3 model, which has 175 billion parameters.

Some of these performance improvements come from architectural refinements, but Nvidia says it has also done significant optimization work on the software front. This is reflected in the recent release of open-source software libraries like TensorRT-LLM, which can deliver up to 8 times higher performance and up to 6 times lower power consumption when running the latest LLMs for generative AI.
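
For context, the snippet below is an illustrative sketch of what running a model through TensorRT-LLM's high-level Python API looks like in recent releases; it is not taken from Nvidia's announcement, and the model name is a placeholder.

```python
# Illustrative sketch of TensorRT-LLM's high-level LLM API (recent releases).
# The model identifier is a placeholder; engine building and GPU setup are
# handled internally by the LLM class.
from tensorrt_llm import LLM, SamplingParams

prompts = ["Explain why memory bandwidth matters for LLM inference."]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95)

llm = LLM(model="meta-llama/Llama-2-7b-hf")  # placeholder model name
for output in llm.generate(prompts, sampling_params):
    print(output.outputs[0].text)
```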

Another highlight of the H200 platform is that it's the first to use the faster HBM3e memory spec. The new Tensor Core GPU's total memory bandwidth is a whopping 4.8 terabytes per second, a good bit faster than the 3.35 terabytes per second achieved by the H100's memory subsystem. Total memory capacity has also increased, from 80 GB on the H100 to 141 GB on the H200 platform.
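
For a quick sense of the generational uplift, here is a minimal back-of-the-envelope calculation using only the quoted specs above (not measured values):

```python
# Back-of-the-envelope comparison of the quoted H100 and H200 memory specs.
h100 = {"bandwidth_tb_s": 3.35, "capacity_gb": 80}
h200 = {"bandwidth_tb_s": 4.80, "capacity_gb": 141}

bandwidth_gain = h200["bandwidth_tb_s"] / h100["bandwidth_tb_s"]  # ~1.43x
capacity_gain = h200["capacity_gb"] / h100["capacity_gb"]         # ~1.76x

print(f"Memory bandwidth: {bandwidth_gain:.2f}x the H100")
print(f"Memory capacity:  {capacity_gain:.2f}x the H100")
```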

Nvidia says the H200 is designed to be compatible with the same systems that support the H100 GPU. That said, the H200 will be available in several form factors, including HGX H200 server boards in four- and eight-way configurations, or as a GH200 Grace Hopper Superchip, where it is paired with a powerful 72-core Arm-based CPU on the same board. An eight-way HGX H200 provides up to 1.1 terabytes of aggregate high-bandwidth memory and 32 petaflops of FP8 performance for deep-learning applications.
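
Those aggregate figures follow directly from the per-GPU numbers. As a rough sanity check (note that the ~4 petaflops of FP8 per GPU is an assumed figure, not one stated above):

```python
# Rough sanity check of the eight-way HGX H200 aggregate figures.
# The per-GPU FP8 throughput (~4 PFLOPS) is an assumption; the article
# only quotes the aggregate numbers.
gpus_per_board = 8
hbm3e_per_gpu_gb = 141
fp8_pflops_per_gpu = 4.0  # assumed per-GPU figure

aggregate_memory_tb = gpus_per_board * hbm3e_per_gpu_gb / 1000  # ~1.1 TB
aggregate_fp8_pflops = gpus_per_board * fp8_pflops_per_gpu      # 32 PFLOPS

print(f"Aggregate HBM3e capacity: ~{aggregate_memory_tb:.1f} TB")
print(f"Aggregate FP8 compute:    ~{aggregate_fp8_pflops:.0f} PFLOPS")
```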

Just like the H100 GPU, the new Hopper superchip will be in high demand and command an eye-watering price. A single H100 sells for an estimated $25,000 to $40,000 depending on order volume, and many companies in the AI space are buying them by the thousands. This is forcing smaller businesses to partner up just to get limited access to Nvidia's AI GPUs, and lead times don't seem to be getting any shorter as time goes on.

Speaking of lead times, Nvidia makes a massive profit on every H100 sold, so it has even shifted some production away from the RTX 40 series toward making more Hopper GPUs. Nvidia's Kristin Uchiyama says supply won't be an issue because the company is continuously working on adding more manufacturing capacity, but she declined to provide further details on the matter.

One thing is for certain: Team Green is far more interested in selling AI-focused GPUs, as sales of Hopper chips make up an increasingly large chunk of its revenue. It's even going to great lengths to develop and manufacture cut-down versions of its A100 and H100 chips just to get around US export controls and ship them to Chinese tech giants. This makes it hard to get too excited about the upcoming RTX 4000 Super graphics cards, as availability may be a major factor in their retail pricing.

Microsoft Azure, Google Cloud, Amazon Web Services, and Oracle Cloud Infrastructure will be the first cloud providers to offer access to H200-based instances, starting in Q2 2024.

