14/11/2023

Nvidia Updates Its Flagship Chip To Support More Complex AI Systems
Nvidia announced that it has upgraded its top-tier AI processor and that the new product will begin rolling out early next year with Amazon.com, Google, and Oracle.
 
The chip, dubbed the H200, will overtake Nvidia's current top-tier H100. The main improvement is more high-bandwidth memory, one of the most expensive components of the processor, which determines how much data it can process quickly.
 
Nvidia is the industry leader in AI chips, which power OpenAI's ChatGPT service and numerous other generative AI services that answer queries with human-like writing. The additional high-bandwidth memory and a faster link to the chip's processing units mean such services will be able to respond more quickly.
 
The H200 has 141 gigabytes of high-bandwidth memory, up from 80 gigabytes in the H100. Nvidia did not disclose the memory vendors for the new chip; however, Micron Technology (MU.O) said in September that it was in the process of becoming an Nvidia supplier.
 
Nvidia also purchases memory from SK Hynix (000660.KS), a Korean company that said last month that AI chips were boosting its sales.
 
Nvidia announced on Wednesday that Amazon Web Services, Google Cloud, Microsoft Azure, and Oracle Cloud Infrastructure will be among the first cloud service providers to offer access to H200 chips, along with specialised AI cloud providers CoreWeave, Lambda, and Vultr.
 
(Source:www.usnews.com)

Christopher J. Mitchell
