OpenAI Might Be Making A Superior AI Model That Connects 10 Million NVIDIA GPUs Together

The race to secure AI dominance is on at full steam, with both software and hardware companies trying to one-up each other. As such, NVIDIA & OpenAI may be working on an AI model that combines not thousands but millions of GPUs together.

OpenAI & NVIDIA May Be Making An AI GPU Model To End All Models

So far, NVIDIA and OpenAI have collaborated on ChatGPT, whose latest GPT-4 model uses several thousand AI GPUs from the chip giant. It is reported that NVIDIA has supplied around 20,000 of its brand-new AI GPUs to OpenAI, a figure that will be expanded further in the coming months. But that's just the tip of the iceberg.

According to Wang Xiaochuan, the businessman and founder of the Chinese search engine Sogou, OpenAI is already working on a more advanced AI computing model that uses far more advanced training methods. This model is said to have the capacity to connect 10 million AI GPUs together.

Reaching even 100,000 GPUs sounds like a big deal, but over a million AI GPUs just sounds way too ambitious. Adding to his credibility, though, Wang himself has invested in a new AI firm known as Baichuan Intelligence, which aims to be a prime competitor to OpenAI within China. It should be noted that the company recently launched its Baichuan-13B language model, which is capable of running on consumer-level hardware such as an NVIDIA GeForce RTX 3090 GPU. The possibility of OpenAI becoming a dominant force in the future is also raised, and as such, he proposes on his Weibo account that China desperately needs an OpenAI of its own.

10 million AI GPUs powering OpenAI's future language models is astronomical in terms of scale. Given NVIDIA's current capacity, the company can only produce around a million AI GPUs a year, so it would take 10 years to realize the true vision of this massive project.

However, NVIDIA is working with TSMC to increase supply, and production could soon rise by a huge factor, though other concerns may become hurdles, such as cost, power, and, more importantly, the means to interconnect such a large number of GPUs together. This may just be a pipe dream, but if we do get a million-AI-GPU-powered system, that would be a true technical marvel.

News Source: MyDrivers