China is “the first to train a single generative AI model across multiple data centers” with an innovative mix of “non-sanctioned” GPUs forced by US import blocks on AI tech

What you need to know

  • Tech industry analyst Patrick Moorhead claims that a single generative AI model has been trained across multiple data centers in China.
  • Rather than relying on a consistent array of matching GPUs, researchers in China are combining “non-sanctioned” units from various brands.
  • Splitting a single generative AI model’s workload across several locations could ease the power limits associated with the technology.

Even as a long-time industry watcher, I find this news about China’s approach to generative AI development genuinely intriguing. Despite ongoing trade restrictions and import bans, the Chinese tech sector appears to be finding creative ways to keep pushing the boundaries of AI research.

In the face of persistent hurdles, including import restrictions and outright bans that have reportedly prevented NVIDIA from delivering about $5 billion worth of AI chips, China’s progress in generative AI appears undeterred. With NVIDIA blocked from selling its A800 and H800 AI and HPC GPUs into the market, developers there are leaning on whatever hardware remains available and devising new ways to pool “unsanctioned” hardware across multiple data centers.

At a recent industry gathering, an intriguing detail emerged: China has worked out how to stand up and manage AI training clusters with far more nodes and less capable hardware than the US, and it was the first to train a single GAI model across multiple data centers. The insight comes from tech industry analyst Patrick Moorhead.

Posting on X (formerly Twitter), Moorhead suggested that China is getting more out of lower-tier generative AI hardware than US developers, and that it was the first to train a single GAI model across multiple data centers. Although the information reportedly comes from a conversation with a company under a non-disclosure agreement, it could potentially help explain the high electricity consumption already observed in Microsoft’s and Google’s AI projects.

How is China pushing AI forward without the latest GPUs?

China is "the first to train a single generative AI model across multiple data centers" with an innovative mix of "non-sanctioned" GPUs forced by US import blocks on AI tech

Despite U.S. government regulations requiring NVIDIA to obtain export licenses before shipping its A100, A800, H100, and H800 GPUs built for artificial intelligence processing, China’s progress in generative AI has not stalled. Instead, developers there have found creative workarounds, such as combining GPUs from different manufacturers into a single training cluster so research can continue on whatever hardware is available. (Source: Tom’s Hardware)
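
Neither Moorhead’s post nor the report explains how such a mixed cluster is balanced, but a common way to keep dissimilar accelerators in step during synchronous training is to size each device’s share of the global batch to its measured throughput, so faster and slower cards finish a step at roughly the same time. The sketch below is purely illustrative; the device names and throughput figures are hypothetical, not taken from the report.

```python
# Illustrative only: split a global batch across mixed accelerators in
# proportion to their measured throughput (samples/sec), so no single
# device becomes the straggler in a synchronous training step.
# Device names and throughput numbers below are hypothetical.

def split_batch(global_batch: int, throughput: dict[str, float]) -> dict[str, int]:
    """Assign each device a micro-batch size proportional to its throughput."""
    total = sum(throughput.values())
    shares = {dev: int(global_batch * tput / total) for dev, tput in throughput.items()}
    # Hand any rounding remainder to the fastest device.
    remainder = global_batch - sum(shares.values())
    fastest = max(throughput, key=throughput.get)
    shares[fastest] += remainder
    return shares

if __name__ == "__main__":
    # A hypothetical mixed cluster: a previously imported NVIDIA part alongside
    # domestic accelerators such as Huawei's Ascend line.
    cluster = {"nvidia_a800": 400.0, "ascend_910": 280.0, "other_npu": 150.0}
    print(split_batch(4096, cluster))
    # -> {'nvidia_a800': 1975, 'ascend_910': 1381, 'other_npu': 740}
```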

NVIDIA may be the world’s leading GPU manufacturer, but by folding hardware such as Huawei’s “Ascend” AI series into its data centers, China can keep its artificial intelligence work moving, albeit at a slower pace than it could manage with the most advanced, state-of-the-art components.

Splitting a single generative AI model’s training workload across several data centers could help address the AI power shortage Elon Musk predicted this year, since it points to a way of scaling beyond the limits of any one site. It also underscores how quickly generative AI is advancing, and echoes Sam Altman’s suggestion that superintelligence may be only “a few thousand days” away for well-established organizations like his US-based OpenAI.
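
How a single run is actually stretched across sites hasn’t been disclosed, but one standard pattern for this kind of setup is hierarchical gradient averaging: gradients are reduced among the nodes inside each data center first, and only the compact per-site summaries travel over the slower long-haul links each step. The sketch below is a toy illustration of that idea; the site names and numbers are hypothetical.

```python
# Toy illustration of hierarchical gradient averaging: average gradients
# inside each data center first, then exchange only the compact per-site
# summaries over the slower inter-site links. Site names and values are
# hypothetical.

from statistics import fmean

def site_average(gradients: list[list[float]]) -> list[float]:
    """Average the gradient vectors produced by the nodes inside one site."""
    return [fmean(dim) for dim in zip(*gradients)]

def global_average(site_summaries: dict[str, list[float]]) -> list[float]:
    """Average the per-site summaries; only these need to cross the WAN."""
    return [fmean(dim) for dim in zip(*site_summaries.values())]

if __name__ == "__main__":
    # Two hypothetical data centers contributing to the same training step.
    site_a = [[0.10, -0.20, 0.05], [0.12, -0.18, 0.07]]  # per-node gradients
    site_b = [[0.08, -0.22, 0.04], [0.09, -0.21, 0.06]]

    summaries = {"site_a": site_average(site_a), "site_b": site_average(site_b)}
    print(global_average(summaries))  # one small vector per site crosses the WAN
```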

If nothing else, this news underscores that artificial intelligence is not a fleeting trend, even if it isn’t universally welcomed. China keeps broadening its generative AI efforts despite the restrictions, while the West expands in its own way, with Microsoft investing $1.3 billion in Mexico and enjoying nearly unrestricted access to NVIDIA’s advanced AI GPUs. Whether China’s researchers will achieve significant breakthroughs by running a single model across data centers remains uncertain, but it’s clear that US sanctions haven’t stopped their progress.
