China is "the first to train a single generative AI model across multiple data centers" with an innovative mix of "non-sanctioned" GPUs forced by US import blocks on AI tech

CPU with Chinese flag concept
(Image credit: Getty Images | MF3d)

What you need to know

  • Tech industry analyst Patrick Moorhead claims that a single generative AI model is running across multiple data centers in China.
  • Rather than relying on a consistent array of matching GPUs, researchers in China are combining "non-sanctioned" units from various brands.
  • Splitting the workload of a single generative AI model across several locations could ease the power constraints that have become synonymous with the technology.

Despite an ongoing saga of import restrictions and outright blocks deterring NVIDIA from shipping approximately $5 billion worth of AI chips, generative AI development in China doesn't seem to be slowing down. On the contrary, the country appears to be pooling whatever resources it has left after NVIDIA was blocked from selling its A800 and H800 AI and HPC GPUs in the local market, inventing clever ways to combine "non-sanctioned" hardware across multiple, separate data centers.

Tech industry analyst Patrick Moorhead claimed via X (formerly Twitter) that China is excelling with "lower-performing hardware" than what is available to generative AI developers in the United States and that it recently became "the first to train a single GAI model across multiple data centers." The claim comes with a pinch of salt, since the source was "a very large company" speaking in a conversation protected by an NDA (non-disclosure agreement), but it would be a realistic answer to the gigantic electricity consumption seen in Microsoft's and Google's AI efforts.
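Neither Moorhead's post nor the report explains how such a run would be coordinated, but the textbook way to train one model across several sites is synchronous data parallelism: each site keeps a full copy of the model, computes gradients on its own shard of data, and all sites average their gradients before every weight update. The NumPy sketch below simulates that loop for a toy linear model; the "sites," the model, and the data are hypothetical stand-ins for illustration, not details from the report.

```python
import numpy as np

# Toy linear-regression model trained with synchronous data parallelism.
# Each "site" stands in for a data center holding its own shard of data
# and a full replica of the model weights (all hypothetical).

rng = np.random.default_rng(0)
true_w = np.array([2.0, -3.0, 0.5])

def make_shard(n):
    """Generate a local data shard: features X and targets y."""
    X = rng.normal(size=(n, 3))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    return X, y

# Three simulated data centers with differently sized shards.
shards = [make_shard(n) for n in (400, 300, 500)]

w = np.zeros(3)   # replicated model weights
lr = 0.1          # learning rate

def local_gradient(w, X, y):
    """Mean-squared-error gradient computed on one site's shard."""
    residual = X @ w - y
    return 2.0 * X.T @ residual / len(y)

for step in range(200):
    # 1. Each site computes a gradient on its own data (in parallel, in reality).
    grads = [local_gradient(w, X, y) for X, y in shards]

    # 2. "All-reduce": average the gradients across sites. In a real multi-
    #    data-center run this is the expensive step, limited by WAN bandwidth.
    avg_grad = np.mean(grads, axis=0)

    # 3. Every site applies the same update, keeping the replicas identical.
    w -= lr * avg_grad

print("learned weights:", np.round(w, 3))  # should approach true_w
```

In a real multi-data-center run, the gradient averaging in step 2 is the hard part, because it has to cross comparatively slow wide-area links rather than the fast interconnects inside a single facility.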

How is China pushing AI forward without the latest GPUs?

An AI server based on NVIDIA A100 technology revealed in 2021. (Image credit: Getty Images | Feature China)

Although the United States government's restrictions force NVIDIA to acquire licenses to ship its A100, A800, H100, and H800 GPUs designed explicitly for artificial intelligence computing, this hasn't halted China's generative AI efforts, as the country finds inventive and unusual workarounds. Chief among them is a tactic to "meld GPUs from different brands into one training cluster" (via Tom's Hardware), which keeps its researchers pushing ahead with whatever hardware is at hand.
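The Tom's Hardware report doesn't spell out how a mixed-brand cluster is kept in balance. One common-sense approach, sketched below purely as an illustration, is to split each global batch across devices in proportion to their measured throughput so slower accelerators don't stall the synchronous step; the device names and throughput figures here are made up for the example, not benchmarks of real hardware.

```python
# Illustrative only: proportional batch splitting for a mixed-vendor cluster.

from dataclasses import dataclass

@dataclass
class Device:
    name: str
    samples_per_sec: float  # measured training throughput for this model

def split_global_batch(devices, global_batch):
    """Assign each device a share of the global batch proportional to its
    measured throughput, so every device finishes its micro-batch at
    roughly the same time and nothing stalls the synchronous step."""
    total = sum(d.samples_per_sec for d in devices)
    shares = [int(round(global_batch * d.samples_per_sec / total)) for d in devices]
    # Fix rounding drift so the shares still sum to the global batch size.
    shares[-1] += global_batch - sum(shares)
    return dict(zip((d.name for d in devices), shares))

cluster = [
    Device("vendor-A GPU", 900.0),   # hypothetical figures
    Device("vendor-B GPU", 600.0),
    Device("vendor-C NPU", 300.0),
]

print(split_global_batch(cluster, global_batch=1024))
# e.g. {'vendor-A GPU': 512, 'vendor-B GPU': 341, 'vendor-C NPU': 171}
```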

NVIDIA might be the world's leading GPU manufacturer, but propping up China's data centers with alternatives such as Huawei's "Ascend" AI range keeps the country's efforts growing, even if at a slower pace than the latest cutting-edge components would allow.

Splitting these "melded" AI processing efforts across multiple data centers could be more of a solution to the worries that "there won't be enough power (for AI) by 2025" predicted by Elon Musk earlier this year, but undoubtedly signals the tremendous growth of generative AI and lends some credibility that superintelligence might only be "a few thousand days" away from well-equipped firms like Sam Altman's US-based OpenAI.

Above all, this tidbit suggests that artificial intelligence is anything but a fad, regardless of its reception by the masses. China continues to expand into generative AI despite the restrictions it faces, while Microsoft makes a $1.3 billion investment in Mexico and the West builds out with practically unlimited access to NVIDIA's high-end AI GPUs. Whether China's researchers make any substantial gains by running a single model across multiple data centers isn't clear just yet, but it's obvious that US sanctions haven't deterred them at all.


Ben Wilson
Channel Editor

Ben is the channel editor for all things tech-related at Windows Central. That includes PCs, the components inside, and any accessory you can connect to a Windows desktop or Xbox console. Not restricted to one platform, he also has a keen interest in Valve's Steam Deck handheld and the Linux-based operating system inside. Fueling this career with coffee since 2021, you can usually find him behind one screen or another. Find him on Mastodon @trzomb@mastodon.online to ask questions or share opinions.