Gate News reports that on March 12, WIRED, citing NVIDIA’s 2025 SEC filings, revealed that the company plans to invest $26 billion over the next five years to develop open-weight AI models, a figure its executives have confirmed. Bryan Catanzaro, NVIDIA’s Vice President of Deep Learning Research, said, “Helping the ecosystem grow aligns with our interests. We are an American company, but we collaborate with companies worldwide to ensure the ecosystem is diverse and strong everywhere.”
NVIDIA’s main motivation for this move is to counter China’s lead in open-source models. Nearly all of the world’s top open-weight models currently originate from China, including DeepSeek, Alibaba’s Qwen, Moonshot AI, Z.ai, and MiniMax, and many overseas startups and researchers have built applications on these Chinese models. Industry rumors suggest DeepSeek is about to release a new model trained entirely on Huawei chips; if true, this would demonstrate that top-tier models can be trained without NVIDIA hardware.
On the same day, NVIDIA announced its most powerful open-weight model to date, Nemotron 3 Super, with 128 billion parameters, comparable in size to the largest version of OpenAI’s GPT-OSS. The company claims the model scored 37 on the AI Index (a composite score across 10 benchmark tests), surpassing GPT-OSS’s 33 but still trailing several Chinese models. NVIDIA also said Nemotron 3 Super ranks first on PinchBench, a new benchmark that evaluates a model’s control of OpenClaw, and that it has completed pretraining of a 550-billion-parameter model. Nathan Lambert, who leads the ATOM project at the Allen Institute for AI, called himself a “loyal fan of Nemotron” and urged the U.S. government to fund open-source models as well.