DeepSeek’s breakout moment wasn’t China’s first open-source success. Alibaba’s Qwen team had been releasing open-weight models for years. By September 2024, well before DeepSeek’s V3 launch, Alibaba was saying that global downloads had exceeded 600 million. On Hugging Face, Qwen accounted for more than 30% of all model downloads in 2024. Other institutions, including the Beijing Academy of Artificial Intelligence and AI firm Baichuan, were also releasing open models as early as 2023.
But since the success of DeepSeek, this field has expanded rapidly. Companies like Z.ai (formerly Zhipu), MiniMax, Tencent, and a growing number of small labs have released models that compete on reasoning, coding, and agent-style tasks. The increasing number of capable models has accelerated progress. Capabilities that used to take months to arrive in the open-source world now appear within weeks, even days.
“Chinese AI firms have seen real benefits from open-source playbooks,” says Liu Zhiyuan, a computer science professor at Tsinghua University and chief scientist at AI startup ModelBest. “By releasing strong research, they build reputations and get free publicity.”
Beyond commercial incentives, open source has taken on cultural and strategic importance, Liu says. “In the Chinese programmer community, open source has become politically correct,” he says, framing it as a reaction to American dominance in proprietary AI systems.
That change is also reflected at the institutional level. Universities including Tsinghua have begun to encourage AI development and open-source contributions, while policymakers have moved to formalize those incentives. In August, China’s State Council released a draft policy encouraging universities to reward open-source work, proposing that student contributions on platforms like GitHub or Gitee could eventually count toward academic credit.
With increasing momentum and strong feedback loops, China’s open-source push is likely to continue in the near term, though its long-term sustainability still depends on financial results, says Tiezhen Wang, who helps lead global AI work at Hugging Face. In January, model labs Z.ai and MiniMax went public in Hong Kong. “Right now, the focus is on making the pie bigger,” Wang says. “The next challenge is to figure out how each company secures its share.”
The next wave of models will be narrower and better
Chinese open-source models lead not only in download volume but also in diversity. Alibaba’s Qwen has become one of the most diverse open model families in circulation, offering a wide range of variants adapted for different uses. The lineup ranges from lightweight models that can run on a single laptop to large, multi-hundred-billion-parameter systems designed for data-center deployments. Qwen also comes in several task-optimized variants: “Instruct” models are tuned to follow commands, and “Coder” variants specialize in programming.
Although this strategy is not unique to Chinese labs, Qwen was the first open model family to offer so many high-quality options that it began to feel like a full product line, all free to use.
