Market capitalization once topped HKD 400 billion: what is behind Zhipu's first financial report?
Source: Zhitu IPO. By Wang Fei
The "first large-model stock" has delivered its first annual report since going public.
On March 31, the independent large-model maker Beijing Zhipu Huazhang Technology Co., Ltd. (hereinafter "Zhipu") released its 2025 annual report: revenue of 724 million yuan, up 131.9% year-over-year; a net loss of 4.72 billion yuan for the year, 59.5% wider than the year before; and an adjusted net loss of 3.18 billion yuan, up 29.1% year-over-year.
On April 1, Zhipu opened 15.00% higher at HKD 797.5 and touched an intraday high of HKD 938, reflecting strong market recognition. By the close, the shares were up 31.94% at HKD 915, for a total market value of HKD 407.95 billion, roughly 672.23% above the HKD 52.83 billion recorded on its first day of trading.
On April 2, however, the stock retreated, closing down 14.86% at HKD 779 after an intraday low of HKD 764, for a market value of HKD 347.31 billion.
Annual revenue of 724 million yuan, the largest among domestic large-model makers
As an independent vendor focused on foundation-model R&D, Zhipu released China's first proprietary pre-training framework, the GLM framework, in 2021, and launched its model-as-a-service (MaaS) development and commercialization platform. The platform mainly offers four types of models (language, multimodal, agent, and code), along with integrated tools for model fine-tuning, deployment, and agent development. In 2022, Zhipu open-sourced its first 100-billion-parameter-scale model, GLM-130B.
With its original GLM (General Language Model) pre-training architecture, Zhipu built a full-stack model matrix covering language, code, multimodal, and agents, with models adapted to over 40 domestic chips.
As of June 2025, Zhipu had an R&D team of 657 people, with R&D personnel accounting for 74% of staff. Team members have backgrounds in natural language processing, complex decision-making systems, multimodal semantic analysis, and related fields. The core research team and academic advisors have published some 500 influential papers in top-tier venues, with over 58,000 citations. On this foundation, Zhipu's model updates follow a hybrid cadence: base models every 3–6 months, with specialized and open-source models released more frequently.
On April 2, Zhipu released GLM-5V-Turbo, its first natively multimodal coding base model. The model's biggest breakthrough is the deep integration of visual and programming capabilities: it natively processes text, images, video, and other multimodal inputs while excelling at programming, long-horizon planning, and complex task execution.
According to reports, GLM-5V-Turbo achieved leading results on core benchmarks for multimodal coding and agents, adding visual capability while keeping text-only programming and reasoning performance comparable. The model is deeply adapted to Claude Code and the "Lobster" scenario, giving OpenClaw Lobster genuine visual ability to understand on-screen information.
Based on the prospectus and the latest annual report, Zhipu's R&D investment from 2022 to 2025 was approximately 84.4 million, 529 million, 1.00 billion, and 3.18 billion yuan respectively, totaling nearly 6 billion yuan. This investment underpins the 3–6 month iteration cycle of the GLM series.
It is worth noting that Zhipu has achieved revenue doubling for four consecutive years.
From 2022 to 2025, Zhipu's revenue was approximately 57.49 million, 125 million, 312 million, and 724 million yuan; gross profit was approximately 31.36 million, 80.48 million, 176 million, and 297 million yuan, for gross margins of 54.6%, 64.6%, 56.3%, and 41.0%. Annual net losses were approximately 144 million, 788 million, 1.30 billion, and 2.19 billion yuan, for accumulated losses of around 8.6 billion yuan; adjusted net losses were approximately 97.42 million, 621 million, 2.96 billion, and 4.72 billion yuan.
With annual revenue of 724 million yuan, Zhipu has become the largest domestic large-model company by revenue. By comparison, MiniMax's 2025 revenue was approximately 544 million yuan.
By deployment method, Zhipu's cloud revenue rose 292.6% in 2025, from 48.48 million yuan in 2024 to 190 million yuan, mainly because continuous iteration markedly raised the models' intelligence ceiling, which in turn drove more model invocation. Local (on-premises) deployment revenue grew 102.3%, from 264 million yuan in 2024 to 534 million yuan.
By business line, revenue from Zhipu's open platform and API, enterprise-level agents, and enterprise-level general large models all grew substantially, reaching 190 million, 166 million, and 366 million yuan respectively in 2025.
Volume and price rise together, and the concept of Token Architecture Capability is proposed
Thanks to the GLM series' generational lead in intelligence ceiling and its aggressive optimization of inference costs, Zhipu saw broad-based growth in 2025, from its developer ecosystem to its global expansion: ARR on the MaaS API platform reached about 1.7 billion yuan, a 60-fold increase year-over-year, and profitability improved markedly, with API gross margin rising nearly fivefold to 18.9%.
In February 2026, within 24 hours of the release of the flagship base model GLM-5, it was officially integrated into leading platforms including ByteDance TRAE, Alibaba Qoder, Tencent CodeBuddy, Meituan CatPaw, Kuaishou Wànqíng, Baidu Cloud, and WPS Office; nine of China's top ten internet companies make deep use of GLM models. By March 2026, Zhipu's registered enterprises and users exceeded 4 million, spanning more than 218 countries and regions worldwide.
In 2025, Zhipu also became the first domestic vendor to launch a coding subscription, the GLM Coding Plan. On the strength of its coding capabilities, the number of paid developers worldwide quickly surpassed 242,000, and token calls grew 15-fold in six months. By February 2026, even after a 30% price increase and the removal of first-purchase discounts, the plan remained in high demand, making it one of the fastest-growing AI coding services globally.
As the earliest domestic model maker in the agent field, from AutoGLM, the world's first mobile agent, to AutoClaw, China's first one-click-install agent, Zhipu has been defining the paradigm of agentic AI. In March 2026, following the Coding Plan, Zhipu launched the Claw Plan, which reached 100,000 subscribers within two days and more than 400,000 within 20 days.
On globalization, Zhipu has monetized token value worldwide. Its models are deployed on global cloud platforms such as Google Vertex AI, AWS Bedrock, Fireworks, and Cerebras, and are listed on international model aggregation platforms such as OpenRouter and Vercel, ranking No. 1 among paid models on OpenRouter. GLM has become the default model for well-known coding platforms (such as Windsurf) and coding-agent platforms (such as OpenCode).
Notably, amid strong demand for computing power, Zhipu proactively raised the price of its coding plan by 30% in February 2026, and its API call pricing rose 83% from the end of the previous year.
Thanks to strong model performance, API call volume rose rather than fell after the 83% price increase, a rare case of volume and price rising together that signals customers' willingness to pay for more dependable productivity. At the 2025 results briefing, CEO Zhang Peng noted that even after the price increase demand still outstripped supply, with invocation volume growing 400%.
On future goals, Zhang Peng said Zhipu is not a traditional software company but a native intelligence laboratory committed to AGI. Its moat lies not in stacking computing power but in deconstructing the essence of intelligence at the most fundamental level and converting that understanding into social productivity.
Looking ahead to 2026, the intelligence paradigm will evolve from lightweight vibe coding (VibeCoding) to industrial-grade agent engineering (Agentic Engineering), and further toward digital engineers capable of autonomous planning, environmental perception, and self-iteration, ultimately achieving multi-step iteration and logical consistency on long-horizon tasks, further raising the intelligence ceiling and driving exponential growth in token calls.
Notably, in this report Zhipu also proposed a new concept, Token Architecture Capability (TAC): intelligent invocation volume × intelligence quality × economic conversion efficiency. It reflects a reshaping of core competitiveness now that large models possess closed-loop, long-horizon task-execution capability.
As enterprises' TAC demand continues to grow, Zhipu's MaaS platform is becoming the infrastructure connecting foundation models with industry applications. Zhang Peng believes the measure of individual or organizational value will no longer be how much information one holds, but one's role as a token architect: building complex agent systems within a given budget and driving large models to operate those systems autonomously. Zhipu aims to be the infrastructure that raises society's overall TAC, turning every token into deliverable economic value.