AsianFin — Zhipu AI, one of China’s leading foundational model developers, launched its next-generation flagship model series GLM-4.5 on Sunday, as competition in the domestic large language model space intensifies.
Built on a Mixture of Experts architecture and optimized for AI Agent scenarios, GLM-4.5 has set new benchmarks among open-source models, outperforming key rivals in reasoning, coding, and agent intelligence. In overall global evaluations, GLM-4.5 ranks third worldwide, first among Chinese models, and first among open-source models, ahead of Stepverse’s Step-3, DeepSeek-R1-0528, and Moonshot’s Kimi K2.
The model series includes two variants: the full GLM-4.5, with 355 billion total parameters, and the lighter GLM-4.5-Air, with 106 billion parameters. Both are fully open-sourced via Hugging Face and Alibaba's ModelScope, with APIs accessible through the Zhipu Open Platform. The complete feature set is available for free through Zhipu Qingyan and the z.ai official site.
“The road to AGI has only just begun,” CEO Zhang Peng said. “Current models are far from reaching human-level capability.”
Zhipu’s push into open-source comes as China’s LLM market undergoes rapid iteration. In the past month alone, the country has seen the release of MiniMax M2, Kimi K2, and Stepverse’s Step-3. Meanwhile, global heavyweight OpenAI is reportedly preparing to launch GPT-5—a closed-source, multimodal model—as early as late July.
Zhipu's GLM-4.5 is pre-trained on 15 trillion tokens of general data and refined with 8 trillion tokens of specialized domain data focused on code, reasoning, and agents. The model is further enhanced with reinforcement learning techniques for complex task execution. According to internal benchmarks, GLM-4.5 uses just 50% of the parameters of DeepSeek-R1 and one-third of those in Kimi-K2, while delivering superior performance in key LLM evaluation tests.
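The parameter-ratio claim can be sanity-checked with quick arithmetic. The comparison models' totals below are assumptions drawn from publicly reported figures, not from the article itself: DeepSeek-R1 is generally listed at roughly 671 billion total parameters and Kimi K2 at roughly 1 trillion.

```python
# Sanity check of the parameter-ratio claims in the article.
# DeepSeek-R1 and Kimi K2 totals are assumed from public reports,
# not stated in the article itself.
GLM_45 = 355e9        # GLM-4.5 total parameters (from the article)
DEEPSEEK_R1 = 671e9   # assumed publicly reported total
KIMI_K2 = 1_000e9     # assumed publicly reported total

print(f"vs DeepSeek-R1: {GLM_45 / DEEPSEEK_R1:.0%}")  # roughly half
print(f"vs Kimi K2:     {GLM_45 / KIMI_K2:.0%}")      # roughly a third
```

Under those assumed totals, the ratios come out near 53% and 36%, which is broadly consistent with the "50% of DeepSeek-R1, one-third of Kimi-K2" framing.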
In real-world performance metrics—including 52 development tasks across software, game, and web development—GLM-4.5 delivered results comparable to Claude-4-Sonnet, while offering better tool invocation reliability and task completion rates.
The model’s token pricing is highly competitive, with input costs as low as RMB 0.8 per million tokens and RMB 2 for output—approximately one-tenth the cost of Anthropic’s Claude. Zhipu also claims the high-speed version of GLM-4.5 can generate over 100 tokens per second, supporting low-latency, high-concurrency environments for enterprise-grade deployment.
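At the quoted rates, per-request cost is straightforward to estimate. A minimal sketch, using only the two prices given in the article (the function name and example token counts are illustrative):

```python
# Estimated cost of a single GLM-4.5 API call at the quoted rates:
# RMB 0.8 per million input tokens, RMB 2 per million output tokens.
INPUT_RATE_RMB = 0.8 / 1_000_000   # RMB per input token
OUTPUT_RATE_RMB = 2.0 / 1_000_000  # RMB per output token

def request_cost_rmb(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated cost in RMB for one request."""
    return input_tokens * INPUT_RATE_RMB + output_tokens * OUTPUT_RATE_RMB

# Example: a 10,000-token prompt with a 2,000-token completion
print(f"RMB {request_cost_rmb(10_000, 2_000):.4f}")  # RMB 0.0120
```

Even a fairly large request costs a fraction of a fen at these rates, which is the basis of the roughly one-tenth-of-Claude comparison.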
Zhipu, founded in 2019, is one of China's earliest developers of large-scale pre-trained models. Since releasing its first ChatGLM model in March 2023, the company has iterated four times and launched over 20 AI products. By year-end 2023, Zhipu reported more than 2,000 ecosystem partners, 1,000 enterprise applications, and over 25 million users on its Qingyan platform. Paid features have helped Zhipu surpass 10 million yuan in annual recurring revenue (ARR).
On the funding side, Zhipu recently announced a RMB 1 billion strategic investment from Shanghai’s state-owned capital as it moves closer to a domestic IPO. Prior rounds included backing from Hangzhou Urban Investment, Shangcheng Capital, and Zhuhai Huafa, with a total raise exceeding RMB 10 billion. Zhipu’s investors now span top VCs such as Hillhouse, Qiming, and Legend Capital, alongside internet giants Alibaba, Meituan, Tencent, and Xiaomi.
The launch of GLM-4.5 also kicks off what the company calls its “Year of Open Source”, with plans to roll out a full suite of foundational, inference, multimodal, and agent models.
Zhipu's ambitions underscore a broader trend in China's AI strategy—doubling down on open-source at a time when U.S. models increasingly tilt toward closed platforms. Analysts say this divergence could reshape the global LLM landscape.
“Open-sourcing domestic models injects fresh momentum into the AI ecosystem,” one industry insider told TMTPost. “It’s likely to trigger a new phase of global model realignment.”
Zhipu’s release coincided with another headline from rival Alibaba, which on Sunday introduced Tongyi Wanxiang 2.2, a cinematic-grade video generation model with more than 60 tunable visual parameters. Last week, Alibaba also unveiled Qwen 3, Qwen3-Reasoning, and Qwen3-Coder, strengthening its position across base, reasoning, and code-generation models.
Meanwhile, Stepverse's Step-3, announced at the World Artificial Intelligence Conference, is the company's first native multimodal model and boasts 321 billion parameters using MoE architecture—reflecting the industry-wide shift toward large, efficient multi-expert systems.
As the pace of innovation accelerates, the open-source release of GLM-4.5 marks a pivotal moment not only for Zhipu, but for China's LLM ambitions at large. With technical superiority, cost-efficiency, and ecosystem momentum, the company is positioning itself as a serious challenger—not just at home, but globally.