GLM-5
About
GLM-5 is Zhipu AI's flagship open-weight foundation model: a Mixture-of-Experts architecture with 744B total parameters (40B active per token), trained on 28.5T tokens using DeepSeek Sparse Attention on Huawei Ascend hardware. It achieves state-of-the-art results on coding and agentic benchmarks (77.8% on SWE-bench Verified) and supports autonomous planning, multi-step tool use, and self-correction.
Capabilities
Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution
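As a sketch of how the function-calling capability is typically invoked, the payload below follows the widely used OpenAI-compatible chat-completions schema; the model slug and the `get_weather` tool definition are illustrative assumptions, not values confirmed by any provider's catalog.

```python
import json

# Hedged sketch: an OpenAI-compatible tool-use request for GLM-5.
# The model slug and tool definition below are hypothetical examples.
payload = {
    "model": "zai/glm-5",  # hypothetical slug; check your provider's model list
    "messages": [
        {"role": "user", "content": "What's the weather in Beijing?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool
                "description": "Look up current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    "tool_choice": "auto",  # let the model decide whether to call the tool
}

print(json.dumps(payload, indent=2))
```

With `tool_choice` set to `auto`, the model may respond with a `tool_calls` entry naming the function and its JSON arguments, which the client executes and feeds back in a follow-up message.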
Providers (5)
| Provider | Input (per 1M tokens) | Output (per 1M tokens) | Type |
|---|---|---|---|
| Fireworks AI | $1.00 | $3.20 | Serverless |
| OpenRouter | $0.72 | $2.30 | Serverless |
| Together AI | $1.00 | $3.20 | Serverless |
| GCP Vertex AI | $1.00 | $3.20 | Serverless |
| NVIDIA NIM | — | — | Serverless |
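The per-1M-token prices above translate to per-request cost with simple arithmetic. A minimal sketch (token counts in the example are arbitrary; a real workload's input/output split will vary):

```python
# Back-of-envelope cost comparison using the per-1M-token prices listed above.

def request_cost(input_tokens: int, output_tokens: int,
                 input_per_m: float, output_per_m: float) -> float:
    """Cost in USD for one request at the given per-1M-token rates."""
    return (input_tokens / 1e6) * input_per_m + (output_tokens / 1e6) * output_per_m

# Example: a 10k-token prompt with a 2k-token completion.
providers = {
    "Fireworks AI": (1.00, 3.20),
    "OpenRouter": (0.72, 2.30),
    "Together AI": (1.00, 3.20),
}
for name, (inp, outp) in providers.items():
    print(f"{name}: ${request_cost(10_000, 2_000, inp, outp):.4f}")
# Fireworks AI: $0.0164
# OpenRouter: $0.0118
# Together AI: $0.0164
```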
Benchmark Scores (1)
| Benchmark | Score | Version | Source |
|---|---|---|---|
| SWE-bench Pro | 38.6 | — | DAT-1778 |