Qwen2.5
About
The Qwen2.5 large language model (LLM) family, developed by Alibaba Cloud's Qwen team, consists of open-source, decoder-only dense models in seven sizes ranging from 0.5 billion to 72 billion parameters. Pretrained on up to 18 trillion tokens, the models improve on the Qwen2 series, particularly in knowledge, coding, and mathematics. They offer stronger instruction following, long-text generation, and understanding of structured data, including generation of structured outputs such as JSON. Improved system-prompt handling also makes role-play and chatbot configuration more reliable. The models support over 29 languages, including Chinese and English, and the family includes specialized variants such as Qwen2.5-Coder and Qwen2.5-Math. In addition, the hosted Qwen-Plus and Qwen-Turbo models are accessible through Alibaba Cloud Model Studio APIs.
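The structured-output capability mentioned above is typically exercised through an OpenAI-compatible chat request. Below is a minimal sketch of such a request payload; the base URL and model name are illustrative assumptions about Model Studio's compatible-mode API and may differ by account and region.

```python
import json

# Assumed compatible-mode endpoint -- check your Model Studio account
# for the actual value; this is illustrative, not authoritative.
BASE_URL = "https://dashscope.aliyuncs.com/compatible-mode/v1"

def build_chat_request(model: str, user_prompt: str) -> dict:
    """Build an OpenAI-style chat payload asking for a JSON-object reply."""
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "You are a helpful assistant. Reply in JSON."},
            {"role": "user", "content": user_prompt},
        ],
        # Qwen2.5 instruct models advertise structured (JSON) outputs.
        "response_format": {"type": "json_object"},
    }

payload = build_chat_request(
    "qwen-plus",
    "List three primary colors as a JSON array under key 'colors'.",
)
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to `{BASE_URL}/chat/completions` with your API key; the `response_format` field constrains the model to emit valid JSON.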
Specifications (17 models)
| Model | Released | Context | Parameters | Function Calling | Tool Use | Structured Outputs |
|---|---|---|---|---|---|---|
| Qwen2.5-72B | 2025-10 | 128K | 72B | Yes | Yes | No |
| Qwen2.5-Max | 2025-01 | — | — | No | No | No |
| Qwen2.5-VL-72B | 2025-01 | 33K | 72B | No | No | No |
| Qwen2.5-0.5B | 2024-06 | 128K | 490M | No | No | No |
| Qwen2.5-0.5B-Instruct | 2024-06 | 128K | 490M | No | No | No |
| Qwen2.5-1.5B | 2024-06 | 128K | 1.54B | No | No | No |
| Qwen2.5-1.5B-Instruct | 2024-06 | 128K | 1.54B | No | No | No |
| Qwen2.5-3B | 2024-06 | 128K | 3.09B | No | No | No |
| Qwen2.5-3B-Instruct | 2024-06 | 128K | 3.09B | No | No | No |
| Qwen2.5-7B | 2024-06 | 128K | 7.61B | No | No | No |
| Qwen2.5-7B-Instruct | 2024-06 | 128K | 7.61B | No | No | Yes |
| Qwen2.5-14B | 2024-06 | 128K | 14.7B | No | No | No |
| Qwen2.5-14B-Instruct | 2024-06 | 128K | 14.7B | No | No | Yes |
| Qwen2.5-32B | 2024-06 | 128K | 32.5B | No | No | No |
| Qwen2.5-32B-Instruct | 2024-06 | 128K | 32.5B | No | No | Yes |
| Qwen2.5-72B | 2024-06 | 128K | 72.7B | No | No | No |
| Qwen2.5-72B-Instruct | 2024-06 | 128K | 72.7B | No | No | Yes |
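A quick way to sanity-check which of these checkpoints fits on a given GPU is the two-bytes-per-parameter rule for FP16/BF16 weights (KV cache and activations come on top). A minimal sketch using parameter counts from the table above:

```python
def fp16_weight_gib(params_billion: float) -> float:
    """Approximate FP16/BF16 weight footprint in GiB: 2 bytes per parameter."""
    return params_billion * 1e9 * 2 / 2**30

# Rough footprints for a few of the sizes listed above.
for size in (0.49, 7.61, 32.5, 72.7):
    print(f"{size:>5}B -> ~{fp16_weight_gib(size):.1f} GiB")
```

By this rule, the 7.61B model needs roughly 14 GiB for weights alone, while the 72.7B model needs about 135 GiB, i.e. multiple GPUs or quantization.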
Available From (10 providers)
Pricing
Frequently Asked Questions
- What is Qwen2.5?
- Qwen2.5 is a family of open-source, decoder-only dense LLMs from Alibaba Cloud's Qwen team, available in seven sizes from 0.5 billion to 72 billion parameters and pretrained on up to 18 trillion tokens. The family includes specialized variants such as Qwen2.5-Coder and Qwen2.5-Math, and the hosted Qwen-Plus and Qwen-Turbo models are accessible through Alibaba Cloud Model Studio APIs.
- How many models are in the Qwen2.5 family?
- The Qwen2.5 family contains 17 models.
- What is the latest Qwen2.5 model?
- The latest model is Qwen2.5-72B, released in 2025-10.
- How much does Qwen2.5 cost?
- Qwen2.5 models range from $0.04/1M to $23/1M input tokens depending on the model and provider.
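Per-token pricing like the range quoted above translates to request cost as (tokens / 1M) x rate. A small helper, using the two endpoints of that range as illustrative rates:

```python
def input_cost_usd(tokens: int, usd_per_million: float) -> float:
    """Cost in USD of `tokens` input tokens at a given $/1M-token rate."""
    return tokens / 1_000_000 * usd_per_million

# 50k input tokens at the cheapest ($0.04/1M) and priciest ($23/1M) quoted rates.
print(input_cost_usd(50_000, 0.04))
print(input_cost_usd(50_000, 23.0))
```

At these rates, 50k input tokens cost about $0.002 at the low end and $1.15 at the high end; output-token rates, which are usually higher, are billed separately.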
Models (17)
Qwen2.5-72B
Qwen2.5-Max
Qwen2.5-VL-72B
Qwen2.5-0.5B
Qwen2.5-0.5B-Instruct
Qwen2.5-1.5B
Qwen2.5-1.5B-Instruct
Qwen2.5-3B
Qwen2.5-3B-Instruct
Qwen2.5-7B
Qwen2.5-7B-Instruct
Qwen2.5-14B
Qwen2.5-14B-Instruct
Qwen2.5-32B
Qwen2.5-32B-Instruct
Qwen2.5-72B
Qwen2.5-72B-Instruct