
Qwen2.5
About
The Qwen2.5 large language model (LLM) family, developed by Alibaba Cloud's Qwen team, consists of open-source, decoder-only dense models in seven sizes ranging from 0.5 billion to 72 billion parameters. Pretrained on a dataset of up to 18 trillion tokens, these models improve on the Qwen2 series, particularly in knowledge, coding, and mathematical tasks. They offer stronger instruction following, long-text generation, and understanding of structured data, including the generation of structured outputs such as JSON. Other notable improvements include more robust handling of system prompts, which benefits role-playing and chatbot configuration. The models support over 29 languages, including Chinese and English, and the family includes specialized variants such as Qwen2.5-Coder and Qwen2.5-Math for domain-specific tasks. In addition, the proprietary Qwen-Plus and Qwen-Turbo models are accessible through the Alibaba Cloud Model Studio APIs.
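The instruct variants of Qwen2.5 are conversational models prompted with a ChatML-style template, where each turn is wrapped in `<|im_start|>role … <|im_end|>` markers and a system prompt configures the assistant's behavior. The sketch below illustrates that structure manually; it assumes the standard ChatML role markers, and in practice you would let a tokenizer's `apply_chat_template` method build this string for you.

```python
# Minimal sketch of a ChatML-style prompt as used by Qwen instruct models.
# The role markers here are assumptions based on the common ChatML format;
# a real tokenizer's chat template is the authoritative source.

def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts into a ChatML prompt string."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Leave the assistant turn open so the model generates its completion.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

messages = [
    {"role": "system", "content": "You are a helpful assistant that replies in JSON."},
    {"role": "user", "content": "List three prime numbers."},
]
prompt = build_chatml_prompt(messages)
print(prompt)
```

The system turn at the top is where Qwen2.5's improved system-prompt handling comes into play, e.g. for role-playing or fixing a chatbot's output format.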
Models (15)
Qwen2.5 0.5B
Qwen2.5 0.5B Instruct
Qwen2.5 1.5B
Qwen2.5 1.5B Instruct
Qwen2.5 3B
Qwen2.5 3B Instruct
Qwen2.5 7B
Qwen2.5 7B Instruct
Qwen2.5 14B
Qwen2.5 14B Instruct
Qwen2.5 32B
Qwen2.5 32B Instruct
Qwen2.5 72B
Qwen2.5 72B Instruct
Qwen2.5 Max