LLM Reference

Qwen2.5 Coder

Alibaba · Coding
12 models · 2024 · From $0.03/1M input tokens

About

Qwen2.5 Coder is a family of language models from Alibaba designed for programming tasks and general computational reasoning. Built with scalability in mind, the models range from 0.5 billion to 32 billion parameters and support context windows of up to 128,000 tokens. They cover 92 programming languages and perform strongly on code generation, code repair, and multi-language programming challenges; notably, the 7-billion-parameter variant outperforms much larger models such as DeepSeek-Coder-V2-Lite on several benchmarks, illustrating its efficiency.

The family includes both base and instruction-tuned models. The instruction-tuned "Coder-Instruct" variants improve performance across a range of tasks and show stronger generalization. The models are benchmarked on datasets such as McEval for multi-language programming and CRUXEval for code reasoning, with strong results in code inference and mathematical tasks. Training on diverse datasets preserves general capabilities, keeping the models versatile across technical and non-technical domains.

Qwen2.5 Coder is open-sourced under the Apache 2.0 license, encouraging community experimentation and deployment; the flagship 32-billion-parameter model was released in November 2024. Practical applications, including code assistants and artifact generation tools, demonstrate its readiness for real-world use, giving developers an accessible, powerful coding solution.

Specifications (12 models)

Qwen2.5 Coder model specifications comparison

| Model | Released | Parameters | Structured Outputs | Code Exec |
|---|---|---|---|---|
| Qwen2.5 Coder 14B | 2024-11 | 14B | No | No |
| Qwen2.5 Coder 14B Instruct | 2024-11 | 14B | No | No |
| Qwen2.5 Coder 32B | 2024-11 | 32B | Yes | Yes |
| Qwen2.5 Coder 32B Instruct | 2024-11 | 32B | Yes | Yes |
| Qwen2.5 Coder 3B | 2024-11 | 3B | No | No |
| Qwen2.5 Coder 3B Instruct | 2024-11 | 3B | No | No |
| Qwen2.5 Coder 0.5B | 2024-11 | 0.5B | No | No |
| Qwen2.5 Coder 0.5B Instruct | 2024-11 | 0.5B | No | No |
| Qwen2.5 Coder 1.5B | 2024-09 | 1.54B | No | No |
| Qwen2.5 Coder 1.5B Instruct | 2024-09 | 1.54B | No | No |
| Qwen2.5 Coder 7B | 2024-09 | 7.61B | No | No |
| Qwen2.5 Coder 7B Instruct | 2024-09 | 7.61B | Yes | No |
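The specification table above can be encoded as plain data for programmatic filtering, e.g. to find which variants support structured outputs. This is a minimal sketch; the field names are my own choice, not an official schema.

```python
# Sketch: the Qwen2.5 Coder spec table as filterable Python data.
# Values transcribed from the table above; field names are illustrative.
from dataclasses import dataclass

@dataclass
class ModelSpec:
    name: str
    released: str        # YYYY-MM
    params: str
    structured_outputs: bool
    code_exec: bool

SPECS = [
    ModelSpec("Qwen2.5 Coder 14B", "2024-11", "14B", False, False),
    ModelSpec("Qwen2.5 Coder 14B Instruct", "2024-11", "14B", False, False),
    ModelSpec("Qwen2.5 Coder 32B", "2024-11", "32B", True, True),
    ModelSpec("Qwen2.5 Coder 32B Instruct", "2024-11", "32B", True, True),
    ModelSpec("Qwen2.5 Coder 3B", "2024-11", "3B", False, False),
    ModelSpec("Qwen2.5 Coder 3B Instruct", "2024-11", "3B", False, False),
    ModelSpec("Qwen2.5 Coder 0.5B", "2024-11", "0.5B", False, False),
    ModelSpec("Qwen2.5 Coder 0.5B Instruct", "2024-11", "0.5B", False, False),
    ModelSpec("Qwen2.5 Coder 1.5B", "2024-09", "1.54B", False, False),
    ModelSpec("Qwen2.5 Coder 1.5B Instruct", "2024-09", "1.54B", False, False),
    ModelSpec("Qwen2.5 Coder 7B", "2024-09", "7.61B", False, False),
    ModelSpec("Qwen2.5 Coder 7B Instruct", "2024-09", "7.61B", True, False),
]

# Which models support structured outputs?
structured = [m.name for m in SPECS if m.structured_outputs]
print(structured)
```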

Available From (6 providers)

Pricing

Qwen2.5 Coder model pricing by provider

| Model | Provider | Input / 1M | Output / 1M | Type |
|---|---|---|---|---|
| Qwen2.5 Coder 7B Instruct | OpenRouter | $0.03 | $0.09 | Serverless |
| Qwen2.5 Coder 1.5B Instruct | Fireworks AI | $0.10 | $0.10 | Serverless |
| Qwen2.5 Coder 3B Instruct | Fireworks AI | $0.10 | $0.10 | Serverless |
| Qwen2.5 Coder 32B Instruct | SiliconFlow | $0.18 | $0.18 | Serverless |
| Qwen2.5 Coder 14B Instruct | Fireworks AI | $0.20 | $0.20 | Serverless |
| Qwen2.5 Coder 7B Instruct | Fireworks AI | $0.20 | $0.20 | Serverless |
| Qwen2.5 Coder 32B Instruct | Arcee AI | $0.40 | $1.20 | Serverless |
| Qwen2.5 Coder 32B Instruct | OpenRouter | $0.66 | $1.00 | Serverless |
| Qwen2.5 Coder 32B Instruct | Fireworks AI | $0.90 | $0.90 | Serverless |
| Qwen2.5 Coder 32B | Fireworks AI | $0.90 | $0.90 | Serverless |
| Qwen2.5 Coder 32B | DeepInfra | $20.00 | $20.00 | Serverless |
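A request's cost follows directly from the per-million-token prices above: divide each token count by 1,000,000 and multiply by the corresponding rate. A minimal sketch, using the OpenRouter rates for Qwen2.5 Coder 32B Instruct ($0.66 input / $1.00 output per 1M tokens) as an example; the function name is illustrative.

```python
# Sketch: estimating per-request cost from per-1M-token prices.
def request_cost(input_tokens: int, output_tokens: int,
                 input_price_per_m: float, output_price_per_m: float) -> float:
    """Return the USD cost of one request at the given per-1M-token rates."""
    return (input_tokens / 1_000_000) * input_price_per_m \
         + (output_tokens / 1_000_000) * output_price_per_m

# Example: 10k input + 2k output tokens on Qwen2.5 Coder 32B Instruct
# via OpenRouter ($0.66 in / $1.00 out per 1M tokens, from the table above).
cost = request_cost(10_000, 2_000, 0.66, 1.00)
print(f"${cost:.4f}")  # -> $0.0086
```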

Frequently Asked Questions

What is Qwen2.5 Coder?
Qwen2.5 Coder is a family of open-source (Apache 2.0) language models from Alibaba built for programming tasks and general computational reasoning. The models range from 0.5 billion to 32 billion parameters, support context windows of up to 128,000 tokens, and cover 92 programming languages. See the About section above for details.
How many models are in the Qwen2.5 Coder family?
The Qwen2.5 Coder family contains 12 models.
What is the latest Qwen2.5 Coder model?
The most recent models were released in 2024-11, including Qwen2.5 Coder 32B Instruct, the largest in the family, alongside the 14B, 3B, and 0.5B variants.
How much does Qwen2.5 Coder cost?
Pricing ranges from $0.03 to $20 per 1M input tokens, depending on the model and provider.
