LLM Reference

WizardCoder 33B

About

WizardCoder-33B-V1.1 is a large language model designed specifically for code generation. Developed by the WizardLM team, it is built on the DeepSeek-Coder-33B-base model and trained with the Evol-Instruct method, which improves both code generation and code comprehension. The model generates code in multiple languages, supports automated code completion, and speeds up prototyping. It achieves strong results on benchmarks such as HumanEval and MBPP, outperforming models like ChatGPT 3.5 in some evaluations. The architecture is a decoder-only transformer, and quantized variants are available to optimize performance on a range of hardware; some quality may be lost in quantized versions, but the model retains a 16,384-token context length. Using the recommended prompt format is important for getting the best results, making the model well suited to both educational and productivity use.
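As a sketch of the prompt formatting mentioned above: WizardCoder models are commonly driven with an Alpaca-style instruction template. The exact wording below is an assumption based on that convention; verify it against the official model card before relying on it.

```python
# Sketch of the Alpaca-style prompt template commonly used with
# WizardCoder models; the exact wording is an assumption, so check
# the official model card before production use.
def build_prompt(instruction: str) -> str:
    """Wrap a task description in the instruction-following template."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n### Response:"
    )

prompt = build_prompt("Write a Python function that reverses a string.")
print(prompt)
```

The model then continues the text after `### Response:` with its answer.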

Capabilities

Multimodal · Function Calling · Tool Use · JSON Mode

Providers (1)

Provider        Input (per 1M)    Output (per 1M)    Type
Replicate API                                        Serverless
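For the Replicate provider listed above, a call would typically go through Replicate's Python client. The model slug and the input parameter names below are assumptions for illustration; check the model's page on Replicate for the real identifier and accepted fields.

```python
# Hedged sketch of preparing a request for Replicate's Python client.
# Parameter names ("prompt", "max_new_tokens", "temperature") and the
# model slug in the comment are assumptions, not confirmed API details.
def make_input(prompt: str, max_tokens: int = 512, temperature: float = 0.2) -> dict:
    """Build a request payload for a typical text-generation model."""
    return {
        "prompt": prompt,
        "max_new_tokens": max_tokens,
        "temperature": temperature,
    }

payload = make_input("### Instruction:\nWrite hello world in C.\n\n### Response:")

# Actual invocation (requires `pip install replicate` and an API token):
# import replicate
# output = replicate.run("wizardlm/wizardcoder-33b-v1.1", input=payload)  # slug is hypothetical
```

Keeping payload construction separate from the network call makes it easy to unit-test the prompt and parameters without an API token.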

Specifications

Parameters: 33B
Architecture: Decoder Only
Specialization: general