LLM Reference

Wizard Coder 15B

About

WizardCoder-15B is a large language model tailored for coding tasks, developed by the WizardLM Team. It builds on a transformer architecture and is fine-tuned from the StarCoder model using the Evol-Instruct method, which evolves a seed instruction set into a larger synthetic dataset to strengthen instruction-following. With 15 billion parameters, it achieved a notable 57.3 pass@1 score on the HumanEval benchmark, outperforming many other open-source coding LLMs at the time of its release. The model handles code generation, completion, and review across various programming languages, though output quality depends on prompt quality and is bounded by the model's training knowledge. Continuous research is underway to improve its performance and address inherent limitations and biases.
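Because the model was instruction-tuned, prompts are typically wrapped in an Alpaca-style template before generation. The sketch below shows that wrapping step only; the exact template wording reflects the format commonly used with WizardCoder releases and should be checked against the model card you deploy.

```python
def build_prompt(instruction: str) -> str:
    """Wrap a coding request in the Alpaca-style instruction template
    commonly used for WizardCoder (verify against the model card)."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n### Response:"
    )

prompt = build_prompt("Write a Python function that reverses a string.")
print(prompt)
```

The resulting string is what you would pass to the model (for example, via a Hugging Face `text-generation` pipeline); generation usually stops when the model emits text after the `### Response:` marker.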

Capabilities

Multimodal
Function Calling
Tool Use
JSON Mode

Specifications

Parameters: 15B
Architecture: Decoder Only
Specialization: General