CodeLlama 34B
About
CodeLlama 34B is a generative text model developed by Meta, tailored for code synthesis and understanding. With 34 billion parameters, it excels in code completion and instruction following, particularly for Python programming. (Unlike the 7B and 13B variants, the 34B model was not trained with the fill-in-the-middle infilling objective.) The model uses an auto-regressive transformer architecture and was trained on a large, diverse corpus of code spanning many programming languages, making it versatile across coding tasks. Licensed for both commercial and research use, CodeLlama 34B gives AI engineers a robust option for integrating advanced code generation into their projects. More details can be found on the model's Hugging Face page.
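As an illustration of the instruction-following use mentioned above, here is a minimal sketch of the Llama-2-style `[INST]` chat template used by the Code Llama Instruct variants (the base CodeLlama-34b model simply takes raw code as its prompt). The sentinel strings follow Meta's published format; the helper function name is hypothetical.

```python
# Sketch (assumption-laden): builds a prompt in the [INST] template used by
# CodeLlama-Instruct models; the helper name build_instruct_prompt is ours.

def build_instruct_prompt(user_message: str, system_prompt: str = "") -> str:
    """Wrap a user request in the [INST] template expected by CodeLlama-Instruct."""
    if system_prompt:
        # System prompts are embedded inside the first user turn.
        user_message = f"<<SYS>>\n{system_prompt}\n<</SYS>>\n\n{user_message}"
    return f"<s>[INST] {user_message} [/INST]"

prompt = build_instruct_prompt("Write a Python function that reverses a string.")
print(prompt)
```

The formatted string is then sent as the raw prompt to a completion endpoint; the model's answer follows the closing `[/INST]` marker.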
Capabilities
Multimodal · Function Calling · Tool Use · JSON Mode
Providers (6)
| Provider | Input (per 1M tokens) | Output (per 1M tokens) | Type |
|---|---|---|---|
| Replicate API | — | — | Serverless |
| Together AI API | $0.80 | $0.80 | Serverless |
| deepinfra API | — | — | Serverless |
| Fireworks AI Platform | — | — | Provisioned |
| IBM watsonx | $1.80 | $1.80 | Serverless |
| Azure OpenAI | — | — | Provisioned |
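Per-1M-token pricing like the table above translates to request cost linearly. A minimal sketch, using the Together AI row ($0.80 per 1M tokens for both input and output) purely as an example; actual billing may differ by provider:

```python
# Sketch: estimate request cost from per-1M-token prices. The function name
# and the example token counts are illustrative, not from any provider SDK.

def estimate_cost_usd(input_tokens: int, output_tokens: int,
                      input_price_per_1m: float, output_price_per_1m: float) -> float:
    """Linear per-token pricing: tokens / 1e6 * price-per-million-tokens."""
    return (input_tokens / 1_000_000) * input_price_per_1m \
         + (output_tokens / 1_000_000) * output_price_per_1m

# A 2,000-token prompt with a 500-token completion at $0.80/1M each way:
cost = estimate_cost_usd(2_000, 500, 0.80, 0.80)
print(f"${cost:.4f}")  # 2,500 tokens at $0.80 per 1M = $0.0020
```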
Benchmark Scores (1)
| Benchmark | Score | Setting | Source |
|---|---|---|---|
| MMLU (Massive Multitask Language Understanding) | 68.9 | 5-shot | Hugging Face Model Card |
Specifications
Family: Code Llama
Released: 2023-08-24
Parameters: 34B
Context: 100K tokens
Architecture: Decoder-only
Specialization: general