CodeLlama 34B
codellama-34b
About
CodeLlama 34B is a generative text model developed by Meta, tailored for code synthesis and understanding. With 34 billion parameters, it excels at code completion, infilling, and instruction following, particularly for Python programming. The model uses an auto-regressive transformer architecture and was trained on a diverse dataset of programming languages, making it versatile across coding tasks. Designed for both commercial and research applications, CodeLlama 34B gives AI engineers a robust tool for integrating advanced code generation capabilities into their projects. More details can be found on the model's Hugging Face page.
CodeLlama 34B has a 100K-token context window.
CodeLlama 34B pricing starts at $0.20 per 1M input tokens and $0.45 per 1M output tokens (the lowest provider rate; see the provider table below).
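At those rates, estimating per-request cost is simple arithmetic. A minimal sketch, assuming the lowest listed rates; the token counts in the example are hypothetical:

```python
def cost_usd(input_tokens: int, output_tokens: int,
             input_rate: float = 0.20, output_rate: float = 0.45) -> float:
    """Estimate request cost in USD. Rates are dollars per 1M tokens."""
    return (input_tokens * input_rate + output_tokens * output_rate) / 1_000_000

# Example: a 2,000-token prompt producing a 500-token completion.
print(cost_usd(2_000, 500))  # → 0.000625
```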
Providers (6)
| Provider | Input (per 1M) | Output (per 1M) | Type |
|---|---|---|---|
| Together AI | $0.80 | $0.80 | Serverless |
| DeepInfra | $0.20 | $0.45 | Serverless |
| Fireworks AI | $0.90 | $0.90 | Provisioned |
| IBM watsonx | $1.80 | $1.80 | Serverless |
| Microsoft Foundry | $1.54 | $1.77 | Provisioned |
| Replicate API | $0.20 | $1.00 | Serverless |
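Because input and output rates differ by provider, the cheapest option depends on a workload's input/output mix. A minimal sketch that ranks the providers above by total cost, using the table's rates (the workload sizes in the example are hypothetical):

```python
# Per-1M-token rates (input, output) taken from the provider table above.
PROVIDERS = {
    "Together AI": (0.80, 0.80),
    "DeepInfra": (0.20, 0.45),
    "Fireworks AI": (0.90, 0.90),
    "IBM watsonx": (1.80, 1.80),
    "Microsoft Foundry": (1.54, 1.77),
    "Replicate API": (0.20, 1.00),
}

def rank_providers(input_m: float, output_m: float) -> list[tuple[str, float]]:
    """Rank providers by cost for a workload given in millions of tokens."""
    costs = {name: in_rate * input_m + out_rate * output_m
             for name, (in_rate, out_rate) in PROVIDERS.items()}
    return sorted(costs.items(), key=lambda kv: kv[1])

# Example: 10M input tokens, 2M output tokens.
for name, cost in rank_providers(10, 2):
    print(f"{name}: ${cost:.2f}")
```

For this mix, DeepInfra comes out cheapest; heavily output-skewed workloads shift the ranking, since rates diverge most on the output side.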
Benchmark Scores (1)
| Benchmark | Score | Version | Source |
|---|---|---|---|
| Massive Multitask Language Understanding (MMLU) | 68.9 | 5-shot | Hugging Face Model Card |
Specifications
Created by
Meta
Large-scale open-source AI for social technologies.