LLM Reference
Microsoft Foundry

CodeLlama 13B on Microsoft Foundry

Code Llama · AI at Meta

Provisioned · Open Source

Get Started with CodeLlama 13B on Microsoft Foundry

Microsoft Foundry offers access to CodeLlama 13B with a 100K context window.

Microsoft Foundry is a unified enterprise AI platform that extends well beyond Azure OpenAI. It serves as a multi-provider hosting and deployment platform for LLMs, supporting models from OpenAI, Anthropic, DeepSeek, xAI, Meta, Mistral, NVIDIA, and others, and it integrates agent services, evaluation, observability, and governance into a single Azure control plane. Key capabilities include a multi-provider model catalog, Model Router for intelligent prompt routing, Foundry Agent Service for building and deploying AI agents with built-in tracing and monitoring, and enterprise-grade governance with RBAC, compliance, and regional deployments. For a broader model catalog that includes Claude, DeepSeek, Grok, Llama, Mistral, and NVIDIA Nemotron, Foundry is the recommended platform over Azure OpenAI.
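As a concrete sketch, a model deployed on Foundry is typically reachable through an OpenAI-style chat-completions endpoint. The URL shape, route, and auth header below are assumptions for illustration; verify them against the connection details shown in your own Foundry project:

```python
# Minimal sketch of a chat-completions request to a Foundry deployment.
# The endpoint path and header layout are assumptions (OpenAI-style schema);
# check your Foundry project's connection details for the real values.
import json
import urllib.request

def build_request(endpoint: str, api_key: str, prompt: str,
                  max_tokens: int = 256) -> urllib.request.Request:
    """Assemble a chat-completions request (assumed OpenAI-style schema)."""
    body = json.dumps({
        "messages": [
            {"role": "system", "content": "You are a helpful coding assistant."},
            {"role": "user", "content": prompt},
        ],
        "max_tokens": max_tokens,
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"{endpoint}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {api_key}"},
        method="POST",
    )

# Hypothetical usage -- substitute your own endpoint and key:
# req = build_request("https://<your-resource>.services.ai.azure.com/models",
#                     api_key="<key>", prompt="Write a binary search in Python.")
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```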

Pricing

Type            Price (per 1M tokens)
Input tokens    $0.81
Output tokens   $0.94
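The per-1M-token prices above make back-of-envelope cost estimates straightforward; a small helper with the table's prices hard-coded as defaults might look like:

```python
# Estimate request cost from the per-1M-token prices in the table above.
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_price: float = 0.81,
                  output_price: float = 0.94) -> float:
    """Return the USD cost of one request at per-1M-token prices."""
    return (input_tokens * input_price + output_tokens * output_price) / 1_000_000

# Example: a 4,000-token prompt with a 1,000-token completion:
# 4000 * 0.81/1e6 + 1000 * 0.94/1e6 = $0.00418
print(f"${estimate_cost(4000, 1000):.5f}")  # → $0.00418
```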

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution

About CodeLlama 13B

CodeLlama 13B is a generative text model developed by Meta, designed specifically for code synthesis and understanding. Released on August 24, 2023, this 13-billion-parameter model excels at general code generation and comprehension, making it suitable for a wide range of programming tasks, including code completion, infilling, and instruction following. It uses an optimized transformer architecture and was trained on a diverse dataset similar to Llama 2's, giving it a robust understanding of programming languages and coding practices. AI engineers can integrate CodeLlama 13B into coding environments and tools for both commercial and research applications, leveraging its capabilities to enhance productivity and streamline the coding process.
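The infilling capability mentioned above works by surrounding the gap with sentinel tokens: the model receives the code before and after the gap and generates the middle. The `<PRE>`/`<SUF>`/`<MID>` layout below follows the published Code Llama fill-in-the-middle format, though exact spacing should be verified against the model card for your deployment:

```python
# Build a fill-in-the-middle prompt for Code Llama infilling.
# Token layout follows the published Code Llama infilling convention;
# verify exact spacing against the model card before relying on it.
def build_infill_prompt(prefix: str, suffix: str) -> str:
    """Compose an infilling prompt; the model generates the missing middle."""
    return f"<PRE> {prefix} <SUF>{suffix} <MID>"

# Hypothetical gap inside a function body:
before = "def fibonacci(n):\n    "
after = "\n    return a"
prompt = build_infill_prompt(before, after)
```

The completion the model returns is then spliced between `before` and `after` to produce the finished function.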