LLM Reference

Stable Code 3B

About

Stable Code 3B, developed by Stability AI, is a 3-billion-parameter large language model tailored for code completion. It offers performance competitive with larger models such as CodeLLaMA 7b while remaining small enough to run efficiently on a typical laptop without a dedicated GPU. Beyond standard code completion, it supports Fill-in-the-Middle (FIM) generation, handles contexts of up to 16,384 tokens, and covers 18 programming languages. Its decoder-only transformer architecture, enhanced with Rotary Position Embeddings, was trained on diverse open-source datasets. The model is free for research and non-commercial use; commercial applications require a Stability AI Membership.
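The Fill-in-the-Middle capability works by arranging the code before and after a gap around sentinel tokens, so the model generates the missing middle. A minimal sketch of building such a prompt, assuming the StarCoder-style FIM tokens (`<fim_prefix>`, `<fim_suffix>`, `<fim_middle>`); verify the exact token strings against the released tokenizer before use:

```python
# Sketch: constructing a Fill-in-the-Middle (FIM) prompt for a code model.
# Assumption: StarCoder-style sentinel tokens, which Stable Code 3B is
# reported to use -- confirm against the model's tokenizer config.

def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Place the code before and after the gap so the model fills the middle."""
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

prompt = build_fim_prompt(
    prefix="def fibonacci(n):\n    ",
    suffix="\n    return a\n",
)
print(prompt)
```

The resulting string would then be passed to the model (e.g. via Hugging Face `transformers`), which completes the function body between the given prefix and suffix.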

Capabilities

Multimodal, Function Calling, Tool Use, JSON Mode

Providers (1)

Provider | Input (per 1M) | Output (per 1M) | Type
Fireworks AI Platform | — | — | Provisioned

Specifications

Parameters: 3B
Context: 16K
Architecture: Decoder Only
Specialization: general