## Pricing
| Type | Price (per 1M tokens) |
|---|---|
| Input tokens | $0.10 |
| Output tokens | $0.10 |
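The per-million-token rates above translate to request cost with simple arithmetic; a minimal sketch (the function name and example token counts are illustrative, not part of the source):

```python
def cost_usd(input_tokens: int, output_tokens: int,
             input_rate: float = 0.10, output_rate: float = 0.10) -> float:
    """Total cost in USD, given per-1M-token rates from the pricing table."""
    return (input_tokens / 1_000_000) * input_rate \
         + (output_tokens / 1_000_000) * output_rate

# Example: 500k input tokens + 200k output tokens at $0.10/1M each
print(f"${cost_usd(500_000, 200_000):.2f}")  # ≈ $0.07
```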
## Capabilities
Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution
## About Stable Code 3B
Stable Code 3B, developed by Stability AI, is a 3-billion-parameter large language model tailored for code completion. It offers performance competitive with larger models such as CodeLlama 7B while remaining small enough to run efficiently on a typical laptop without a dedicated GPU. Beyond standard code completion, it supports Fill-in-the-Middle (FIM) generation and context lengths up to 16,384 tokens, covering 18 programming languages. Its decoder-only transformer architecture, enhanced with Rotary Position Embeddings, was trained on diverse open-source datasets. The model is free for research and non-commercial use; commercial applications require a Stability AI Membership.
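Fill-in-the-Middle means the model completes a gap between a known prefix and suffix rather than only continuing text left to right. A minimal sketch of building such a prompt, assuming the StarCoder-style FIM control tokens (`<fim_prefix>`, `<fim_suffix>`, `<fim_middle>`); the exact special tokens should be verified against the model's tokenizer:

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a Fill-in-the-Middle prompt: the model is asked to
    generate the code that belongs between `prefix` and `suffix`.
    Token names are an assumption (StarCoder convention)."""
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

# Ask the model to fill in the body of a function
prompt = build_fim_prompt("def add(a, b):\n    return ", "\n")
```

The generated text that follows `<fim_middle>` is the infilled code; the caller splices it back between the original prefix and suffix.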
## Model Specs
| Spec | Value |
|---|---|
| Released | 2023-08-14 |
| Parameters | 3B |
| Context | 16K tokens |
| Architecture | Decoder-only |