LLM Reference

Stable Code 3B

stable-code-3b

About

Stable Code 3B, developed by Stability AI, is a 3-billion-parameter large language model tailored for code completion. It offers performance competitive with larger models such as CodeLLaMA 7b while remaining small enough to run efficiently on a typical laptop without a dedicated GPU. Beyond plain code completion, it supports "Fill in the Middle" (FIM) generation and contexts up to 16,384 tokens, covering 18 programming languages. Its decoder-only transformer architecture, enhanced with Rotary Position Embeddings, was trained on diverse open-source datasets. The model is free for research and non-commercial use; commercial applications require a Stability AI Membership.
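The "Fill in the Middle" capability mentioned above is driven by sentinel tokens in the prompt. As a minimal sketch, the helper below assembles a FIM prompt using the `<fim_prefix>`/`<fim_suffix>`/`<fim_middle>` convention common to open code models; the exact token strings are an assumption here, so check the stable-code-3b tokenizer configuration before relying on them.

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a fill-in-the-middle prompt (sketch).

    The sentinel tokens below follow the FIM convention used by several
    open code models; verify the exact strings against the
    stable-code-3b tokenizer config, as they are assumed here.
    """
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

# The model is then asked to generate the code that belongs between
# the prefix and the suffix.
prompt = build_fim_prompt(
    prefix="def add(a, b):\n    ",
    suffix="\n    return result\n",
)
```

A completion request with this prompt would yield the missing middle (here, something like `result = a + b`), which the caller splices back between prefix and suffix.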

Stable Code 3B has a 16K-token context window.
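The Rotary Position Embeddings noted in the About section are what encode token positions across this 16K window. As a minimal pure-Python sketch of the core idea (real implementations rotate whole query/key tensors, not single vectors):

```python
import math

def rope_rotate(vec, position, base=10000.0):
    """Apply rotary position embedding to one vector (minimal sketch).

    Consecutive pairs (vec[2i], vec[2i+1]) are rotated by an angle that
    depends on the token position and the pair index. Rotation preserves
    vector norms, so attention dot products depend on relative position.
    """
    dim = len(vec)
    out = []
    for i in range(0, dim, 2):
        theta = position / (base ** (i / dim))
        cos_t, sin_t = math.cos(theta), math.sin(theta)
        x, y = vec[i], vec[i + 1]
        out.append(x * cos_t - y * sin_t)
        out.append(x * sin_t + y * cos_t)
    return out
```

At position 0 the rotation angle is zero and the vector passes through unchanged; at later positions each pair is rotated further, which is how relative offsets up to the 16,384-token limit are distinguished.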

Stable Code 3B is priced at $0.10 per 1M input tokens and $0.10 per 1M output tokens.
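At a flat per-1M-token rate, estimating spend is a one-line calculation; the helper below is a sketch with the listed rates as defaults.

```python
def completion_cost(input_tokens: int, output_tokens: int,
                    input_rate: float = 0.10,
                    output_rate: float = 0.10) -> float:
    """Estimate USD cost given per-1M-token rates (defaults from the
    pricing listed above)."""
    return (input_tokens * input_rate + output_tokens * output_rate) / 1_000_000

# e.g. a 200K-token prompt plus a 50K-token completion:
# completion_cost(200_000, 50_000) -> 0.025
```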

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution

Providers (1)

Provider      | Input (per 1M) | Output (per 1M) | Type
Fireworks AI  | $0.10          | $0.10           | Provisioned

Specifications

Released: 2023-08-14
Parameters: 3B
Context: 16K
Architecture: Decoder Only
Specialization: General
Training: Fine-tuned

Created by

Stability AI
Open-source generative AI models.

London, United Kingdom
Founded 2020
Website
