
StarCoder2 15B on NVIDIA NIM

StarCoder 2 · ServiceNow Research

Provisioned

Pricing

Type            Price (per 1M tokens)
Input tokens    Free
Output tokens   Free

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution

About StarCoder2 15B

StarCoder2-15B is a large language model built for code generation and understanding. Developed by the BigCode project, it has 15 billion parameters and was trained on The Stack v2, a dataset of over 4 trillion tokens spanning more than 600 programming languages. Its transformer decoder architecture combines grouped-query attention, sliding-window attention, and a Fill-in-the-Middle (FIM) training objective, and supports a context window of 16,384 tokens. Beyond generating and completing code, the model handles tasks such as code summarization and retrieving relevant snippets from natural language queries. Training used NVIDIA's NeMo framework on the Eos supercomputer, and usage is governed by the BigCode OpenRAIL-M license, which permits royalty-free commercial use.
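The Fill-in-the-Middle objective means the model can be prompted with code before and after a gap and asked to generate what belongs in between. A minimal sketch of assembling such a prompt is shown below; the `<fim_prefix>`/`<fim_suffix>`/`<fim_middle>` sentinel tokens follow the StarCoder-family convention and should be verified against the deployed model's tokenizer configuration.

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a Fill-in-the-Middle prompt: the model generates the
    code that belongs between `prefix` and `suffix`.

    Sentinel token names follow the StarCoder-family convention;
    confirm them against the model's tokenizer before relying on them.
    """
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"


# Example: ask the model to fill in the body of a function, given
# its signature (prefix) and its return statement (suffix).
prompt = build_fim_prompt(
    prefix="def average(xs):\n    ",
    suffix="\n    return total / len(xs)\n",
)
```

The completion the model returns after `<fim_middle>` is the inferred middle section, e.g. a line computing `total` in this case.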

Get Started
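A minimal sketch of calling the model through NVIDIA's OpenAI-compatible API, using only the Python standard library. The base URL (`https://integrate.api.nvidia.com/v1`), the model id (`bigcode/starcoder2-15b`), and the `NVIDIA_API_KEY` environment variable are assumptions to check against the NIM catalog documentation; a code model like this is typically served via the completions endpoint rather than the chat endpoint.

```python
import json
import os
import urllib.request

BASE_URL = "https://integrate.api.nvidia.com/v1"  # assumed NIM endpoint
MODEL_ID = "bigcode/starcoder2-15b"               # assumed catalog model id


def build_completion_request(prompt: str, max_tokens: int = 128) -> dict:
    """Build a completions-style payload (code models take raw prompts,
    not chat messages)."""
    return {
        "model": MODEL_ID,
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": 0.2,
    }


def complete(prompt: str) -> str:
    """Send the request; expects an API key in NVIDIA_API_KEY."""
    payload = build_completion_request(prompt)
    req = urllib.request.Request(
        f"{BASE_URL}/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ['NVIDIA_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["text"]


if __name__ == "__main__":
    # Network call: requires a valid key and endpoint.
    print(complete("def fibonacci(n):"))
```

Since both input and output tokens are free on this provisioned tier, the request above incurs no per-token cost.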

Model Specs

Released        2024-07-04
Parameters      15B
Context         16K
Architecture    Decoder-only

Related Models on NVIDIA NIM