StarCoder2 15B
About
StarCoder2-15B is a 15-billion-parameter large language model for code generation and understanding, developed by the BigCode project. It was trained on over 4 trillion tokens from The Stack v2, a dataset spanning more than 600 programming languages. The model is a transformer decoder with grouped-query attention, a 4,096-token sliding attention window, a 16,384-token context window, and a Fill-in-the-Middle training objective. Beyond generating and completing code, it handles tasks such as code summarization and retrieving relevant snippets from natural language queries. Training used NVIDIA's NeMo framework on the Eos supercomputer, and the model is released under the BigCode OpenRAIL-M license, which supports royalty-free and commercial use.
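The sketch below shows one way to exercise both capabilities mentioned above, plain completion and Fill-in-the-Middle, using the Hugging Face transformers library and the public bigcode/starcoder2-15b checkpoint. The FIM token names follow the published StarCoder tokenizer convention; verify them against the checkpoint's tokenizer before relying on them.

```python
# Minimal sketch: code completion and Fill-in-the-Middle with StarCoder2-15B,
# assuming the public bigcode/starcoder2-15b checkpoint on Hugging Face.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigcode/starcoder2-15b"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(
    checkpoint, torch_dtype=torch.bfloat16, device_map="auto"
)

# Plain left-to-right completion.
prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

# Fill-in-the-Middle: the model generates the span between prefix and suffix.
# Token names (<fim_prefix>, <fim_suffix>, <fim_middle>) are assumed from the
# StarCoder tokenizer convention.
fim_prompt = (
    "<fim_prefix>def average(values):\n    total = "
    "<fim_suffix>\n    return total / len(values)<fim_middle>"
)
inputs = tokenizer(fim_prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

At 15B parameters the model needs a large GPU (or multi-GPU `device_map="auto"` sharding) even in bfloat16; smaller StarCoder2 checkpoints follow the same interface if memory is a constraint.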
Providers (3)
| Provider | Input (per 1M tokens) | Output (per 1M tokens) | Type |
|---|---|---|---|
| Fireworks AI Platform | — | — | Provisioned |
| deepinfra API | — | — | Serverless |
| NVIDIA NIM | — | — | Provisioned |