LLM Reference

StarCoder2 15B

Deprecated
Open Source

About

StarCoder2-15B is a large language model for code generation and understanding. Developed by the BigCode project, it has 15 billion parameters and was trained on The Stack v2, a dataset of over 4 trillion tokens spanning more than 600 programming languages. Its transformer decoder architecture combines grouped-query attention, sliding-window attention, and a Fill-in-the-Middle (FIM) training objective, and supports a context window of 16,384 tokens. Beyond generating and completing code, the model handles tasks such as code summarization and retrieving relevant snippets from natural-language queries. Training used NVIDIA's NeMo framework on the Eos supercomputer, and usage is governed by the BigCode OpenRAIL-M license, which permits royalty-free and commercial use.
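As a rough sketch of how the FIM objective surfaces at inference time, the snippet below loads the public bigcode/starcoder2-15b checkpoint with Hugging Face transformers and asks the model to fill the gap between a prefix and a suffix. The <fim_prefix>/<fim_suffix>/<fim_middle> token names follow the StarCoder family convention; verify them against the model card before relying on them.

```python
# Minimal sketch, assuming the Hugging Face checkpoint "bigcode/starcoder2-15b"
# and the StarCoder-style FIM special tokens (check the model card).
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigcode/starcoder2-15b"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, device_map="auto")

# Fill-in-the-Middle: the model generates the code that belongs between
# the prefix and the suffix.
prompt = (
    "<fim_prefix>def average(xs):\n    "
    "<fim_suffix>\n    return total / len(xs)<fim_middle>"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)

# Decode only the newly generated middle span.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))
```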

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution

Providers (3)

Provider     | Input (per 1M) | Output (per 1M) | Type
Fireworks AI | $0.20          | $0.20           | Provisioned
DeepInfra    | $0.20          | $0.60           | Serverless
NVIDIA NIM   | n/a            | n/a             | Provisioned
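For the serverless listing, a call might look like the hedged sketch below, which uses the OpenAI Python SDK against DeepInfra's OpenAI-compatible endpoint. The base URL, the model identifier, and the availability of the plain completions route are assumptions to check against the provider's documentation.

```python
# Hypothetical serverless call via an OpenAI-compatible endpoint; base URL,
# model id, and completions support are assumptions -- check provider docs.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.deepinfra.com/v1/openai",  # assumed endpoint
    api_key="YOUR_API_KEY",
)

resp = client.completions.create(
    model="bigcode/starcoder2-15b",  # assumed model id on the provider
    prompt="# Python function that reverses a singly linked list\n",
    max_tokens=128,
    temperature=0.2,
)
print(resp.choices[0].text)
```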

Benchmark Scores (4)

Benchmark                                       | Score | Version | Source
Google-Proof Q&A (GPQA)                         | 54.3  | diamond | research
HellaSwag                                       | 91.7  | 10-shot | research
HumanEval                                       | 82.4  | pass@1  | research
Massive Multitask Language Understanding (MMLU) | 79.8  | 5-shot  | research
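The HumanEval pass@1 figure is conventionally computed with the unbiased pass@k estimator from Chen et al. (2021): generate n samples per problem, count the c that pass the unit tests, and estimate the probability that at least one of k random draws succeeds. A direct Python transcription of that estimator:

```python
# Unbiased pass@k estimator from Chen et al. (2021), "Evaluating Large
# Language Models Trained on Code": n samples per problem, c passing.
import numpy as np

def pass_at_k(n: int, c: int, k: int) -> float:
    """Probability that at least one of k samples drawn from n passes."""
    if n - c < k:
        return 1.0  # too few failures for k draws to all fail
    return 1.0 - np.prod(1.0 - k / np.arange(n - c + 1, n + 1))

# e.g. 200 samples, 93 passing: for k=1 this reduces to c/n = 0.465
print(pass_at_k(n=200, c=93, k=1))
```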

Specifications

Released: 2024-07-04
Parameters: 15B
Context: 16K
Architecture: Decoder-only
Specialization: general
Training: fine-tuning
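To make the attention scheme concrete: with sliding-window attention, each token attends only to a fixed-size window of preceding tokens rather than the full 16K context. The sketch below builds the combined causal and local mask; the 4,096-token window is the figure reported in the StarCoder2 paper and is an assumption relative to this page.

```python
# Illustrative causal sliding-window attention mask: each position attends
# to itself and at most the previous (window - 1) tokens. StarCoder2's
# reported window is 4,096; a tiny example is shown for readability.
import numpy as np

def sliding_window_causal_mask(seq_len: int, window: int) -> np.ndarray:
    i = np.arange(seq_len)[:, None]  # query positions
    j = np.arange(seq_len)[None, :]  # key positions
    causal = j <= i                  # no attention to future tokens
    local = (i - j) < window         # no attention beyond the window
    return causal & local            # True where attention is allowed

print(sliding_window_causal_mask(seq_len=8, window=4).astype(int))
```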

Created by

Empowering responsible AI for efficient workflows

Santa Clara, California, United States
Founded 2003