LLM Reference
Fireworks AI

StarCoder2 7B on Fireworks AI

StarCoder 2 · ServiceNow Research

Deployment: Serverless · Provisioned

Pricing

Type            Price (per 1M tokens)
Input tokens    $0.20
Output tokens   $0.20

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution

About StarCoder2 7B

StarCoder2 7B is a 7-billion-parameter large language model engineered for code generation. It was trained on The Stack v2, a dataset of roughly 3.5 trillion tokens spanning 17 programming languages, including Python, Java, and JavaScript. The model uses Grouped Query Attention with a 16,384-token context window and sliding window attention of 4,096 tokens, allowing it to handle long, complex coding tasks. StarCoder2 7B excels at code completion, code summarization, and generating code snippets from prompts. It was built with responsible data usage in mind and is designed to avoid directly reproducing copyrighted code, making it a reliable tool for developers.
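
The sliding window attention described above can be pictured as a causal mask in which each token attends only to itself and the previous window-1 tokens. The sketch below is purely illustrative (it is not StarCoder2 or Fireworks code) and builds that mask with NumPy:

```python
import numpy as np

def sliding_window_mask(n: int, window: int) -> np.ndarray:
    """Boolean attention mask: position i may attend to position j
    only when j <= i (causal) and j > i - window (sliding window).
    For StarCoder2 7B the window would be 4,096 over a 16,384-token
    context; small values are used here for illustration."""
    i = np.arange(n)[:, None]
    j = np.arange(n)[None, :]
    return (j <= i) & (j > i - window)

# With n=6 and window=3, token 5 attends only to tokens 3, 4, and 5.
```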

Get Started
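
A minimal sketch of querying the model through Fireworks AI's OpenAI-compatible completions endpoint. The model path `accounts/fireworks/models/starcoder2-7b` and the exact parameter defaults are assumptions; check your account's model listing before use. Only the Python standard library is required:

```python
import json
import os
import urllib.request

FIREWORKS_URL = "https://api.fireworks.ai/inference/v1/completions"
MODEL_ID = "accounts/fireworks/models/starcoder2-7b"  # assumed model path

def build_request(prompt: str, max_tokens: int = 128) -> urllib.request.Request:
    """Build an HTTP POST request for a raw code-completion call.

    The API key is read from the FIREWORKS_API_KEY environment variable.
    """
    payload = {
        "model": MODEL_ID,
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": 0.2,  # low temperature suits code completion
    }
    return urllib.request.Request(
        FIREWORKS_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ.get('FIREWORKS_API_KEY', '')}",
        },
        method="POST",
    )

# Example usage (requires FIREWORKS_API_KEY to be set):
# with urllib.request.urlopen(build_request("def fibonacci(n):")) as resp:
#     print(json.loads(resp.read())["choices"][0]["text"])
```

Because this is a base code model rather than a chat model, the plain completions endpoint (a bare prompt, no chat roles) is the natural fit.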

Model Specs

Released        2024-07-04
Parameters      7B
Context         8K
Architecture    Decoder Only
