LLM Reference

DeepSeek Coder 7B V1.5

About

DeepSeek Coder 7B Base V1.5 is a large language model in the DeepSeek Coder series, tailored for code completion, generation, and understanding across multiple programming languages. It was trained on 2 trillion tokens, 87% of which are code, and uses a Llama-style decoder-only architecture to achieve strong results on coding benchmarks. A 16K token context window supports project-level code handling, and the model works in both English and Chinese. It is released under a permissive license that allows both research and commercial use.

Capabilities

Multimodal, Function Calling, Tool Use, JSON Mode

Providers (2)

Provider                 Input (per 1M)   Output (per 1M)   Type
Alibaba Cloud PAI-EAS                                       Serverless
Fireworks AI Platform                                       Provisioned
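As a rough sketch of how a model like this might be called through one of the listed providers, the snippet below builds an OpenAI-style completion request payload for a base (non-chat) model. The model identifier, endpoint URL, and parameter defaults are assumptions for illustration, not values confirmed by this page; check the provider's own catalog before use.

```python
import json

# Hypothetical identifiers -- replace with the provider's actual values.
MODEL_ID = "deepseek-coder-7b-base-v1.5"
ENDPOINT = "https://example-provider.invalid/v1/completions"

def build_completion_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build an OpenAI-style completion payload for a base (non-chat) model."""
    return {
        "model": MODEL_ID,
        "prompt": prompt,
        "max_tokens": max_tokens,
        # Low temperature tends to suit deterministic code completion.
        "temperature": 0.2,
    }

payload = build_completion_request("def quicksort(arr):")
print(json.dumps(payload, indent=2))
# The payload would then be POSTed to ENDPOINT with an Authorization header.
```

A base model like this completes raw text, so the prompt is typically an unfinished code fragment rather than a chat message list.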

Specifications

Released        2024-02-04
Parameters      7B
Architecture    Decoder Only
Specialization  Code
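The 7B parameter count above also gives a back-of-envelope serving footprint: each parameter takes 2 bytes in fp16/bf16, so the weights alone need roughly 13 GiB (activations and KV cache add more). A minimal sketch of that arithmetic:

```python
def model_memory_gib(params_billions: float, bytes_per_param: int) -> float:
    """Weight-only memory footprint in GiB; ignores activations and KV cache."""
    return params_billions * 1e9 * bytes_per_param / 1024**3

# 7B parameters at 2 bytes each (fp16/bf16) -- a rough lower bound for serving.
print(round(model_memory_gib(7, 2), 1))  # ~13.0 GiB
```

Quantizing to 8-bit or 4-bit weights would halve or quarter this figure, at some cost in output quality.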