DeepSeek Coder V2

About

DeepSeek Coder V2 is an open-source family of Mixture-of-Experts (MoE) code language models. It builds on the DeepSeek V2 base model, adding notable improvements in code-specific tasks and reasoning. The models were further pre-trained on an additional 6 trillion tokens, strengthening their coding and mathematical reasoning while maintaining strong general language performance. Key enhancements include support for 338 programming languages, a significant jump from the 86 supported in earlier iterations, and an increased context length of up to 128K tokens. The family spans smaller "Lite" variants for projects with limited computational resources and larger models for complex tasks. The models are available on Hugging Face and can also be accessed through DeepSeek's API or chat interface, making them easily deployable across diverse coding environments.
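As a concrete illustration of the API access mentioned above, the sketch below builds an OpenAI-compatible chat-completion payload for a coding query. The endpoint URL and model name are assumptions not taken from this page; verify both against DeepSeek's current API documentation before use.

```python
import json

# Assumed endpoint for DeepSeek's OpenAI-compatible API (verify against the docs).
API_URL = "https://api.deepseek.com/chat/completions"


def build_request(prompt: str, model: str = "deepseek-coder") -> dict:
    """Build the JSON payload for a single-turn coding query.

    The model name "deepseek-coder" is an assumption; check DeepSeek's
    model list for the identifier that maps to DeepSeek Coder V2.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful coding assistant."},
            {"role": "user", "content": prompt},
        ],
        "max_tokens": 512,
    }


# Inspect the payload that would be POSTed (with an Authorization header)
# to API_URL by any HTTP client.
payload = build_request("Write a Python function that reverses a string.")
print(json.dumps(payload, indent=2))
```

The same payload shape works with any OpenAI-compatible client by pointing its base URL at the DeepSeek endpoint.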
