LLM Reference

DeciCoder 1B

About

DeciCoder 1B is a 1-billion-parameter, open-source large language model focused on code generation. It targets efficient, accurate code completion in Python, Java, and JavaScript, using Grouped Query Attention and a 2048-token context window. Trained on a substantial dataset with a Fill-in-the-Middle objective, it offers high throughput, especially when paired with Deci's Infery LLM inference engine. It can complete single or multiple lines of code, but because it is not an instruction-following model, instruction-style prompts may produce suboptimal results. Performance is benchmarked on HumanEval, with accuracy varying by language, and the model is released under the Apache 2.0 license.
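Fill-in-the-Middle training means the model can complete a gap in code given both the text before it (prefix) and after it (suffix). A minimal sketch of assembling such a prompt, assuming StarCoder-style sentinel tokens (`<fim_prefix>`, `<fim_suffix>`, `<fim_middle>`); the exact tokens DeciCoder expects should be confirmed against its tokenizer's special tokens before use:

```python
# Sketch: build a Fill-in-the-Middle prompt from a code snippet with a gap.
# The sentinel tokens below are StarCoder-style and are an assumption here,
# not confirmed for DeciCoder specifically.

FIM_PREFIX = "<fim_prefix>"
FIM_SUFFIX = "<fim_suffix>"
FIM_MIDDLE = "<fim_middle>"

def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Arrange prefix and suffix so the model generates the missing middle.

    The prompt ends with the middle sentinel, signalling the model to emit
    the code that belongs between `prefix` and `suffix`.
    """
    return f"{FIM_PREFIX}{prefix}{FIM_SUFFIX}{suffix}{FIM_MIDDLE}"

prompt = build_fim_prompt(
    prefix="def mean(xs):\n    total = ",
    suffix="\n    return total / len(xs)\n",
)
```

The resulting string would be passed to the model as-is; the completion it generates is the "middle" that fills the gap.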

Capabilities

Multimodal
Function Calling
Tool Use
JSON Mode

Providers (1)

Provider      | Input (per 1M) | Output (per 1M) | Type
Azure OpenAI  |                |                 | Provisioned

Specifications

Family: DeciCoder
Parameters: 1B
Architecture: Decoder-only
Specialization: general