Jamba Large 1.6
jamba-large-1.6
Open Source
About
Jamba Large 1.6 is AI21 Labs' flagship hybrid SSM-Transformer mixture-of-experts (MoE) model, with 94B active parameters (398B total) and a 256K-token context window, released March 6, 2025. It outperforms Mistral Large 2, Llama 3.3 70B, and Command R+ on quality benchmarks, and supports function calling, structured JSON output, and grounded generation for enterprise applications.
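To illustrate the structured JSON output support, here is a minimal sketch of building an OpenAI-style chat request body for the model. The endpoint shape and field names (`messages`, `response_format`) are assumptions based on common chat-completion APIs, not taken from AI21's official documentation; consult the provider's API reference for the exact schema.

```python
import json

def build_request(prompt: str) -> dict:
    """Build a chat-completion request body asking jamba-large-1.6
    for structured JSON output (field names are assumptions)."""
    return {
        "model": "jamba-large-1.6",
        "messages": [{"role": "user", "content": prompt}],
        # Structured-output mode: constrain the reply to valid JSON.
        # The "response_format" field is a common convention, assumed here.
        "response_format": {"type": "json_object"},
        "max_tokens": 512,
    }

body = build_request("List three EU capitals as a JSON array under the key 'capitals'.")
print(json.dumps(body, indent=2))
```

The same body could be POSTed to whichever chat endpoint hosts the model; only the payload construction is shown here.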
Capabilities
Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution, Prompt Caching, Batch API, Audio, Fine-tuning