LLM Reference

Jamba Large 1.6

jamba-large-1.6

Open Source

About

Jamba Large 1.6 is AI21 Labs' flagship hybrid SSM-Transformer mixture-of-experts (MoE) model, with 94B active parameters (398B total) and a 256K-token context window, released on March 6, 2025. AI21 reports that it outperforms Mistral Large 2, Llama 3.3 70B, and Command R+ on quality benchmarks. It supports function calling, structured JSON output, and grounded generation for enterprise applications.
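Since the model exposes structured JSON output through AI21's API, a minimal chat sketch follows. It assumes AI21's OpenAI-style chat completions endpoint and response_format parameter, with the jamba-large-1.6 model id shown above; verify the exact path and payload shape against AI21's current documentation.

```python
import os

import requests

# Assumed endpoint; AI21's chat completions API is OpenAI-style,
# but verify the path against current AI21 documentation.
API_URL = "https://api.ai21.com/studio/v1/chat/completions"

payload = {
    "model": "jamba-large-1.6",
    "messages": [
        {"role": "user", "content": "Summarize the Jamba architecture as JSON."}
    ],
    # Structured JSON output, assuming OpenAI-style response_format support.
    "response_format": {"type": "json_object"},
}

resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {os.environ['AI21_API_KEY']}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```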


Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution · Prompt Caching · Batch API · Audio · Fine-tuning
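Of the tags above, function calling and tool use are the ones the About section confirms. The sketch below assumes AI21 accepts OpenAI-style tools payloads on the same chat endpoint; the get_order_status tool is hypothetical, for illustration only.

```python
# Hypothetical tool definition for illustration; assumes AI21 accepts
# OpenAI-style "tools" payloads on its chat completions endpoint.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_order_status",  # hypothetical helper, not an AI21 API
            "description": "Look up the status of a customer order by id.",
            "parameters": {
                "type": "object",
                "properties": {
                    "order_id": {
                        "type": "string",
                        "description": "Order identifier",
                    }
                },
                "required": ["order_id"],
            },
        },
    }
]

payload = {
    "model": "jamba-large-1.6",
    "messages": [{"role": "user", "content": "Where is order 8841?"}],
    "tools": tools,
}
# POST this payload as in the earlier sketch; any tool call the model makes
# should arrive under response["choices"][0]["message"]["tool_calls"].
```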


Specifications

Family: Jamba 1.6
Released: 2025-03-06
Parameters: 398B total (94B active)
Context: 256K tokens
Architecture: Decoder-only
Specialization: General
License: Jamba Open Model
Training: Pretrained
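Because the weights ship under the Jamba Open Model license, the model can also be run locally. The sketch below assumes the Hugging Face repo id ai21labs/AI21-Jamba-Large-1.6 (check AI21's organization page for the exact name) and a multi-GPU node; a 398B-parameter checkpoint does not fit on a single GPU even at reduced precision.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id; confirm against AI21's Hugging Face organization.
model_id = "ai21labs/AI21-Jamba-Large-1.6"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" shards the checkpoint across available GPUs; production
# serving of a model this size typically uses a stack like vLLM instead.
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

inputs = tokenizer(
    "Jamba is a hybrid SSM-Transformer model that", return_tensors="pt"
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```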

Created by

AI21 Labs
Developing AI for natural language understanding.

Tel Aviv, Israel
Founded 2017
Website: ai21.com