LLM Reference

Jamba Mini 1.6

jamba-mini-1.6

Open Source

About

Jamba Mini 1.6 is AI21 Labs' hybrid SSM-Transformer mixture-of-experts (MoE) model, with 12B active parameters (52B total) and a 256K-token context window, released March 6, 2025. It outperforms Ministral 8B, Llama 3.1 8B, and Command R7B on quality benchmarks, and supports function calling, structured JSON output, and grounded generation for enterprise applications.

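The function-calling and structured-output capabilities listed on this card are typically exercised through a chat-completions style request. Below is a minimal sketch of building such a request body; the field names follow the common OpenAI-style schema, and the exact parameters accepted by AI21's API are an assumption here, not something this page confirms:

```python
import json

# Sketch of a structured-JSON chat request for jamba-mini-1.6.
# Field names ("messages", "response_format", etc.) are assumed to
# mirror the common OpenAI-style schema; check the AI21 docs for
# the authoritative parameter set.
def build_request(user_prompt: str) -> str:
    body = {
        "model": "jamba-mini-1.6",
        "messages": [
            {"role": "user", "content": user_prompt},
        ],
        # Request well-formed JSON output (structured-output capability).
        "response_format": {"type": "json_object"},
        "max_tokens": 256,
    }
    return json.dumps(body)

payload = build_request("List three EU capitals as a JSON object.")
```

The serialized `payload` would then be POSTed to the provider's chat-completions endpoint with an API key; the endpoint URL is omitted here since this page does not state it.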

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution, Prompt Caching, Batch API, Audio, Fine-tuning

Specifications

Family: Jamba 1.6
Released: 2025-03-06
Parameters: 52B (12B active)
Context: 256K
Architecture: Decoder Only
Specialization: General
License: Jamba Open Model
Training: Pretrained

Created by

AI21 Labs
Developing AI for natural language understanding.

Tel Aviv, Israel
Founded 2017