LLM Reference

Jamba Large 1.7

jamba-large-1.7

Open Source

About

Jamba Large 1.7 is AI21 Labs' latest hybrid Mamba-Transformer model, offering improvements in grounding, instruction following, and structured output generation. It has a 256K-token context window.

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution
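To illustrate the structured-outputs capability, the sketch below builds a request payload that constrains the model to return JSON matching a schema. This assumes a generic OpenAI-compatible chat-completions request shape; the field names (`response_format`, `json_schema`) and the schema-extraction task are illustrative assumptions, not AI21's documented SDK.

```python
import json

# Hypothetical request payload for jamba-large-1.7 via an
# OpenAI-compatible chat-completions endpoint (field names assumed).
payload = {
    "model": "jamba-large-1.7",
    "messages": [
        {"role": "system", "content": "Extract the requested fields as JSON."},
        {"role": "user", "content": "AI21 Labs was founded in 2017 in Tel Aviv."},
    ],
    # Structured-output constraint: require JSON conforming to this schema.
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "company_info",
            "schema": {
                "type": "object",
                "properties": {
                    "name": {"type": "string"},
                    "founded": {"type": "integer"},
                    "city": {"type": "string"},
                },
                "required": ["name", "founded", "city"],
            },
        },
    },
}

print(json.dumps(payload, indent=2))
```

With a schema constraint like this, a compliant server returns a message whose content parses as JSON with exactly the `name`, `founded`, and `city` keys, which removes the need for brittle post-hoc parsing of free-form text.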

Specifications

Family: Jamba 1.7
Released: 2026-02-01
Context: 256K
Architecture: Decoder Only
Specialization: General
License: Apache 2.0
Training: Pretrained

Created by

AI21 Labs

Developing AI for natural language understanding.

Tel Aviv, Israel
Founded 2017