Jamba 1.5

AI21 Labs
About

The Jamba 1.5 family from AI21 Labs is built on a hybrid SSM-Transformer architecture that combines the strengths of Mamba and Transformer designs to balance performance and efficiency. It comprises two models, Mini and Large, with 12B and 94B active parameters respectively. Both feature a 256K-token context window, enabling long-context handling, speed, and quality that AI21 claims surpass other models in the same size class. Designed for enterprise use, the models offer developer-friendly capabilities such as function calling, structured JSON output, and document processing. They support a wide range of languages, including English, Spanish, French, Portuguese, Italian, Dutch, German, Arabic, and Hebrew, and are available on platforms such as Hugging Face, making Jamba 1.5 suitable for diverse enterprise applications.
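In a typical function-calling workflow like the one described above, the model emits a structured JSON object naming a tool and its arguments, which the calling application parses and dispatches. A minimal client-side sketch of that parsing step (the tool name, argument shape, and raw output here are illustrative assumptions, not the actual AI21 response format):

```python
import json

# Hypothetical raw model output containing a tool call (illustrative only;
# consult the AI21 documentation for the real response schema).
raw_output = '{"name": "get_weather", "arguments": {"city": "Paris"}}'

def parse_tool_call(text):
    """Parse a JSON tool-call string into a (name, arguments) pair."""
    call = json.loads(text)
    return call["name"], call.get("arguments", {})

name, args = parse_tool_call(raw_output)
print(name, args)  # get_weather {'city': 'Paris'}
```

The structured-JSON-output guarantee matters precisely because it makes this parsing step reliable: the caller can `json.loads` the model's reply without fragile regex extraction.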
