LLM Reference

Jais 13B

About

Jais 13B is an advanced, open-source bilingual large language model developed by Inception, MBZUAI, and Cerebras Systems, designed for Arabic and English language processing. It employs a transformer-based decoder-only architecture similar to GPT-3 and has 13 billion parameters. The model incorporates enhancements such as the SwiGLU activation non-linearity and ALiBi (Attention with Linear Biases) position embeddings, which improve model quality and handling of longer contexts. It was trained on 72 billion Arabic tokens and 279 billion English/code tokens (the Arabic data is iterated over for roughly 1.6 epochs, for about 395 billion tokens in total) using the Condor Galaxy 1 supercomputer, and it outperforms other open Arabic LLMs on Arabic benchmarks. Despite its capabilities, users must account for potential biases and limitations related to language proficiency, malicious use, and the handling of sensitive information.
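As a sketch of the two architectural enhancements mentioned above, the snippet below implements the SwiGLU activation and the standard ALiBi head slopes and bias matrix in pure Python. The dimensions, function names, and scalar formulation are illustrative and are not taken from the Jais codebase.

```python
import math

def swish(x):
    # Swish / SiLU activation: x * sigmoid(x)
    return x * (1.0 / (1.0 + math.exp(-x)))

def swiglu(x, w, v):
    # SwiGLU gating, shown here for scalar projections w and v:
    # SwiGLU(x) = Swish(x*w) * (x*v)
    return swish(x * w) * (x * v)

def alibi_slopes(n_heads):
    # ALiBi assigns each head a fixed slope m_i = 2^(-8i/n)
    # (for n_heads a power of two, i = 1..n).
    return [2 ** (-8 * (i + 1) / n_heads) for i in range(n_heads)]

def alibi_bias(n_heads, seq_len):
    # Per-head bias added to attention scores:
    # bias[h][q][k] = -slope_h * (q - k), the query-key distance.
    slopes = alibi_slopes(n_heads)
    return [
        [[-m * (q - k) for k in range(seq_len)] for q in range(seq_len)]
        for m in slopes
    ]
```

Because the bias grows linearly with distance, ALiBi needs no learned position embeddings and extrapolates to sequences longer than those seen in training.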

Capabilities

Multimodal · Function Calling · Tool Use · JSON Mode

Providers (1)

| Provider | Input (per 1M) | Output (per 1M) | Type |
| --- | --- | --- | --- |
| IBM watsonx | $1.80 | $1.80 | Serverless |
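At the listed rate of $1.80 per million tokens for both input and output, the cost of a request can be estimated as below; the function name and defaults are illustrative, not part of any provider API.

```python
def estimate_cost_usd(input_tokens, output_tokens,
                      input_rate=1.80, output_rate=1.80):
    # Rates are quoted in USD per 1M tokens; here input and
    # output happen to be priced identically.
    return (input_tokens * input_rate + output_tokens * output_rate) / 1_000_000

# e.g. a 2,000-token prompt with a 500-token completion
print(estimate_cost_usd(2_000, 500))  # → 0.0045
```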

Specifications

Family: Jais
Parameters: 13B
Architecture: Decoder-only
Specialization: General