LLM Reference

Jais 30B

About

Jais 30B is a leading bilingual large language model developed by Core42 and Cerebras Systems for both Arabic and English. With 30 billion parameters, it uses a transformer-based, decoder-only architecture with SwiGLU non-linearity and ALiBi position embeddings, and handles contexts of up to 8,192 tokens.

The model excels at text generation, conversational AI, summarization, and translation. It was trained on an extensive dataset of 126 billion Arabic tokens, 251 billion English tokens, and 50 billion code tokens, and is released open source under the Apache 2.0 license, facilitating community collaboration. Despite its capabilities, it may exhibit biases, and human oversight is recommended.
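The ALiBi position embeddings mentioned above replace learned positional encodings with a distance-proportional penalty added directly to attention scores, which is what lets such models extrapolate to longer contexts. A minimal NumPy sketch of the idea (the function names are illustrative, not taken from any Jais codebase; the power-of-two slope schedule is the simple case from the ALiBi paper):

```python
import numpy as np

def alibi_slopes(n_heads):
    """Head-specific slopes m_i = 2^(-8i/n) for heads i = 1..n
    (simple case where n_heads is a power of two)."""
    return [2.0 ** (-8.0 * i / n_heads) for i in range(1, n_heads + 1)]

def alibi_bias(seq_len, n_heads):
    """Per-head additive attention bias: slope * (j - i) for past keys,
    -inf for future keys (causal mask). Shape: (n_heads, seq_len, seq_len)."""
    positions = np.arange(seq_len)
    relative = (positions[None, :] - positions[:, None]).astype(float)  # j - i
    relative[relative > 0] = -np.inf  # block attention to future positions
    slopes = np.array(alibi_slopes(n_heads))[:, None, None]
    return slopes * relative  # positive slope keeps -inf as -inf
```

The bias is simply added to the query-key scores before softmax, so no position vectors are ever added to the token embeddings.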

Capabilities

Multimodal · Function Calling · Tool Use · JSON Mode

Providers (1)

Provider        Input (per 1M)   Output (per 1M)   Type
Azure OpenAI    $3.20            $9.71             Serverless
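At the serverless rates listed above, the cost of a request is input_tokens / 1M × $3.20 plus output_tokens / 1M × $9.71. A small sketch of that arithmetic (the function name is illustrative):

```python
# Rates from the provider table above (USD per 1M tokens, serverless).
INPUT_RATE = 3.20
OUTPUT_RATE = 9.71

def estimate_cost(input_tokens, output_tokens):
    """Estimated USD cost of one request at the listed serverless rates."""
    return input_tokens / 1e6 * INPUT_RATE + output_tokens / 1e6 * OUTPUT_RATE

# Example: a 500k-token input with a 100k-token completion costs about $2.57.
print(round(estimate_cost(500_000, 100_000), 3))
```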

Specifications

Family          Jais
Parameters      30B
Architecture    Decoder Only
Specialization  General