LLM Reference

DBRX Base

Open Source

About

A 132B-parameter Mixture-of-Experts foundation model (36B parameters active per input) from Databricks Mosaic AI, trained on 12 trillion tokens of text and code. This is the base (pre-trained) version, without instruction tuning.
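The gap between 132B total and 36B active parameters comes from Mixture-of-Experts routing: a small router picks a few expert sub-networks per token, so most weights sit idle on any given forward pass. A minimal sketch of top-k expert routing, assuming DBRX's reported 4-of-16 expert configuration (the router logits below are made up for illustration):

```python
import math

def top_k_route(router_logits, k=4):
    """Select the k highest-scoring experts for one token and
    renormalize their softmax weights, as a MoE layer does.
    Returns a list of (expert_index, weight) pairs."""
    # Softmax over all experts (shift by max for numerical stability).
    m = max(router_logits)
    exps = [math.exp(x - m) for x in router_logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Keep only the top-k experts and renormalize so weights sum to 1.
    top = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
    weight_sum = sum(probs[i] for i in top)
    return [(i, probs[i] / weight_sum) for i in top]

# 16 experts, as DBRX reportedly uses; only 4 receive this token.
logits = [0.2, 1.5, -0.3, 2.0, 0.0, 1.1, 0.4, -1.0,
          0.9, 2.3, -0.5, 0.7, 1.8, 0.1, -0.2, 0.6]
for expert, weight in top_k_route(logits):
    print(expert, round(weight, 3))
```

Because only the selected experts' feed-forward weights are read and multiplied, compute per token scales with the 36B active parameters rather than the full 132B.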

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, JSON Mode, Code Execution

Specifications

Family: DBRX
Released: 2024-03-12
Parameters: 132B
Context: 32k
License: Databricks Open Model License