LLM Reference

Mixtral 8x22B Instruct v0.3

Open Source

About

An updated instruction-tuned version of Mistral AI's Mixtral 8x22B mixture-of-experts (MoE) model. Version 0.3 improves instruction following and function calling.

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, JSON Mode, Code Execution
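Since the capability tags include Function Calling and JSON Mode, a typical client defines a tool schema, sends it with the request, and parses the JSON tool call the model returns. A minimal sketch, assuming an OpenAI-style schema and an illustrative `get_weather` tool (the exact wire format depends on the provider):

```python
import json

# Hypothetical tool schema (OpenAI-style; illustrative, not the provider's exact format).
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

def parse_tool_call(raw: str) -> tuple[str, dict]:
    """Parse a JSON tool call emitted by the model in JSON mode."""
    call = json.loads(raw)
    return call["name"], call["arguments"]

# Example model output (illustrative, not an actual completion):
raw = '{"name": "get_weather", "arguments": {"city": "Paris"}}'
name, args = parse_tool_call(raw)
```

JSON mode constrains the model to emit valid JSON, so `json.loads` on the raw completion is expected to succeed; a production client would still catch `json.JSONDecodeError` for safety.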

Providers (1)

Provider        Input (per 1M)   Output (per 1M)   Type
Replicate API   $2               $2                Serverless
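The per-1M-token prices above translate directly into a cost estimate for a given request. A small sketch using the table's $2/$2 Replicate API rates (the token counts are hypothetical):

```python
# Replicate API pricing from the table above: $2 per 1M tokens, input and output.
INPUT_PER_M = 2.00
OUTPUT_PER_M = 2.00

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost for one request at the listed per-1M-token rates."""
    return input_tokens / 1e6 * INPUT_PER_M + output_tokens / 1e6 * OUTPUT_PER_M

# e.g. a 50K-token prompt with a 2K-token completion:
cost = estimate_cost(50_000, 2_000)  # 0.104 USD
```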

Specifications

Family          Mixtral
Released        2024-07-01
Parameters      8x22B
Context         64K
Architecture    Mixture of Experts
Specialization  general
License         Apache 2.0
Training        finetuning
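The "8x22B" parameter figure names the MoE layout rather than a total count: 8 experts with top-2 routing, so only a fraction of the weights are active per token. A sketch of the arithmetic, using the roughly 141B total / 39B active figures from Mistral AI's Mixtral 8x22B announcement (labeled as published figures, not derived here):

```python
# "8x22B" = 8 experts on a 22B-scale backbone with top-2 routing per token.
NUM_EXPERTS = 8
EXPERTS_PER_TOKEN = 2

naive_total_b = NUM_EXPERTS * 22  # 176B; overcounts, since attention layers are shared
published_total_b = 141           # total parameters per Mistral AI's announcement
published_active_b = 39           # parameters active per token, per the same source

active_fraction = published_active_b / published_total_b  # roughly 0.28
```

This is why the model's serving cost tracks the ~39B active parameters rather than the full ~141B stored in memory.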
