LLM Reference

Mixtral 8x7B SlimOrca

About

Mixtral 8x7B is a sparse mixture-of-experts model, fine-tuned on the SlimOrca dataset for improved instruction-following performance.

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution

Specifications

Family: OpenOrca
Released: 2023-12-11
Parameters: 56B
Context: 32K
Architecture: Decoder Only
Knowledge cutoff: 2023-12
Specialization: general
Training: fine-tuning
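The 56B parameter figure above is the nominal 8 × 7B product implied by the model name. A minimal sketch of that mixture-of-experts arithmetic, assuming the top-2 routing used by the public Mixtral 8x7B release (the constants below are nominal illustrations, not exact weight counts — the released checkpoint is smaller than 56B because non-expert layers are shared):

```python
# Back-of-the-envelope parameter math for a Mixtral-style 8x7B
# mixture-of-experts model. Figures are nominal: 8 experts x 7B
# each, with 2 experts routed per token (top-2 routing).

NUM_EXPERTS = 8
EXPERT_PARAMS = 7_000_000_000    # nominal per-expert size (assumption)
EXPERTS_PER_TOKEN = 2            # top-2 routing, as in Mixtral

nominal_total = NUM_EXPERTS * EXPERT_PARAMS          # the listed 56B
nominal_active = EXPERTS_PER_TOKEN * EXPERT_PARAMS   # compute cost per token

print(f"nominal total:  {nominal_total / 1e9:.0f}B")
print(f"active / token: {nominal_active / 1e9:.0f}B")
```

Only the routed experts run for each token, which is why a 56B-class MoE model has per-token compute closer to a 14B dense model.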

Created by

Human-centered approach to safe AI
