Mixtral 8x7B
Mistral AI · Apache 2.0
About
Mixtral 8x7B is a sparse mixture-of-experts language model from Mistral AI. Each token is routed to only a subset of the model's experts, so it offers strong performance at a lower inference cost than a dense model of comparable size.
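To make the sparse routing idea concrete, here is a minimal numpy sketch of top-2 expert gating, the scheme Mixtral uses: a router scores all experts per token, but only the two highest-scoring experts actually run. The function name, shapes, and linear-map experts are illustrative assumptions, not Mistral AI's implementation.

```python
import numpy as np

def top2_moe(x, gate_w, expert_ws):
    """Sketch of top-2 mixture-of-experts routing (names are illustrative).

    x:         (d,) hidden state for one token
    gate_w:    (d, n_experts) router weights
    expert_ws: list of n_experts (d, d) matrices standing in for expert FFNs
    """
    logits = x @ gate_w                       # one router score per expert
    top2 = np.argsort(logits)[-2:]            # indices of the two best experts
    probs = np.exp(logits[top2] - logits[top2].max())
    probs /= probs.sum()                      # softmax over the selected two only
    # Only the two chosen experts are evaluated; the rest are skipped,
    # which is where the inference savings come from.
    return sum(p * (expert_ws[i] @ x) for p, i in zip(probs, top2))
```

With 8 experts, each token pays the compute cost of 2 expert forward passes rather than 8, while the total parameter count still reflects all 8 experts.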
Models (8)
Together AI Mixtral-8x7B-Instruct-v0.1 · 32,768-token context · 56B parameters · Open Source
Together AI Nous-Hermes-2-Mixtral-8x7B-DPO · 32,768-token context · 56B parameters · Open Source
OctoML Mixtral-8x7B-Instruct-v0.1 · 32,768-token context · 56B parameters · Open Source
OctoML Nous-Hermes-2-Mixtral-8x7B-DPO · 32,768-token context · 56B parameters · Open Source
Mixtral 8x7B Instruct v0.1 on AWS Bedrock · 32,768-token context · 56B parameters · Open Source
Groq Mixtral-8x7B-32768 · 32,768-token context · 56B parameters · Open Source
Mixtral 8x7B Instruct v0.1 on IBM Watsonx · 32,768-token context · 56B parameters · Open Source
DeepInfra Mixtral 8x7B Instruct · 32,768-token context · 56B parameters · Open Source