LLM Reference
Together AI

Mixtral 8x7B Instruct v0.1 on Together AI

Mixtral · MistralAI

Serverless · Open Source

Why use Mixtral 8x7B Instruct v0.1 on Together AI?

Together AI offers Mixtral 8x7B Instruct v0.1 with pay-as-you-go pricing at $0.40/1M input tokens and $0.40/1M output tokens. Together AI is a platform for running open-source and proprietary LLMs with fast serverless and dedicated endpoints at competitive inference pricing.

Compare Mixtral 8x7B Instruct v0.1 across 5 providers to find the best fit for your use case.

Compare Mixtral 8x7B Instruct v0.1 Across Providers

Provider               Input (per 1M)   Output (per 1M)
Together AI            $0.40            $0.40
OctoML (Deprecated)    $0.40            $0.60
AWS Bedrock            $0.45            $0.45
IBM watsonx            $0.18            $0.18
DeepInfra              $0.15            $0.45

Pricing

Type             Price (per 1M)
Input tokens     $0.40
Output tokens    $0.40
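With input and output priced identically, estimating the cost of a single request takes only a few lines. A minimal sketch, with the prices hardcoded from the table above (USD per 1M tokens) rather than fetched from any API:

```python
# Per-token pricing for Mixtral 8x7B Instruct v0.1 on Together AI,
# taken from the pricing table above (USD per 1M tokens).
INPUT_PRICE_PER_1M = 0.40
OUTPUT_PRICE_PER_1M = 0.40

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated cost in USD for one request."""
    return (input_tokens * INPUT_PRICE_PER_1M
            + output_tokens * OUTPUT_PRICE_PER_1M) / 1_000_000

# Example: a 2,000-token prompt with a 500-token completion.
print(f"${estimate_cost(2_000, 500):.6f}")  # → $0.001000
```

The same function works for any provider in the comparison table by swapping in that provider's two prices.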

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution

About Mixtral 8x7B Instruct v0.1

FAQ

What does Mixtral 8x7B Instruct v0.1 cost on Together AI?

On Together AI, Mixtral 8x7B Instruct v0.1 costs $0.40 per 1M input tokens and $0.40 per 1M output tokens.

What is the context window for Mixtral 8x7B Instruct v0.1 on Together AI?

Mixtral 8x7B Instruct v0.1 supports a 32,768 token context window on Together AI.
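A quick way to guard against exceeding the 32,768-token context window is to budget tokens before sending a request. A rough sketch using the common ~4-characters-per-token heuristic (an approximation only; for exact counts, use the model's actual tokenizer):

```python
CONTEXT_WINDOW = 32_768  # tokens, per the context window stated above

def fits_context(prompt: str, max_output_tokens: int,
                 chars_per_token: float = 4.0) -> bool:
    """Roughly check whether prompt + requested completion fit the window.

    Uses a chars-per-token heuristic, so treat the result as an estimate.
    """
    est_prompt_tokens = len(prompt) / chars_per_token
    return est_prompt_tokens + max_output_tokens <= CONTEXT_WINDOW

print(fits_context("Hello, Mixtral!" * 100, max_output_tokens=1024))  # → True
```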

How does Together AI compare to other Mixtral 8x7B Instruct v0.1 providers?

Mixtral 8x7B Instruct v0.1 is available from 5 providers. The cheapest input pricing is $0.15/1M tokens from DeepInfra.

Who created Mixtral 8x7B Instruct v0.1?

Mixtral 8x7B Instruct v0.1 was created by MistralAI as part of the Mixtral model family.

Is Mixtral 8x7B Instruct v0.1 open source?

Yes. Mixtral 8x7B Instruct v0.1 is open source; MistralAI released its weights under the Apache 2.0 license.

Get Started

Model Specs

Released            2023-12-10
Parameters          56B
Context             33K
Architecture        Decoder Only
Knowledge cutoff    2023-12

Related Models on Together AI