Why use Mixtral 8x7B on Vultr?
Vultr offers Mixtral 8x7B with pay-as-you-go pricing at $0.55/1M input tokens. Vultr is a cloud infrastructure company headquartered in West Palm Beach, Florida.
Compare Mixtral 8x7B across 19 providers to find the best fit for your use case.
Compare Mixtral 8x7B Across Providers
| Provider | Input (per 1M) | Output (per 1M) |
|---|---|---|
| Databricks Foundation Model Serving | $0.50 | $1.00 |
| NVIDIA NIM | — | — |
| GCP Vertex AI | $0.40 | $1.20 |
| AWS Bedrock | $0.45 | $0.70 |
| OctoAI API (Deprecated) | $0.45 | $0.45 |
Pricing
| Type | Price (per 1M) |
|---|---|
| Input tokens | $0.55 |
| Output tokens | $2.75 |
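As a quick sanity check on these rates, per-request cost is just a weighted sum of token counts. The helper below uses the Vultr prices from the table above ($0.55/1M input, $2.75/1M output); the function name is illustrative.

```python
# Estimate the cost of one Mixtral 8x7B request on Vultr using the
# listed rates: $0.55 per 1M input tokens, $2.75 per 1M output tokens.
VULTR_INPUT_PRICE = 0.55 / 1_000_000   # USD per input token
VULTR_OUTPUT_PRICE = 2.75 / 1_000_000  # USD per output token

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost for a single request."""
    return input_tokens * VULTR_INPUT_PRICE + output_tokens * VULTR_OUTPUT_PRICE

# Example: a 2,000-token prompt producing a 500-token completion.
print(f"${estimate_cost(2_000, 500):.6f}")  # → $0.002475
```

Note that output tokens are 5x the price of input tokens here, so long completions dominate the bill for generation-heavy workloads.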
About Mixtral 8x7B
Mixtral 8x7B, developed by Mistral AI, uses a sparse Mixture of Experts (MoE) architecture with eight expert networks of roughly seven billion parameters each. Because non-expert layers are shared, the model totals 46.7 billion parameters (not 56 billion), and only two experts are activated per token, so each inference step uses about 12.9 billion parameters. This yields roughly 6x faster inference than Llama 2 70B while surpassing it, and matching or beating GPT-3.5, on many benchmarks.

The model handles multiple languages, supports a context window of up to 32,000 tokens for lengthy inputs, and is strong at code generation. It is released under the permissive Apache 2.0 license, its weights are compatible with common optimization and deployment tools, and Mistral AI continues to improve it through performance optimizations and fine-tuning.
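The two-experts-per-token routing described above can be sketched in a few lines. This is an illustrative pure-Python toy, not Mistral's implementation: in the real model the "experts" are feed-forward sublayers inside each transformer block, and the gating network is learned. Here the experts and gate weights are arbitrary stand-ins.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_layer(token, experts, gate_weights, k=2):
    """Route one token vector through the top-k experts.

    The gate scores every expert, the top k are run on the token,
    and their outputs are combined weighted by renormalized gate
    probabilities -- only k of the experts do any work.
    """
    logits = [sum(w * x for w, x in zip(row, token)) for row in gate_weights]
    probs = softmax(logits)
    topk = sorted(range(len(experts)), key=lambda i: probs[i], reverse=True)[:k]
    norm = sum(probs[i] for i in topk)
    out = [0.0] * len(token)
    for i in topk:
        expert_out = experts[i](token)  # only the selected experts run
        for d, v in enumerate(expert_out):
            out[d] += (probs[i] / norm) * v
    return out

# Toy usage: eight linear "experts" that each scale the input by their index.
experts = [lambda t, s=i: [s * x for x in t] for i in range(8)]
gates = [[float(i), 0.0] for i in range(8)]
print(moe_layer([1.0, 0.0], experts, gates))
```

This is the source of the efficiency claim above: with 8 experts and top-2 routing, only a quarter of the expert capacity runs per token, so compute per token tracks the active parameter count rather than the total.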
FAQ
What does Mixtral 8x7B cost on Vultr?
On Vultr, Mixtral 8x7B costs $0.55 per 1M input tokens and $2.75 per 1M output tokens.
What is the context window for Mixtral 8x7B on Vultr?
Mixtral 8x7B supports a 32,000 token context window on Vultr.
How does Vultr compare to other Mixtral 8x7B providers?
Mixtral 8x7B is available from 19 providers. The cheapest input pricing is $0.15/1M tokens from Mistral AI Studio.
What API model ID do I use for Mixtral 8x7B on Vultr?
Use the model ID mixtral-8x7b when calling Vultr's API.
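A minimal sketch of such a call, assuming Vultr exposes an OpenAI-compatible chat completions endpoint: the base URL below is an assumption to verify against Vultr's serverless inference documentation, and only the model ID `mixtral-8x7b` comes from this page.

```python
import json
import urllib.request

BASE_URL = "https://api.vultrinference.com/v1"  # assumed endpoint -- verify in Vultr docs
MODEL_ID = "mixtral-8x7b"                        # model ID from this page

def build_chat_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build the JSON body for an OpenAI-style chat completion request."""
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

def chat(prompt: str, api_key: str) -> str:
    """Send one chat completion request and return the reply text."""
    body = json.dumps(build_chat_request(prompt)).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

Because the interface follows the OpenAI chat format, most OpenAI-compatible SDKs should also work by pointing their base URL at the Vultr endpoint and passing `mixtral-8x7b` as the model.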
Who created Mixtral 8x7B?
Mixtral 8x7B was created by Mistral AI as part of the Mixtral model family.
Is Mixtral 8x7B open source?
Mixtral 8x7B is open source, released under the Apache 2.0 license.