Why use Mistral Small on DeepInfra?
DeepInfra is a cloud inference platform that provides cost-effective access to open-source AI models, and it serves Mistral Small at competitive pricing.
Compare Mistral Small across 5 providers to find the best fit for your use case.

Compare Mistral Small Across Providers
| Provider | Input (per 1M) | Output (per 1M) |
|---|---|---|
| Microsoft Foundry | $1.00 | $3.00 |
| AWS Bedrock | $1.00 | $3.00 |
| Mistral AI Studio | $0.10 | $0.30 |
| Fireworks AI | — | — |
| DeepInfra | — | — |
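The per-1M-token rates in the table translate directly into per-request costs. As a minimal sketch (rates taken from the table above; the token counts in the example are hypothetical):

```python
# Per-1M-token rates in USD, from the comparison table above.
# Fireworks AI and DeepInfra are omitted because their rates are not listed.
RATES = {
    "Microsoft Foundry": {"input": 1.00, "output": 3.00},
    "AWS Bedrock": {"input": 1.00, "output": 3.00},
    "Mistral AI Studio": {"input": 0.10, "output": 0.30},
}

def request_cost(provider: str, input_tokens: int, output_tokens: int) -> float:
    """Cost in USD for one request: tokens / 1M, times the per-1M rate."""
    r = RATES[provider]
    return (input_tokens / 1_000_000) * r["input"] + \
           (output_tokens / 1_000_000) * r["output"]

# Example: a 2,000-token prompt with a 500-token completion.
cost = request_cost("Mistral AI Studio", 2_000, 500)
print(f"${cost:.6f}")  # 2000/1M * $0.10 + 500/1M * $0.30 = $0.000350
```

At these rates, the same request on AWS Bedrock or Microsoft Foundry costs ten times as much, which is the gap the table makes visible.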
Capabilities
Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution
About Mistral Small
Mistral Small is also available on AWS Bedrock.
FAQ
What is the context window for Mistral Small on DeepInfra?
Mistral Small supports a 32,000 token context window on DeepInfra.
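Before sending a request, you can sanity-check that a prompt plus the reserved completion fits the 32,000-token window. A minimal sketch, assuming a rough ~4-characters-per-token heuristic (exact counts require the model's actual tokenizer):

```python
CONTEXT_WINDOW = 32_000  # Mistral Small's context window on DeepInfra

def fits_context(prompt: str, max_output_tokens: int,
                 chars_per_token: float = 4.0) -> bool:
    """Rough check: estimated prompt tokens plus tokens reserved for the
    completion must fit within the context window. The chars-per-token
    ratio is a crude heuristic, not a real tokenizer count."""
    est_prompt_tokens = len(prompt) / chars_per_token
    return est_prompt_tokens + max_output_tokens <= CONTEXT_WINDOW

fits_context("Summarize this paragraph.", max_output_tokens=1_000)  # True
```

For production use, replace the heuristic with the model's tokenizer so the estimate matches what the server will actually count.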
How does DeepInfra compare to other Mistral Small providers?
Mistral Small is available from 5 providers. The cheapest input pricing is $0.10/1M tokens from Mistral AI Studio.
Who created Mistral Small?
Mistral Small was created by Mistral AI as part of the Mistral Small model family.
Is Mistral Small open source?
Yes. Mistral Small is released under the Apache 2.0 open-source license.
Model Specs
Released: 2024-02-26
Context: 32K
Architecture: Decoder-only