## Pricing
| Type | Price (per 1M) |
|---|---|
| Input tokens | $3.00 |
| Output tokens | $9.00 |
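Per-request cost follows directly from the per-million-token rates above. A minimal sketch (the function name and example token counts are illustrative, not part of any official SDK):

```python
# Per-1M-token rates from the pricing table above (USD).
INPUT_PER_M = 3.00
OUTPUT_PER_M = 9.00

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in USD for a single request at the listed rates."""
    return (input_tokens / 1_000_000) * INPUT_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_PER_M

# Example: 10K input tokens and 2K output tokens.
print(round(request_cost(10_000, 2_000), 4))  # → 0.048
```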
## Capabilities
Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution
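JSON Mode constrains the model to emit valid JSON. A sketch of a request body for an OpenAI-compatible chat endpoint; the model identifier and the `response_format` field follow Mistral's published API conventions, but treat both as assumptions rather than a verified call:

```python
import json

# Hypothetical JSON-mode request body; no network call is made here.
payload = {
    "model": "mistral-large-2407",  # assumed model identifier
    "messages": [
        {"role": "user", "content": "List three prime numbers as a JSON array."}
    ],
    "response_format": {"type": "json_object"},  # ask for strict JSON output
    "max_tokens": 256,
}

print(json.dumps(payload, indent=2))
```

The same payload shape, minus `response_format`, applies to ordinary chat requests.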
## About Mistral Large 2 (2407)
Flagship dense Mistral model (123B parameters) with a 128K context window. Strong performance on code generation, mathematics, multilingual tasks, and complex reasoning benchmarks.
## Model Specs
| Spec | Value |
|---|---|
| Released | 2024-07-23 |
| Parameters | 123B |
| Context | 128K |
| Architecture | Decoder Only |