LLM Reference
GCP Vertex AI

Mistral Large on GCP Vertex AI

Mistral Large · MistralAI

Serverless

Why use Mistral Large on GCP Vertex AI?

GCP Vertex AI offers Mistral Large with pay-as-you-go pricing at $0.32 per 1M input tokens and $0.96 per 1M output tokens. Vertex AI is Google Cloud's managed AI platform, offering access to Gemini models and hundreds of partner models alongside tools for fine-tuning, grounding, vector search, and end-to-end MLOps pipelines.
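Partner models on Vertex AI are invoked through a publisher model endpoint rather than a Google-native model route. As a rough sketch of how such a request might be assembled (the project ID, region, and exact endpoint path here are placeholders/assumptions — verify them against Google Cloud's current documentation before use):

```python
import json

def build_vertex_request(project_id: str, region: str, prompt: str):
    """Build the URL and JSON body for a rawPredict call to Mistral Large
    on Vertex AI. The endpoint pattern below is an assumption based on how
    Vertex AI exposes partner ("publisher") models; check the current
    Google Cloud docs for the authoritative route and payload schema."""
    url = (
        f"https://{region}-aiplatform.googleapis.com/v1/"
        f"projects/{project_id}/locations/{region}/"
        f"publishers/mistralai/models/mistral-large:rawPredict"
    )
    body = {
        "model": "mistral-large",
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(body)

# Hypothetical project and region, for illustration only:
url, body = build_vertex_request("my-project", "us-central1", "Hello")
```

The actual call would additionally need an OAuth bearer token from Application Default Credentials, which is omitted here to keep the sketch self-contained.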

Compare Mistral Large across 8 providers to find the best fit for your use case.

Compare Mistral Large Across Providers

| Provider | Input (per 1M) | Output (per 1M) |
| --- | --- | --- |
| NVIDIA NIM | | |
| Microsoft Foundry | $4.00 | $12.00 |
| AWS Bedrock | $2.00 | $6.00 |
| Mistral AI Studio | $2.00 | $6.00 |
| IBM watsonx | $10.00 | $10.00 |

View all 8 providers →

Pricing

| Type | Price (per 1M) |
| --- | --- |
| Input tokens | $0.32 |
| Output tokens | $0.96 |
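At these rates, the cost of a request is a simple linear function of its token counts. A minimal sketch of the arithmetic (the token counts in the example are made up for illustration):

```python
# Prices from the table above (USD per 1M tokens, GCP Vertex AI).
INPUT_PRICE_PER_M = 0.32
OUTPUT_PRICE_PER_M = 0.96

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """USD cost of a single request at the listed pay-as-you-go rates."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Example: a 10,000-token prompt with a 2,000-token completion
cost = request_cost(10_000, 2_000)  # ≈ $0.00512
```

Note that output tokens cost 3x input tokens here, so long completions dominate the bill even when prompts are much larger.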

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution

About Mistral Large

Mistral Large is also available on AWS Bedrock.

FAQ

What does Mistral Large cost on GCP Vertex AI?

On GCP Vertex AI, Mistral Large costs $0.32 per 1M input tokens and $0.96 per 1M output tokens.

What is the context window for Mistral Large on GCP Vertex AI?

Mistral Large supports a 32,000 token context window on GCP Vertex AI.

How does GCP Vertex AI compare to other Mistral Large providers?

Mistral Large is available from 8 providers. The cheapest input pricing is $0.32/1M tokens from GCP Vertex AI.

Who created Mistral Large?

Mistral Large was created by MistralAI as part of the Mistral Large model family.

Is Mistral Large open source?

Mistral Large is not open source; it is distributed under a proprietary license.

Get Started