LLM Reference

Mistral Medium

mistral-medium

Deprecated

About

Mistral Medium is a versatile large language model developed by Mistral AI, designed to handle a wide array of tasks with a 32K-token context window, enough to process roughly 24,000 words. Built on a transformer architecture, it offers native fluency in multiple languages, including English, French, Spanish, German, and Italian, which strengthens its multilingual reasoning. Available via API, Mistral Medium is proprietary and stronger than Mistral AI's open-source models such as Mixtral 8x7B and Mistral-7B, and Mistral AI positions it as more cost-effective than models such as GPT-4.

Mistral Medium has a 32K-token context window.

Mistral Medium input tokens are priced at $0.40 per 1M and output tokens at $2.00 per 1M (OpenRouter rate).

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution
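Since the model is accessed over an API, a chat request is just an authenticated JSON POST. The sketch below assembles one; the endpoint URL, header, and field names follow the conventions of Mistral AI's public chat-completions API and are assumptions, not details taken from this page.

```python
import json

def build_chat_request(prompt: str, api_key: str) -> dict:
    """Assemble the pieces of a single chat-completion HTTP call
    for mistral-medium (endpoint and schema assumed)."""
    return {
        "url": "https://api.mistral.ai/v1/chat/completions",  # assumed endpoint
        "headers": {
            "Authorization": f"Bearer {api_key}",  # standard bearer-token auth
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": "mistral-medium",
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

req = build_chat_request("Summarize this spec sheet.", "YOUR_API_KEY")
```

The returned dict can be handed to any HTTP client (e.g. `requests.post(req["url"], headers=req["headers"], data=req["body"])`) once a real key is supplied.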

Providers (2)

Provider | Input (per 1M) | Output (per 1M) | Type
Mistral AI Studio | $2.70 | $8.10 | Serverless
OpenRouter | $0.40 | $2.00 | Serverless
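The per-1M rates above translate to a request cost of input_tokens × input_rate + output_tokens × output_rate, each divided by 1,000,000. A small sketch using the listed OpenRouter figures (the helper name and token counts are illustrative):

```python
def request_cost(input_tokens: int, output_tokens: int,
                 in_per_m: float = 0.40, out_per_m: float = 2.00) -> float:
    """Estimate the USD cost of one request from per-1M-token rates.
    Defaults are the OpenRouter rates listed in the providers table."""
    return input_tokens * in_per_m / 1e6 + output_tokens * out_per_m / 1e6

# A 30K-token prompt (near the 32K context limit) with a 1K-token reply:
cost = request_cost(30_000, 1_000)
# 30000 * 0.40 / 1e6 + 1000 * 2.00 / 1e6 = 0.012 + 0.002 = 0.014 USD
```

Passing the Mistral AI Studio rates (`in_per_m=2.70, out_per_m=8.10`) instead shows the roughly 6–7× price gap between the two providers for the same traffic.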

Benchmark Scores (4)

Benchmark | Score | Version | Source
Google-Proof Q&A (GPQA) | 58.9 | diamond | research
HellaSwag | 93.9 | 10-shot | research
HumanEval | 84.3 | pass@1 | research
Massive Multitask Language Understanding (MMLU) | 82.9 | 5-shot | research

Specifications

Released: 2023-12-11
Context: 32K
Architecture: Decoder-only
Specialization: general
Training: fine-tuned

Created by

Mistral AI — enterprise AI solutions for trust and transparency.
Paris, France; founded 2023.