LLM Reference

Mistral Medium

Deprecated

About

Mistral Medium is a versatile large language model developed by Mistral AI, designed to handle a wide array of tasks with a 32k-token context window, enough to process roughly 24,000 words. Built on a transformer architecture, it offers native fluency in English, French, Spanish, German, and Italian, which strengthens its multilingual reasoning. Mistral Medium is proprietary and available only via API, and is stronger than some of Mistral AI's open-weight models such as Mixtral 8x7B and Mistral-7B. While it is described as more cost-effective than models such as GPT-4, specific pricing details are not provided.
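Since the model is reachable only over the API, a minimal sketch of a chat-completion call may help. This assumes the publicly documented `https://api.mistral.ai/v1/chat/completions` endpoint and the `mistral-medium` model identifier; check your account's current model names before relying on them.

```python
import json
import os
import urllib.request

# Assumed public endpoint for Mistral AI's chat completions API.
API_URL = "https://api.mistral.ai/v1/chat/completions"


def build_request(prompt: str, model: str = "mistral-medium") -> dict:
    """Assemble the JSON body for a single-turn chat completion."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def complete(prompt: str, api_key: str) -> str:
    """Send the request and return the assistant's reply text."""
    body = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]


if __name__ == "__main__":
    key = os.environ.get("MISTRAL_API_KEY")
    if key:  # only hit the live API when a key is configured
        print(complete("Summarize the transformer architecture in one sentence.", key))
```

Keeping payload construction separate from the network call makes the request shape easy to inspect and test without an API key.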

Capabilities

- Multimodal
- Function Calling
- Tool Use
- JSON Mode
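The JSON Mode and Tool Use capabilities above map to request options in the chat API. The sketch below follows the OpenAI-compatible shape of Mistral's chat endpoint; the field names (`response_format`, `tools`) are taken from that public API, and the `get_weather` tool is a hypothetical example, not part of the API.

```python
def json_mode_body(prompt: str) -> dict:
    """Request body that forces the model to emit a single JSON object."""
    return {
        "model": "mistral-medium",
        "messages": [{"role": "user", "content": prompt}],
        "response_format": {"type": "json_object"},
    }


def tool_call_body(prompt: str) -> dict:
    """Request body exposing one callable tool the model may invoke."""
    return {
        "model": "mistral-medium",
        "messages": [{"role": "user", "content": prompt}],
        "tools": [{
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool for illustration
                "description": "Look up current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }],
    }
```

When a tool is supplied, the model may respond with a tool call (name plus JSON arguments) instead of plain text, which the caller executes and feeds back in a follow-up message.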

Providers (1)

| Provider | Input (per 1M) | Output (per 1M) | Type |
|---|---|---|---|
| Mistral AI La Plateforme | | | Serverless |

Benchmark Scores (4)

| Benchmark | Score | Version | Source |
|---|---|---|---|
| Google-Proof Q&A (GPQA) | 58.9 | diamond | research |
| HellaSwag | 93.9 | 10-shot | research |
| HumanEval | 84.3 | pass@1 | research |
| Massive Multitask Language Understanding (MMLU) | 82.9 | 5-shot | research |

Specifications

Family: Mistral
Released: 2023-12-11
Context: 32K tokens
Architecture: Decoder-only
Specialization: General