LLM Reference

Mistral Small 4

Multimodal

About

Mistral Small 4 is a hybrid 119B mixture-of-experts (MoE) model unifying instruct, reasoning, and coding capabilities. It features configurable reasoning effort per request and native function calling with JSON output support.
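As a rough sketch of how those three features combine in practice, the snippet below builds a chat-completions request payload in the OpenAI-compatible style that serverless providers such as NVIDIA NIM typically expose. The model id, the `reasoning_effort` values, and the `get_weather` tool are illustrative assumptions, not confirmed parameters for this model.

```python
import json

def build_request(prompt: str, effort: str = "medium") -> dict:
    """Assemble a hypothetical chat-completions payload that exercises
    per-request reasoning effort, function calling, and JSON-mode output."""
    return {
        "model": "mistral-small-4",  # placeholder model id, not confirmed
        "messages": [{"role": "user", "content": prompt}],
        "reasoning_effort": effort,  # assumed values: "low" | "medium" | "high"
        "response_format": {"type": "json_object"},  # request JSON output
        "tools": [{
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool for illustration
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }],
    }

payload = build_request("What's the weather in Paris?", effort="high")
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to the provider's chat-completions endpoint; exact parameter names may differ per provider.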

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, JSON Mode, Code Execution

Providers (1)

Provider      Input (per 1M)   Output (per 1M)   Type
NVIDIA NIM    —                —                 Serverless

Specifications

Released: 2026-03-16
Parameters: 119B (6.5B active)
Context: 256k
Architecture: MoE
Specialization: general
License: Apache 2.0
Training: pretraining
