LLM Reference

Mistral Small 3.1 24B Instruct

Multimodal

About

Mistral Small 3.1 is a 24B-parameter instruction-tuned model with multimodal vision understanding. Optimized for cost-efficient deployment, it supports a 128K-token context window and is available on Cloudflare Workers AI.

Capabilities

Multimodal, Function Calling, Tool Use, JSON Mode
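
The JSON-mode and function-calling capabilities listed above are typically exercised through a chat-style request body. The sketch below assembles such a payload; the model ID, endpoint shape, and `response_format` field are assumptions based on the common chat-completion convention, so verify them against Cloudflare's Workers AI documentation before use.

```python
import json

# Assumed Workers AI model identifier for this model (verify in Cloudflare docs).
MODEL = "@cf/mistralai/mistral-small-3.1-24b-instruct"

def build_request(prompt: str, json_mode: bool = False) -> dict:
    """Assemble a chat-style request body for the model."""
    body = {
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
    }
    if json_mode:
        # JSON mode: ask the model to emit valid JSON only.
        # Field name follows the common chat-completion convention (assumption).
        body["response_format"] = {"type": "json_object"}
    return body

payload = build_request("List three EU capitals as JSON.", json_mode=True)
print(json.dumps(payload, indent=2))
```

The body would then be POSTed to the Workers AI inference endpoint for `MODEL` with an API token; only the payload construction is shown here since the endpoint details vary by account.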

Specifications

Family: Mistral
Released: 2025-12-15
Parameters: 24B
Context: 128K
Architecture: Dense
Specialization: General
Training: Pretraining
Fine-tuning: Instruction-tuning