LLM Reference

Mistral Small

About

Mistral Small is an efficient large language model with 22 billion parameters and a 32,000-token context window, designed for high-volume tasks that demand low latency. It handles natural language processing tasks such as text generation and question answering, performs well across multiple programming languages, and offers multilingual support including English, French, and Italian. Notably, Mistral Small supports function calling, enabling complex task execution through external tools and APIs, while balancing performance and cost-effectiveness for enterprise-grade applications. Version 24.09 marks a significant improvement in capability and efficiency over its predecessors.
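As a minimal sketch of the function-calling capability mentioned above: the snippet below builds an OpenAI-style chat request that exposes one tool the model may choose to invoke. The endpoint URL, model identifier, and the `get_weather` tool are illustrative assumptions, not details taken from this reference card.

```python
import json

# Assumption: Mistral's OpenAI-style chat completions endpoint.
API_URL = "https://api.mistral.ai/v1/chat/completions"

def build_function_call_request(user_message: str) -> dict:
    """Build a chat request that exposes one tool the model may call."""
    return {
        "model": "mistral-small-latest",  # illustrative model identifier
        "messages": [{"role": "user", "content": user_message}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_weather",  # hypothetical tool, for illustration only
                    "description": "Look up current weather for a city.",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
        "tool_choice": "auto",  # let the model decide whether to call the tool
    }

payload = build_function_call_request("What's the weather in Paris?")
print(json.dumps(payload, indent=2))
```

When the model decides to call the tool, the response contains a tool-call object with the function name and JSON-encoded arguments; the caller executes the tool and returns its result in a follow-up message.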

Capabilities

Multimodal, Function Calling, Tool Use, JSON Mode

Providers (3)

| Provider | Input (per 1M tokens) | Output (per 1M tokens) | Type |
|---|---|---|---|
| Azure OpenAI | $1 | $3 | Provisioned |
| AWS Bedrock | $1 | $3 | Serverless |
| Mistral AI La Plateforme | | | Serverless |
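The per-token rates above translate directly into a cost estimate. A small sketch, assuming the serverless rates of $1 per 1M input tokens and $3 per 1M output tokens:

```python
# Serverless rates from the providers table above.
INPUT_RATE = 1.0 / 1_000_000   # USD per input token
OUTPUT_RATE = 3.0 / 1_000_000  # USD per output token

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of one request at the listed serverless rates."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# Example: a full 32K-token prompt with a 1K-token reply.
print(round(estimate_cost(32_000, 1_000), 4))  # → 0.035
```

Output tokens cost three times as much as input tokens, so long generations dominate the bill even with large prompts.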

Specifications

Family: Mistral
Released: 2024-02-26
Context: 32K
Architecture: Decoder-only
Specialization: General