LLM Reference
Mistral AI La Plateforme

Mistral 7B v0.1 on Mistral AI La Plateforme

Mistral 7B · MistralAI

Serverless

Pricing

Type           Price (per 1M tokens)
Input tokens   $0.07
Output tokens  $0.07
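The flat per-token pricing makes cost estimation straightforward. A minimal sketch, assuming the listed serverless rates (the helper name and example token counts are illustrative, not part of any official SDK):

```python
# Listed serverless rates for Mistral 7B v0.1, in USD per 1M tokens.
INPUT_PRICE_PER_M = 0.07
OUTPUT_PRICE_PER_M = 0.07

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one request from its token counts."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Hypothetical request: 12,000 prompt tokens, 800 completion tokens.
print(f"${estimate_cost(12_000, 800):.6f}")
```

Because input and output are billed at the same rate here, the estimate reduces to total tokens times $0.07 per million.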

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution

About Mistral 7B v0.1

Mistral 7B v0.1 is an open-source large language model from Mistral AI with 7 billion parameters. It is designed for high performance and efficiency, outperforming many similarly sized models across a range of benchmarks. The model uses a transformer architecture with Sliding Window Attention, Grouped-Query Attention, and a byte-fallback BPE tokenizer, which together improve inference speed, reduce computational cost, and make tokenization more robust. It generates fluent, human-like text, follows instructions well, and performs strongly on reasoning and mathematics tasks, though it lacks built-in moderation and can hallucinate. Subsequent versions have sought to address these limitations while introducing extended context windows and improved instruction following.
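Sliding Window Attention restricts each token to attending over a fixed-size window of recent positions rather than the full causal prefix, which is what reduces the attention cost on long sequences. A minimal sketch of the attention mask, assuming a causal window (the toy sequence length and window size below are illustrative; Mistral 7B's reported window is much larger):

```python
def sliding_window_mask(seq_len: int, window: int) -> list[list[bool]]:
    """Build a boolean attention mask for sliding-window causal attention.

    mask[i][j] is True when query position i may attend to key position j:
    the key must not be in the future (j <= i) and must lie within the
    window of the `window` most recent positions (i - j < window).
    """
    return [[j <= i and i - j < window for j in range(seq_len)]
            for i in range(seq_len)]

mask = sliding_window_mask(seq_len=6, window=3)
```

With `window=3`, position 5 can attend only to positions 3, 4, and 5, so per-token attention cost stays constant as the sequence grows instead of scaling with its length.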

Get Started
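As a starting point, a chat request to the serverless API can be sketched as a JSON body. This is a minimal sketch assuming an OpenAI-style chat-completions request shape; the model identifier `open-mistral-7b` and the field names should be verified against Mistral's current API documentation before use:

```python
import json

# Hypothetical request body for a chat completion against the hosted model.
# "open-mistral-7b" is assumed to be the serverless identifier for Mistral 7B;
# check the provider's model list for the current name.
payload = {
    "model": "open-mistral-7b",
    "messages": [
        {"role": "user",
         "content": "Explain Grouped-Query Attention in one sentence."},
    ],
    "max_tokens": 256,
}

body = json.dumps(payload)  # serialized body to send with the API request
```

The serialized `body` would then be POSTed to the chat endpoint with an `Authorization: Bearer <API key>` header.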