LLM Reference

Hugging Face Mistral 7B v0.1 (Serverless)

huggingface-mistral-7b-v0.1-serverless

Open Source

About

Mistral 7B v0.1 Serverless on Hugging Face Inference

Hugging Face Mistral 7B v0.1 (Serverless) has a 32K-token context window.

Hugging Face Mistral 7B v0.1 (Serverless) is priced at $0.05 per 1M input tokens and $0.15 per 1M output tokens.
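At these rates, the cost of a request is linear in its token counts. A minimal sketch of the arithmetic (the helper name is illustrative, not from any SDK):

```python
def request_cost_usd(input_tokens: int, output_tokens: int,
                     input_rate_per_1m: float = 0.05,
                     output_rate_per_1m: float = 0.15) -> float:
    """Cost in USD for one request at per-1M-token rates."""
    return (input_tokens / 1_000_000) * input_rate_per_1m \
         + (output_tokens / 1_000_000) * output_rate_per_1m

# Example: a 2,000-token prompt with a 500-token completion.
print(round(request_cost_usd(2_000, 500), 6))  # → 0.000175
```

At these prices, a million such requests would cost about $175, which is the kind of back-of-envelope estimate the per-1M rates are meant to make easy.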

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution · Prompt Caching · Batch API · Audio · Fine-tuning

Providers (1)

Provider                          Input (per 1M)  Output (per 1M)  Type
Hugging Face Inference Endpoints  $0.05           $0.15            Serverless
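Serverless access goes through Hugging Face's hosted inference HTTP API rather than a dedicated endpoint. A minimal sketch of building such a request, assuming the standard Hugging Face serverless Inference API URL pattern and the `mistralai/Mistral-7B-v0.1` model ID (both assumptions, not stated in this listing):

```python
import json

# Assumed model ID and serverless Inference API URL pattern; check the
# Hugging Face documentation for the exact, current endpoint.
MODEL_ID = "mistralai/Mistral-7B-v0.1"
API_URL = f"https://api-inference.huggingface.co/models/{MODEL_ID}"

def build_request(prompt: str, max_new_tokens: int = 256):
    """Return (url, headers, body) for a text-generation call.

    The Authorization value is a placeholder; a real Hugging Face
    access token must be substituted before sending the request.
    """
    headers = {
        "Authorization": "Bearer hf_xxx",  # placeholder token
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "inputs": prompt,
        "parameters": {"max_new_tokens": max_new_tokens},
    }).encode("utf-8")
    return API_URL, headers, body

url, headers, body = build_request("The capital of France is")
```

Sending the request (for example with `requests.post(url, headers=headers, data=body)`) would then be billed at the per-token rates in the table above.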


Specifications

Released: 2023-12-11
Parameters: 7B
Context: 33K
Architecture: Decoder-only
Knowledge cutoff: 2023-12
Specialization: General

Created by

Mistral AI

Enterprise AI solutions for trust and transparency.

Paris, France
Founded 2023
Website