Using Mixtral 8x7B Instruct v0.1 on Together AI
Implementation guide · Mixtral · MistralAI
Serverless · Open Source
Quick Start
- Use the Together AI SDK or REST API to call `mixtral-8x7b-instruct-v0.1` — see the documentation for the request format.
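If you prefer the REST API over the SDK, the call can be sketched with the standard library alone. This is a minimal sketch assuming Together's OpenAI-compatible chat completions endpoint at `api.together.xyz/v1/chat/completions`; verify the URL and request shape against the current docs before relying on it.

```python
import json
import os
import urllib.request

# Together's OpenAI-compatible chat completions endpoint (verify in the docs).
API_URL = "https://api.together.xyz/v1/chat/completions"

payload = {
    "model": "mixtral-8x7b-instruct-v0.1",
    "messages": [{"role": "user", "content": "Hello"}],
}

api_key = os.environ.get("TOGETHER_API_KEY")
if api_key:  # only send the request when a key is configured
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    print(body["choices"][0]["message"]["content"])
```

The payload mirrors what the SDK sends under the hood, so switching between the two later is mechanical.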
Code Examples
- Install: `pip install together`
- API key: `TOGETHER_API_KEY`
- Model ID: `mixtral-8x7b-instruct-v0.1` — Together uses the "organization/model-name" format, e.g. "meta-llama/Llama-4-Scout-17B-16E-Instruct" or "Qwen/QwQ-32B". See the Together model catalog for the exact ID.
```python
from together import Together

client = Together()  # reads TOGETHER_API_KEY from env

response = client.chat.completions.create(
    model="mixtral-8x7b-instruct-v0.1",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

About Together AI
Together AI is a platform for running open-source and proprietary LLMs with fast serverless and dedicated endpoints at competitive inference pricing.
Pricing on Together AI
| Type | Price per 1M tokens (USD) |
|---|---|
| Input tokens | $0.40 |
| Output tokens | $0.40 |
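Since input and output tokens are billed at the same flat rate, per-request cost is a one-line calculation. A quick sketch using the prices in the table above:

```python
PRICE_PER_M = 0.40  # USD per 1M tokens; same rate for input and output

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate request cost in USD at the flat per-token rate."""
    return (input_tokens + output_tokens) / 1_000_000 * PRICE_PER_M

# e.g. a 2,000-token prompt with a 500-token reply:
cost = estimate_cost(2_000, 500)
print(f"${cost:.4f}")  # → $0.0010
```

Token counts for a finished request are reported in the API response's `usage` field, so you can plug real numbers into the same formula.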
Capabilities
Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution
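Function calling uses the OpenAI-style `tools` parameter on chat completions. The sketch below builds such a request; the `get_weather` tool and its schema are hypothetical, included only to illustrate the shape.

```python
# Hypothetical tool definition illustrating OpenAI-style function calling.
# `get_weather` is not a real API; replace it with your own tool schema.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

request_kwargs = {
    "model": "mixtral-8x7b-instruct-v0.1",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": tools,
}
# Pass these to client.chat.completions.create(**request_kwargs); when the model
# decides to call a tool, inspect response.choices[0].message.tool_calls.
```

Your code executes the named function itself and sends the result back in a follow-up `"tool"` role message; the model never runs anything directly.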
Model Specs
- Released: 2023-12-10
- Parameters: 56B
- Context: 33K tokens
- Architecture: Decoder Only
- Knowledge cutoff: 2023-12
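The 33K context window bounds prompt plus output combined, so it's worth a cheap pre-flight check before sending long prompts. A rough sketch, assuming ~4 characters per token as a heuristic (not Mixtral's actual tokenizer):

```python
CONTEXT_WINDOW = 32_768   # the "33K" context listed above
CHARS_PER_TOKEN = 4       # crude heuristic, not the model's real tokenizer

def fits_context(prompt: str, max_output_tokens: int = 1_024) -> bool:
    """Rough check that prompt + requested output fits the context window."""
    est_prompt_tokens = len(prompt) // CHARS_PER_TOKEN + 1
    return est_prompt_tokens + max_output_tokens <= CONTEXT_WINDOW

print(fits_context("Hello"))          # short prompt comfortably fits: True
print(fits_context("x" * 1_000_000))  # ~250K estimated tokens: False
```

For an exact count, tokenize with the model's own tokenizer instead of the character heuristic; the heuristic is only a fast guard against obviously oversized prompts.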