Nous Hermes 2 Mixtral 8x7B
About
A Mixtral 8x7B mixture-of-experts (MoE) variant of Hermes, trained on more than 1M primarily GPT-4-generated entries and suited to content generation and customer-service tasks. Available in quantized formats (GGUF, GPTQ, AWQ) for flexible deployment.
Capabilities
Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution
Providers (3)

| Provider | Input (per 1M tokens) | Output (per 1M tokens) | Type |
|---|---|---|---|
| OctoAI API | $0.15 | $0.15 | Serverless |
| Fireworks AI | $0.50 | $0.50 | Provisioned |
| Together AI | $0.60 | $0.60 | Serverless |
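Since the listed prices are quoted per 1M tokens, estimating the cost of a request is simple arithmetic. A minimal sketch, assuming a hypothetical `completion_cost` helper (the function name and the token counts in the example are illustrative, not part of any provider's API):

```python
def completion_cost(input_tokens: int, output_tokens: int,
                    input_price: float, output_price: float) -> float:
    """Estimate USD cost of one request, given per-1M-token prices."""
    return (input_tokens * input_price + output_tokens * output_price) / 1_000_000

# Example: 2,000 input + 500 output tokens at Together AI's listed
# $0.60 / $0.60 per 1M rates.
cost = completion_cost(2000, 500, 0.60, 0.60)
print(f"${cost:.6f}")  # → $0.001500
```

The same helper works for asymmetric pricing (distinct input/output rates), which is why the two prices are passed separately even though they are equal for every provider in this table.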