WizardLM-2 8x22B
About
WizardLM-2 8x22B, developed by the WizardLM team at Microsoft AI, is a powerful large language model (LLM) with 141 billion parameters built on a Mixture of Experts (MoE) architecture. It excels at complex tasks such as chat, multilingual conversation, reasoning, and agent-based interaction. Trained with an AI-powered synthetic data system incorporating techniques such as Evol-Instruct and AI Align AI, the model outperforms many open-source alternatives. Despite strong results on various benchmarks, further research is needed to address potential biases and to improve reliability beyond the toxicity testing performed during development.
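To illustrate the Evol-Instruct idea mentioned above (progressively rewriting seed instructions into harder ones before using them as training data), here is a minimal sketch. The evolution templates and the `evolve_instruction` helper are simplified assumptions for illustration, not the actual WizardLM pipeline:

```python
import random

# Illustrative "evolution" templates in the spirit of Evol-Instruct:
# each wraps a seed instruction in a prompt asking an LLM to make it harder.
EVOLUTION_TEMPLATES = [
    "Add one more constraint to the following instruction:\n{instruction}",
    "Rewrite the following instruction to require multi-step reasoning:\n{instruction}",
    "Deepen the following instruction by asking about edge cases:\n{instruction}",
]

def evolve_instruction(instruction: str, rounds: int = 3, seed: int = 0) -> list[str]:
    """Return the chain of evolution prompts for a seed instruction.

    In the real pipeline each prompt would be sent to an LLM and its answer
    fed into the next round; here we only construct the prompts themselves.
    """
    rng = random.Random(seed)
    chain = [instruction]
    for _ in range(rounds):
        template = rng.choice(EVOLUTION_TEMPLATES)
        chain.append(template.format(instruction=chain[-1]))
    return chain

chain = evolve_instruction("Sort a list of numbers in Python.")
print(len(chain))  # seed instruction plus 3 evolved prompts
```

Each evolved prompt embeds the previous one, so one seed instruction yields a chain of increasingly complex training examples.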
Capabilities
Multimodal · Function Calling · Tool Use · JSON Mode
Providers (3)
| Provider | Input (per 1M tokens) | Output (per 1M tokens) | Type |
|---|---|---|---|
| deepinfra API | — | — | Serverless |
| Lepton AI API | — | — | Serverless |
| OctoAI API | $1.20 | $1.20 | Serverless |
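Since the listed providers expose the model as a serverless HTTP API, a minimal sketch of a request against an OpenAI-compatible chat-completions endpoint is shown below. The endpoint URL and model identifier are assumptions to verify against each provider's own documentation; the example only builds the request rather than sending it:

```python
import json
import urllib.request

# Assumed values: deepinfra offers an OpenAI-compatible API, but check the
# provider's docs for the exact base URL and model identifier.
API_URL = "https://api.deepinfra.com/v1/openai/chat/completions"
MODEL_ID = "microsoft/WizardLM-2-8x22B"

def build_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat-completions request."""
    body = json.dumps({
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("Say hello.", api_key="YOUR_KEY")
payload = json.loads(req.data)
print(payload["model"])
```

Sending the request with `urllib.request.urlopen(req)` (or any HTTP client) would return a JSON response whose `choices[0].message.content` field holds the model's reply, following the OpenAI chat-completions response shape.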