Pricing
| Type | Price (per 1M tokens) |
|---|---|
| Input tokens | $0.65 |
| Output tokens | $0.65 |
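With both rates at $0.65 per million tokens, per-request cost is a simple linear function of token counts. A minimal sketch (the `estimate_cost` helper is hypothetical, not part of any official SDK):

```python
# Prices from the table above, in USD per 1M tokens.
INPUT_PRICE_PER_M = 0.65
OUTPUT_PRICE_PER_M = 0.65

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost for one request at the listed rates."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# e.g. a 2,000-token prompt with a 500-token completion:
print(round(estimate_cost(2_000, 500), 6))  # → 0.001625
```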
Capabilities
Vision, Multimodal, Reasoning, Function Calling, Tool Use, JSON Mode, Code Execution
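For capabilities like JSON Mode, providers that expose this model typically accept an OpenAI-compatible request body. A minimal sketch of such a payload; the model id `microsoft/wizardlm-2-8x22b` and the `response_format` field are assumptions about the hosting provider's API, so check its documentation for exact names:

```python
import json

# Hypothetical OpenAI-compatible chat request enabling JSON mode.
payload = {
    "model": "microsoft/wizardlm-2-8x22b",  # assumed provider model id
    "messages": [
        {"role": "user", "content": "List three primary colors as JSON."}
    ],
    # JSON mode: asks the server to constrain output to valid JSON.
    "response_format": {"type": "json_object"},
}

print(json.dumps(payload, indent=2))
```

The payload is only constructed here, not sent; POST it to the provider's chat-completions endpoint with your credentials.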
About WizardLM-2 8x22B
WizardLM-2 8x22B, developed by the WizardLM team at Microsoft AI, is a large language model (LLM) with 141 billion total parameters in a Mixture of Experts (MoE) architecture. It targets complex tasks such as chat, multilingual conversation, reasoning, and agent-based interactions. Trained with an AI-powered synthetic-data system incorporating techniques such as Evol-Instruct and AI Align AI, the model outperforms many open-source alternatives on common benchmarks. Its initial release was briefly withdrawn to complete toxicity testing, and further work remains to address potential biases and improve reliability.
Model Specs
Released: 2024-04-15
Parameters: 8x22B (141B total)
Architecture: Mixture of Experts