
WizardLM-2
About
WizardLM-2 is a family of advanced large language models (LLMs) developed by Microsoft AI. The series comprises three models: WizardLM-2 8x22B, WizardLM-2 70B, and WizardLM-2 7B. The flagship, WizardLM-2 8x22B, uses a Mixture of Experts (MoE) architecture with 141 billion parameters and is built on the Mixtral-8x22B-v0.1 base model. The family was trained with a fully AI-powered synthetic training system, which contributes to its capabilities across writing, coding, mathematics, and multilingual tasks. These models deliver highly competitive performance in complex chat, multilingual tasks, reasoning, and agent-based interactions, often rivaling or surpassing proprietary models on benchmarks such as MT-Bench.
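To make the Mixture of Experts idea concrete, here is a minimal NumPy sketch of top-k expert routing, the mechanism Mixtral-style MoE layers are based on. This is an illustration only, not WizardLM-2's actual implementation: the dimensions, the `moe_forward` function, and the dense per-expert matrices are invented for demonstration (real MoE layers use feed-forward expert networks and learned routers).

```python
import numpy as np

def moe_forward(x, gate_w, expert_ws, top_k=2):
    """Route each token to its top-k experts and mix their outputs.

    x:         (tokens, d_model) input activations
    gate_w:    (d_model, n_experts) router weights
    expert_ws: list of (d_model, d_model) per-expert weights (toy stand-ins
               for the feed-forward expert networks in a real MoE layer)
    """
    logits = x @ gate_w                           # router scores per expert
    top = np.argsort(logits, axis=1)[:, -top_k:]  # indices of top-k experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        scores = logits[t, top[t]]
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()                  # softmax over chosen experts
        for w, e in zip(weights, top[t]):
            out[t] += w * (x[t] @ expert_ws[e])   # weighted expert mixture
    return out

rng = np.random.default_rng(0)
d_model, n_experts = 8, 4
x = rng.normal(size=(3, d_model))
gate_w = rng.normal(size=(d_model, n_experts))
expert_ws = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]
y = moe_forward(x, gate_w, expert_ws)
print(y.shape)  # (3, 8)
```

The key property this illustrates is why a 141B-parameter MoE can be cheaper to run than a dense model of the same size: each token activates only `top_k` of the experts, so most parameters sit idle on any given forward pass.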