LLM Reference

WizardLM-2 8x22B

About

WizardLM-2 8x22B, developed by WizardLM@Microsoft AI, is a large language model (LLM) with 141 billion total parameters built on a Mixture of Experts (MoE) architecture. It excels at complex tasks such as chat, multilingual conversation, reasoning, and agent-based interaction. Trained with a fully AI-powered synthetic training system incorporating techniques such as Evol-Instruct and AI Align AI (AAA), the model outperforms many open-source alternatives. Despite its strong benchmark results, further research is needed to address potential biases and to improve reliability, particularly around toxicity testing.
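
The MoE design means only a subset of the eight experts runs for each token, which is how a 141B-parameter model keeps per-token compute modest. As a rough illustration only (a minimal sketch, not WizardLM-2's actual implementation; the layer sizes are placeholders), a top-2 gated MoE layer works like this:

```python
import numpy as np

def top2_moe_layer(x, gate_w, experts):
    """Minimal top-2 MoE routing sketch (illustrative, not WizardLM-2's code).

    x       : (d,) token hidden state
    gate_w  : (d, n_experts) router weights
    experts : list of callables, each mapping (d,) -> (d,)
    """
    logits = x @ gate_w                        # router score per expert
    top2 = np.argsort(logits)[-2:]             # indices of the 2 best experts
    weights = np.exp(logits[top2])
    weights /= weights.sum()                   # softmax over the selected pair
    # Only the chosen experts execute, so compute scales with 2 experts, not 8.
    return sum(w * experts[i](x) for w, i in zip(weights, top2))

# Toy usage: 8 experts, 16-dim hidden state.
rng = np.random.default_rng(0)
d, n = 16, 8
experts = [lambda x, W=rng.normal(size=(d, d)) / d: x @ W for _ in range(n)]
y = top2_moe_layer(rng.normal(size=d), rng.normal(size=(d, n)), experts)
print(y.shape)  # (16,)
```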

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution
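
Function calling and structured outputs can be exercised through any of the OpenAI-compatible endpoints in the Providers table below. The following is a minimal sketch assuming an OpenRouter-style base URL; the model id (microsoft/wizardlm-2-8x22b) and the get_weather tool are illustrative assumptions to check against your provider's documentation:

```python
from openai import OpenAI

# Assumed OpenAI-compatible endpoint; substitute your provider's base URL and key.
client = OpenAI(base_url="https://openrouter.ai/api/v1", api_key="YOUR_API_KEY")

# One illustrative tool, declared in the standard JSON-schema format.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

resp = client.chat.completions.create(
    model="microsoft/wizardlm-2-8x22b",  # assumed id; naming varies by provider
    messages=[{"role": "user", "content": "What's the weather in Oslo?"}],
    tools=tools,
)
print(resp.choices[0].message.tool_calls)
```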

Providers (4)

| Provider | Input ($ per 1M tokens) | Output ($ per 1M tokens) | Type |
|---|---|---|---|
| DeepInfra | $0.65 | $0.65 | Serverless |
| Lepton AI API | $0.50 | $0.50 | Serverless |
| OctoAI API | $1.20 | $1.20 | Serverless |
| OpenRouter | $0.62 | $0.62 | Serverless |
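
Since all four providers bill per million tokens, the cost of a single request is a linear combination of input and output token counts. A small helper using the prices from the table above:

```python
# Per-1M-token prices (USD) copied from the table above: (input, output).
PRICES = {
    "DeepInfra":     (0.65, 0.65),
    "Lepton AI API": (0.50, 0.50),
    "OctoAI API":    (1.20, 1.20),
    "OpenRouter":    (0.62, 0.62),
}

def request_cost(provider, input_tokens, output_tokens):
    """USD cost of one request: tokens / 1e6 * price-per-1M-tokens."""
    inp, out = PRICES[provider]
    return input_tokens / 1e6 * inp + output_tokens / 1e6 * out

# Example: a 2,000-token prompt with an 800-token completion.
for name in PRICES:
    print(f"{name}: ${request_cost(name, 2_000, 800):.6f}")
```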

Benchmark Scores (2)

| Benchmark | Score | Setting | Source |
|---|---|---|---|
| HumanEval | 67.5 | pass@1 | Open LLM Leaderboard |
| Massive Multitask Language Understanding (MMLU) | 76.9 | 5-shot | Open LLM Leaderboard |
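
The HumanEval score is reported as pass@1: the fraction of problems solved by a single sampled completion. When n completions per problem are sampled, pass@k is usually computed with the unbiased estimator from the original HumanEval paper (Chen et al., 2021):

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator (Chen et al., 2021).

    n: completions sampled per problem, c: correct completions, k: budget.
    pass@k = 1 - C(n-c, k) / C(n, k)
    """
    if n - c < k:
        return 1.0  # any k-subset of the n samples contains a correct one
    return 1.0 - comb(n - c, k) / comb(n, k)

# With 10 samples and 3 correct, pass@1 reduces to c/n.
print(pass_at_k(10, 3, 1))  # 0.3
```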

Specifications

Released: 2024-01-09
Parameters: 8x22B
Architecture: Mixture of Experts
Specialization: general
Training: fine-tuning
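
Note that "8x22B" does not mean 8 × 22B = 176B total parameters: in Mixtral-style MoE models the attention and embedding weights are shared across experts, which is consistent with the 141B figure cited above. A back-of-envelope check (assuming, purely as an illustration, that each 22B "expert path" consists of shared weights plus one expert's feed-forward block):

```python
# Back-of-envelope MoE parameter accounting; the split below is an assumed
# Mixtral-style decomposition, not a published WizardLM-2 breakdown.
per_path, total, n_experts, top_k = 22, 141, 8, 2  # billions of parameters

# Solve: shared + ffn = per_path; shared + n_experts * ffn = total.
ffn = (total - per_path) / (n_experts - 1)  # ~17B per expert FFN block
shared = per_path - ffn                     # ~5B shared attention/embeddings
active = shared + top_k * ffn               # ~39B active per token (top-2)
print(ffn, shared, active)                  # 17.0 5.0 39.0
```

Under these assumptions only about 39B parameters are active per token, matching the figure commonly cited for Mixtral-8x22B-based models.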

Created by

WizardLM@Microsoft AI