LLM Reference

K-EXAONE 236B-A23B

Open Source

About

LG AI Research's K-EXAONE 236B-A23B is a large-scale open-source Mixture-of-Experts model featuring 236B total parameters with 23B active during inference. It uses a fine-grained MoE design and supports six languages: Korean, English, Spanish, German, Japanese, and Vietnamese. K-EXAONE achieved first place in 10 of 13 benchmarks under South Korea's national AI foundation model project and ranked 7th globally on the Artificial Analysis Intelligence Index at launch. Free API access was available via Friendli.ai through January 2026.
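The 236B-total / 23B-active split above means only a small fraction of the MoE model's parameters are used per forward pass; a quick check of that ratio (using only the figures stated here):

```python
# Parameter counts as stated in the model card.
total_params = 236e9   # 236B total parameters
active_params = 23e9   # 23B parameters active per token during inference

# Fraction of the model activated on each forward pass.
active_fraction = active_params / total_params
print(f"{active_fraction:.1%}")  # prints "9.7%"
```

So roughly one-tenth of the weights participate in any given token's computation, which is what keeps inference cost closer to a ~23B dense model than a 236B one.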

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution

Benchmark Scores (1)

Benchmark          Score  Version  Source
Google-Proof Q&A   78.3   diamond  Artificial Analysis

Specifications

Family: K-EXAONE
Released: 2025-12-31
Parameters: 236B total (23B active)
Context: 256K
Architecture: Mixture-of-Experts (MoE)

Created by

LG AI Research
"Advancing AI for a Better Life"

South Korea
Founded 2020