LLM Reference

Gemma 4 26B A4B

Open Source · Multimodal

About

A Mixture of Experts (MoE) model with 26B total parameters, activating 4B per token. It delivers efficient, high-performance inference with advanced reasoning and supports text, image, and video inputs.
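The "4B active per token" figure comes from top-k expert routing: a small gating network picks a few experts for each token, so only those experts' weights participate in that forward pass. Below is a minimal, illustrative sketch of this routing pattern in PyTorch; the expert count, hidden sizes, and k are placeholders, not this model's actual configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoELayer(nn.Module):
    """Illustrative top-k MoE layer: only k experts run per token,
    so active parameters are a small fraction of the total."""
    def __init__(self, d_model=64, n_experts=8, k=2):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, n_experts)   # gating network
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, 4 * d_model),
                          nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):                              # x: (tokens, d_model)
        logits = self.router(x)                        # (tokens, n_experts)
        weights, idx = logits.topk(self.k, dim=-1)     # keep k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e               # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

layer = TinyMoELayer()
tokens = torch.randn(10, 64)
print(layer(tokens).shape)                             # torch.Size([10, 64])
```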

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution
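Capabilities such as JSON Mode and Function Calling are usually exercised through a chat-completions-style API. The sketch below assumes an OpenAI-compatible server hosting the model; the endpoint URL, model identifier, and support for the response_format field are assumptions to verify against your deployment, not details confirmed by this listing.

```python
import requests

# Hypothetical OpenAI-compatible endpoint; URL and model name are placeholders.
url = "http://localhost:8000/v1/chat/completions"
payload = {
    "model": "gemma-4-26b-a4b",
    "messages": [
        {"role": "user",
         "content": "List two EU capitals as JSON with keys 'city' and 'country'."}
    ],
    # JSON mode: ask the server to constrain output to valid JSON
    # (offered by many OpenAI-compatible servers; check your server's docs).
    "response_format": {"type": "json_object"},
}
response = requests.post(url, json=payload, timeout=60)
print(response.json()["choices"][0]["message"]["content"])
```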

Specifications

Family: Gemma 4
Released: 2026-04-02
Parameters: 26B (26,000,000,000)
Context: 256K tokens
Specialization: Mixture of Experts