LLM Reference

Sarvam 30B

About

Sarvam 30B is an open-source mixture-of-experts (MoE) reasoning model with 30B total parameters. It is optimized for math, coding, and knowledge tasks, with state-of-the-art performance across 22 Indian languages, and supports multilingual voice calls and streaming inference.
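
As a rough sketch of what streaming inference could look like, the snippet below uses the OpenAI Python SDK against an assumed OpenAI-compatible endpoint. The base URL and the sarvam-30b model identifier are illustrative placeholders, not documented values.

# Streaming-inference sketch. ASSUMPTION: the model is served behind an
# OpenAI-compatible chat completions endpoint; base_url and model name
# below are hypothetical, not taken from Sarvam documentation.
from openai import OpenAI

client = OpenAI(
    base_url="https://example.com/v1",  # hypothetical endpoint
    api_key="YOUR_API_KEY",
)

stream = client.chat.completions.create(
    model="sarvam-30b",  # hypothetical model identifier
    messages=[{"role": "user", "content": "Explain MoE routing in two sentences."}],
    stream=True,  # tokens arrive incrementally instead of as one final response
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)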

Capabilities

Vision
Multimodal
Reasoning
Function Calling
Tool Use
JSON Mode
Code Execution
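
The JSON Mode capability implies structured output support. Below is a minimal sketch under the same assumptions as above (OpenAI-compatible serving, hypothetical model name); whether Sarvam's actual stack exposes the standard response_format parameter is itself an assumption.

# JSON-mode sketch: request machine-readable output. ASSUMPTION: the
# OpenAI-style response_format parameter is supported by the serving stack.
import json
from openai import OpenAI

client = OpenAI(base_url="https://example.com/v1", api_key="YOUR_API_KEY")

resp = client.chat.completions.create(
    model="sarvam-30b",  # hypothetical identifier
    messages=[
        {"role": "system", "content": "Reply only with a JSON object."},
        {"role": "user", "content": "List three Indian languages with ISO 639-1 codes."},
    ],
    response_format={"type": "json_object"},  # constrains decoding to valid JSON
)

data = json.loads(resp.choices[0].message.content)
print(data)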

Specifications

Family: Sarvam
Released: 2026-03-22
Parameters: 30B total (2.4B active)
Context: 65,536 tokens
Architecture: MoE
Specialization: reasoning
License: Apache 2.0
Training: pretraining