LLM Reference

Granite 4.1 30B Base

granite-4.1-30b-base

Open Source

About

IBM Granite 4.1 30B Base (29B actual parameters) is a pre-training checkpoint with a 512K-token context window, achieved via long-context extension in training phase 5. Architecture: dense decoder-only transformer with 64 layers, 4096 embedding size, 32 attention heads, 8 KV heads, a 32768 MLP hidden size with SwiGLU activation, and RoPE positional encoding. Trained on ~15 trillion tokens across 5 phases. Released under the Apache 2.0 license.

Granite 4.1 30B Base has a 512K-token context window.
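To give a rough sense of what a 512K-token window costs at inference time, the sketch below estimates the full KV-cache size from the listed architecture. The head dimension (128) and fp16 precision are assumptions, not stated on this page.

```python
# Rough KV-cache size at the full 512K context.
# head_dim = 128 and 2-byte (fp16/bf16) elements are assumptions.

layers = 64
n_kv_heads = 8
head_dim = 128        # assumed: 4096 embed / 32 heads
seq_len = 512 * 1024  # 512K tokens
bytes_per_elem = 2    # fp16/bf16

# Keys and values each store seq_len * n_kv_heads * head_dim elements per layer.
kv_bytes = 2 * layers * n_kv_heads * head_dim * seq_len * bytes_per_elem
print(f"{kv_bytes / 2**30:.0f} GiB")  # → 128 GiB
```

Grouped-query attention (8 KV heads vs. 32 attention heads) already shrinks this cache 4x relative to full multi-head attention; even so, filling the whole window is memory-intensive.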

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution

Specifications

Released: 2026-04-29
Parameters: 29B
Context: 512K
Architecture: dense decoder-only transformer; 64 layers, 4096 embedding size, 32 attention heads, 8 KV heads, 32768 MLP hidden size (SwiGLU), RoPE
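As a sanity check, the architecture numbers above roughly reproduce the listed 29B parameter count. The head dimension, vocabulary size, and tied embeddings in this sketch are assumptions (the page does not state them), so treat the result as a back-of-the-envelope estimate only.

```python
# Rough parameter-count estimate from the listed architecture.
# head_dim, vocab size, and tied embeddings are ASSUMPTIONS, not page data.

layers = 64
d_model = 4096
n_kv_heads = 8
mlp_hidden = 32768
head_dim = 128    # assumed: d_model / 32 heads
vocab = 100_000   # assumed

# Attention: Q and O project d_model -> d_model; K and V project to
# n_kv_heads * head_dim (grouped-query attention).
attn = layers * (2 * d_model * d_model + 2 * d_model * n_kv_heads * head_dim)

# SwiGLU MLP uses three matrices: gate and up (d_model -> mlp_hidden), down (mlp_hidden -> d_model).
mlp = layers * 3 * d_model * mlp_hidden

embed = vocab * d_model  # tied input/output embedding (assumed)

total = attn + mlp + embed
print(f"{total / 1e9:.1f}B")  # → 28.9B, close to the listed 29B
```

Note that nearly 90% of the parameters sit in the SwiGLU MLPs, a consequence of the wide 32768 hidden size relative to the 4096 embedding.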

Created by

IBM

Creating reliable and adaptable AI solutions

Armonk, New York, United States
Founded 1911