LLM Reference

YandexGPT 5 Lite

Open Source

About

An 8B-parameter model optimized for speed and real-time responses, with bilingual (Russian and English) support and a LLaMA-like decoder-only architecture. Trained on 15 trillion tokens in two stages, with the context length increased to 32k in stage 2.
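The 32k context window bounds the prompt plus any generated tokens. A minimal sketch of that budgeting check; the exact limit (32,000 vs 32,768 tokens) is an assumption, since the card only says "32k":

```python
CONTEXT_LIMIT = 32_768  # assumed value for "32k"; check the model docs

def fits_context(prompt_tokens: int, max_new_tokens: int,
                 limit: int = CONTEXT_LIMIT) -> bool:
    """Return True if the prompt plus requested generation fits the window."""
    return prompt_tokens + max_new_tokens <= limit

# A 30,000-token prompt leaves at most 2,768 tokens of headroom.
print(fits_context(30_000, 2_000))  # True
print(fits_context(30_000, 3_000))  # False
```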

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, JSON Mode, Code Execution

API Versions

instruct, pretrain

Specifications

Family: YandexGPT
Released: 2025-02-25
Parameters: 8B (8,000,000,000)
Context: 32k
Architecture: Decoder Only
Specialization: general

Created by

Yandex — innovative AI merging search with language.

Moscow, Russia
Founded 1997