LLM Reference

Yi 34B 200K

About

Yi 34B 200K is a large language model from 01.AI with 34 billion parameters and a 200,000-token context window for handling very long inputs. It is a decoder-only Transformer that closely follows the Llama architecture, adopting Grouped-Query Attention and RoPE with an adjusted base frequency to extend the usable context length. The model was pretrained on roughly 3 trillion tokens of multilingual data and is strongest at language comprehension, commonsense reasoning, and bilingual use in English and Chinese; in the "Needle-in-a-Haystack" long-context retrieval test it reportedly reaches a 99.8% success rate. Like other LLMs, it remains subject to hallucination and non-deterministic outputs.
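To get a feel for what a 200,000-token window means in practice, the sketch below estimates whether a document fits. The ~4 characters-per-token ratio is a crude heuristic for English text (an assumption here, not Yi's actual tokenizer); for exact counts you would run the model's own tokenizer.

```python
# Rough check of whether a document fits in Yi 34B 200K's context window.
# CHARS_PER_TOKEN = 4 is a heuristic average for English text, not the
# model's real tokenizer ratio.

CONTEXT_WINDOW = 200_000  # tokens
CHARS_PER_TOKEN = 4       # heuristic; actual ratio varies by language/content

def estimate_tokens(text: str) -> int:
    """Estimate the token count of `text` with the chars/4 heuristic."""
    return max(1, len(text) // CHARS_PER_TOKEN)

def fits_in_context(text: str, reserve_for_output: int = 4_000) -> bool:
    """True if the prompt plus a reserved output budget fits in 200K tokens."""
    return estimate_tokens(text) + reserve_for_output <= CONTEXT_WINDOW

doc = "word " * 50_000  # ~250,000 characters, roughly 62,500 tokens
print(estimate_tokens(doc), fits_in_context(doc))
```

By this estimate, a 200K window covers on the order of 800,000 characters of English, i.e. several full-length books in a single prompt.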

Capabilities

Multimodal · Function Calling · Tool Use · JSON Mode

Providers (2)

Provider                Input (per 1M)   Output (per 1M)   Type
Alibaba Cloud PAI-EAS   -                -                 Serverless
Replicate API           -                -                 Serverless
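Both providers above host the model as a serverless endpoint. As a minimal sketch, the snippet below builds an OpenAI-style chat-completions payload; the model slug "yi-34b-200k" and the assumption that a given provider exposes an OpenAI-compatible route are hypothetical — check the provider's own API documentation for the real endpoint, model name, and authentication scheme.

```python
import json

def build_chat_payload(user_message: str, max_tokens: int = 512) -> dict:
    """Construct a chat-completions request body (not sent anywhere here)."""
    return {
        "model": "yi-34b-200k",  # hypothetical slug; provider naming differs
        "messages": [
            {"role": "user", "content": user_message},
        ],
        "max_tokens": max_tokens,
        "temperature": 0.7,
    }

payload = build_chat_payload("Summarize this contract in three bullet points.")
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to the provider's chat endpoint with an API key in the request headers.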

Specifications

Parameters:      34B
Context:         200K
Architecture:    Decoder-only
Specialization:  General
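The parameter count above translates directly into a minimum weight-memory budget for self-hosting: memory ≈ parameter count × bytes per parameter. The sketch below works this out for common precisions; note that real deployments also need KV-cache and activation memory on top, which grows with the 200K context and is not estimated here.

```python
# Back-of-the-envelope weight memory for a 34B-parameter model.
PARAMS = 34e9  # 34 billion parameters

def weight_memory_gb(bytes_per_param: float) -> float:
    """Weight memory in decimal gigabytes at the given precision."""
    return PARAMS * bytes_per_param / 1e9

for name, bpp in [("fp16/bf16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{name}: ~{weight_memory_gb(bpp):.0f} GB")
```

So the weights alone take roughly 68 GB at fp16/bf16, 34 GB at int8, and 17 GB at int4, before any context-length overhead.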