LLM Reference
Fireworks AI

Yi 34B 200K on Fireworks AI

Yi (2023/11) · 01.AI

Serverless

Pricing

Type            Price (per 1M tokens)
Input tokens    $0.90
Output tokens   $0.90
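At a flat $0.90 per million tokens for both input and output, estimating the cost of a request is a single multiplication; a minimal sketch using the prices above:

```python
# Cost estimator for Yi 34B 200K on Fireworks AI.
# Prices from the table above: $0.90 per 1M tokens, input and output alike.
PRICE_PER_M_INPUT = 0.90
PRICE_PER_M_OUTPUT = 0.90

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated cost in USD for one request."""
    return (input_tokens * PRICE_PER_M_INPUT
            + output_tokens * PRICE_PER_M_OUTPUT) / 1_000_000

# Filling the full 200K context plus a 1K-token completion:
print(f"${estimate_cost(200_000, 1_000):.4f}")  # → $0.1809
```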

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution

About Yi 34B 200K

Yi 34B 200K is a large language model from 01.AI with 34 billion parameters and a 200,000-token context window for handling long inputs. Its architecture follows the standard Llama-style Transformer decoder, with Grouped-Query Attention (GQA) and Rotary Position Embedding (RoPE) with an adjusted base frequency to extend the context window. Its strengths include language comprehension, commonsense reasoning, and bilingual support for English and Chinese. Like other LLMs, it remains subject to hallucination and non-deterministic output. The model was trained on a multilingual corpus of roughly 3 trillion tokens and achieves a reported 99.8% success rate on long-context retrieval benchmarks.
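The "RoPE with adjusted base frequency" mentioned above extends the usable context by raising the rotary base, which lengthens the wavelength of each positional rotation so distant positions stay distinguishable. A minimal sketch of how the per-dimension frequencies change; the head dimension and the larger base value here are illustrative assumptions, not taken from this page:

```python
def rope_frequencies(dim: int, base: float) -> list[float]:
    """Per-pair rotary frequencies: theta_i = base^(-2i/dim)."""
    return [base ** (-2 * i / dim) for i in range(dim // 2)]

head_dim = 128  # typical attention head dimension; an assumption
standard = rope_frequencies(head_dim, 10_000.0)     # common default base
adjusted = rope_frequencies(head_dim, 5_000_000.0)  # illustrative larger base

# A larger base lowers every non-trivial frequency, stretching the
# positional "wavelengths" to cover a much longer context.
for i in (1, 32, 63):
    print(f"dim pair {i}: base 10K -> {standard[i]:.2e}, "
          f"larger base -> {adjusted[i]:.2e}")
```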

Get Started
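Fireworks AI serves models through an OpenAI-compatible REST chat-completions API. A minimal sketch of a request body; the model identifier below follows Fireworks' usual `accounts/fireworks/models/<name>` naming and is an assumption, not taken from this page:

```python
import json

# Fireworks AI's OpenAI-compatible chat-completions endpoint.
API_URL = "https://api.fireworks.ai/inference/v1/chat/completions"

payload = {
    # Assumed model identifier for Yi 34B 200K on Fireworks AI.
    "model": "accounts/fireworks/models/yi-34b-200k",
    "messages": [
        {"role": "user", "content": "Summarize this document in one sentence."}
    ],
    "max_tokens": 256,
    "temperature": 0.6,
}

body = json.dumps(payload)

# To actually send it (requires a Fireworks API key):
#   import urllib.request
#   req = urllib.request.Request(
#       API_URL, data=body.encode(),
#       headers={"Authorization": "Bearer <FIREWORKS_API_KEY>",
#                "Content-Type": "application/json"})
#   print(urllib.request.urlopen(req).read().decode())
print(body)
```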

Model Specs

Released            2023-11-02
Parameters          34B
Context             200K
Architecture        Decoder Only
Knowledge cutoff    2024-03
