Pricing
| Type | Price (per 1M tokens) |
|---|---|
| Input tokens | $0.90 |
| Output tokens | $0.90 |
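Since input and output tokens share the same rate, estimating a request's cost is a single multiplication. A minimal sketch (the token counts in the example are hypothetical):

```python
# Estimate request cost for Yi 34B 200K at $0.90 per 1M tokens,
# applied to both input and output, per the pricing table above.
PRICE_PER_M_TOKENS = 0.90  # USD per 1,000,000 tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of a single request."""
    return (input_tokens + output_tokens) / 1_000_000 * PRICE_PER_M_TOKENS

# e.g. a 150K-token prompt with a 2K-token completion:
print(f"${request_cost(150_000, 2_000):.4f}")  # → $0.1368
```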
About Yi 34B 200K
Yi 34B 200K is a large language model from 01.AI with 34 billion parameters and a 200,000-token context window, letting it handle very long text inputs. It is built on a Llama-style Transformer architecture and uses Grouped-Query Attention together with RoPE positional embeddings whose base frequency was adjusted to support the extended context. The model is strong in language comprehension and commonsense reasoning, with bilingual support for English and Chinese, and was trained on a multilingual corpus of roughly 3 trillion tokens; 01.AI reports a 99.8% success rate on long-context retrieval ("needle-in-a-haystack") benchmarks. Like other LLMs, it remains subject to hallucination and non-deterministic output.
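The role of RoPE's base frequency can be sketched numerically: raising the base stretches the longest rotation wavelength, so positional encodings stay distinguishable over far more tokens. The sketch below uses the standard RoPE frequency formula; the 5,000,000 base is illustrative, not Yi's published value.

```python
import math

def rope_inv_freq(head_dim: int, base: float) -> list[float]:
    # Standard RoPE inverse frequencies: base^(-2i/d) for each dimension pair i.
    return [base ** (-2 * i / head_dim) for i in range(head_dim // 2)]

def max_wavelength(head_dim: int, base: float) -> float:
    # Positions per full rotation of the slowest-rotating pair.
    return 2 * math.pi / min(rope_inv_freq(head_dim, base))

head_dim = 128
for base in (10_000.0, 5_000_000.0):  # larger base -> longer usable context
    print(f"base={base:>11,.0f}  max wavelength ~ {max_wavelength(head_dim, base):,.0f} positions")
```

With the default base of 10,000 the slowest frequency completes a rotation within a few tens of thousands of positions; the larger base pushes that well past the 200K-token window.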