LLM Reference

Yi 34B 200K

About

Yi 34B 200K is a large language model from 01.AI with 34 billion parameters and a 200,000-token context window, built for NLP tasks that involve very long inputs. It uses a decoder-only Transformer architecture in the style of Llama, with Grouped-Query Attention and RoPE whose base frequency is adjusted to support the extended context. Its strengths include language comprehension, commonsense reasoning, and bilingual support for English and Chinese. Like other LLMs, it remains subject to hallucination and non-deterministic output. The model was trained on a multilingual corpus of roughly 3 trillion tokens, and 01.AI reports a 99.8% success rate on long-context retrieval ("needle-in-a-haystack") evaluation.
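The long-context capability hinges on that RoPE base-frequency adjustment: enlarging the base slows the rotation of each dimension pair, so positional phases stay distinguishable over far longer sequences. A minimal sketch of the idea follows; the enlarged base value of 5,000,000 is an assumption for illustration, not a figure stated in this document.

```python
import math

def rope_inv_freq(dim: int, base: float) -> list[float]:
    """Inverse rotation frequency for each pair of hidden dimensions."""
    return [base ** (-2 * i / dim) for i in range(dim // 2)]

# Common default base vs. an enlarged base (value assumed for illustration).
standard = rope_inv_freq(128, 10_000.0)
long_ctx = rope_inv_freq(128, 5_000_000.0)

def rotations_at(pos: int, inv_freq: list[float]) -> list[float]:
    """Rotation angle (radians) of each dimension pair at a given position."""
    return [pos * f for f in inv_freq]

# At position 200,000 the slowest-rotating pair completes far fewer full
# turns under the enlarged base, keeping distant positions distinct.
slow_std = rotations_at(200_000, standard)[-1] / (2 * math.pi)
slow_long = rotations_at(200_000, long_ctx)[-1] / (2 * math.pi)
```

Every frequency shrinks when the base grows, which is why the same rotary machinery generalizes to longer sequences after the adjustment.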

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution

Providers (3)

Provider                Input (per 1M)   Output (per 1M)   Type
Alibaba Cloud PAI-EAS   —                —                 Serverless
Fireworks AI            $0.90            $0.90             Serverless
Replicate API           $0.20            $1.00             Serverless
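The per-1M-token rates above translate directly into per-call costs. A small sketch, using a hypothetical call of 100K input tokens and 1K output tokens (the example sizes are assumptions, not provider data):

```python
def call_cost(in_tokens: int, out_tokens: int,
              in_rate: float, out_rate: float) -> float:
    """USD cost of one call given per-1M-token rates, as in the table above."""
    return in_tokens / 1e6 * in_rate + out_tokens / 1e6 * out_rate

# Hypothetical call: 100K tokens in, 1K tokens out.
fireworks_cost = call_cost(100_000, 1_000, 0.90, 0.90)   # ≈ $0.0909
replicate_cost = call_cost(100_000, 1_000, 0.20, 1.00)   # ≈ $0.0210
```

Note the asymmetric Replicate pricing: for input-heavy workloads (long documents, short answers) its lower input rate dominates, while output-heavy workloads narrow the gap.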


Specifications

Released: 2023-11-02
Parameters: 34B
Context: 200K
Architecture: Decoder-only
Knowledge cutoff: 2024-03
Specialization: General
Training: Fine-tuning
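Even a 200K-token window has to be budgeted between prompt and completion. A rough sketch of that budgeting (the 4-characters-per-token heuristic is an assumption for estimation, not a tokenizer guarantee):

```python
import math

CONTEXT_WINDOW = 200_000  # tokens, per the specification above

def max_prompt_tokens(reserved_output: int,
                      window: int = CONTEXT_WINDOW) -> int:
    """Tokens left for the prompt after reserving room for the completion."""
    if reserved_output >= window:
        raise ValueError("reserved output exceeds the context window")
    return window - reserved_output

def fits(prompt_chars: int, reserved_output: int,
         chars_per_token: float = 4.0) -> bool:
    """Rough check that a prompt fits; 4 chars/token is only a heuristic."""
    est_tokens = math.ceil(prompt_chars / chars_per_token)
    return est_tokens <= max_prompt_tokens(reserved_output)

# e.g. a ~600,000-character document with 4,096 tokens reserved for output
# estimates to ~150,000 prompt tokens, well inside the window.
ok = fits(600_000, 4_096)
```

In practice the provider's tokenizer gives the exact count; the heuristic is only for a quick feasibility check before sending a request.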

Created by

01.AI
Developing AI for creative and generative tasks.
Beijing, China
Founded 2023