LLM Reference

Baichuan 2 7B

About

Baichuan 2 7B is a large language model developed by Baichuan Intelligent Technology, with 7 billion parameters trained on 2.6 trillion tokens. Built on the Transformer architecture, it handles a wide range of natural language processing tasks, including text generation, question answering, and multilingual translation. It scores strongly on benchmarks such as C-Eval and MMLU, reportedly surpassing LLaMA-7B and competing with GPT-3.5 Turbo on some Chinese-language evaluations. Supporting both Chinese and English, it has a context window of 4096 tokens and is optimized for efficient inference, making it versatile across applications. The model weights are openly released: academic use is free, while commercial use requires obtaining permission from Baichuan.
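One practical consequence of the 4096-token context window is that long prompts must be trimmed before inference so that the prompt plus the generation budget still fits. The sketch below illustrates that bookkeeping; the function name `fit_prompt` and the whitespace-split "tokenizer" are hypothetical stand-ins (a real deployment would use the model's own tokenizer), while the 4096 figure comes from the description above.

```python
def fit_prompt(tokens, context_window=4096, max_new_tokens=512):
    """Truncate a token list so prompt + generation fits the context window.

    Keeps the most recent tokens (the tail), a common choice for chat-style
    prompts where the latest turns matter most.
    """
    budget = context_window - max_new_tokens
    if budget <= 0:
        raise ValueError("max_new_tokens must be smaller than the context window")
    return tokens[-budget:]

# Hypothetical whitespace "tokenizer" stands in for the real one.
tokens = ("tok " * 5000).split()          # 5000 pseudo-tokens, over the limit
trimmed = fit_prompt(tokens)
print(len(trimmed))                       # 4096 - 512 = 3584
```

The same arithmetic applies whichever serving stack hosts the model; only the tokenizer changes.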

Capabilities

Multimodal, Function Calling, Tool Use, JSON Mode

Providers (1)

Provider               Input (per 1M)   Output (per 1M)   Type
Alibaba Cloud PAI-EAS  —                —                 Serverless

Specifications

Parameters: 7B
Architecture: Decoder Only
Specialization: General