LLM Reference

Baichuan 7B

About

Baichuan-7B is an open-source large language model developed by Baichuan Intelligent Technology. Built on the Transformer architecture, it has 7 billion parameters, was trained on roughly 1.2 trillion tokens, supports both Chinese and English, and provides a 4,096-token context window. It handles common natural language processing tasks such as text generation, question answering, and language understanding, and at release it achieved state-of-the-art results among models of its size on benchmarks such as C-Eval and MMLU. Like other large language models, it can still produce factually incorrect or biased outputs.
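The 4,096-token context window means long prompts must be truncated before generation. A minimal, model-agnostic sketch of that budgeting logic (plain Python; the token-ID list is illustrative, not real tokenizer output):

```python
MAX_CONTEXT = 4096  # Baichuan-7B's context window, per the spec below

def clip_to_context(token_ids, max_new_tokens=64, max_context=MAX_CONTEXT):
    """Drop the oldest tokens so prompt + generated tokens fit the window."""
    budget = max_context - max_new_tokens
    return token_ids[-budget:] if len(token_ids) > budget else token_ids

# A 10,000-token prompt is clipped to the newest 4,032 tokens,
# leaving room for 64 generated tokens.
clipped = clip_to_context(list(range(10_000)))
print(len(clipped))  # 4032
```

In practice you would count tokens with the model's own tokenizer rather than a raw list, for example by loading the `baichuan-inc/Baichuan-7B` checkpoint through Hugging Face `transformers` (the repository ships custom modeling code, so `trust_remote_code=True` is required).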

Capabilities

- Multimodal
- Function Calling
- Tool Use
- JSON Mode

Specifications

Family: Baichuan
Parameters: 7B
Architecture: Decoder-only
Specialization: General