LLM Reference

About

Baichuan 2 is a family of multilingual large language models developed by Baichuan Intelligent Technology, available with 7 billion and 13 billion parameters. Each size comes as a base model and as a chat model fine-tuned for instruction following. Trained from scratch on 2.6 trillion tokens, the Baichuan 2 models perform strongly across public benchmarks, particularly in mathematics, code generation, and multilingual tasks. The models are open-sourced for both research and commercial use, subject to the licensing agreement. For efficient inference, 4-bit quantized versions of the chat models are provided, and Baichuan has also released intermediate training checkpoints to support research on the training dynamics of large language models.
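As a rough illustration of why the 4-bit quantized chat models matter for inference, a back-of-envelope estimate of weight-storage memory at different precisions (the helper below is an illustrative sketch, not part of any Baichuan tooling):

```python
def model_memory_gb(n_params: float, bits_per_param: int) -> float:
    """Approximate memory needed to store model weights, in gigabytes.

    Ignores activation memory, KV cache, and runtime overhead, so real
    requirements are somewhat higher.
    """
    return n_params * bits_per_param / 8 / 1e9

# Baichuan 2-13B weights at different precisions (approximate):
fp16 = model_memory_gb(13e9, 16)  # 16-bit floats
int4 = model_memory_gb(13e9, 4)   # 4-bit quantized
print(f"fp16: {fp16:.1f} GB, int4: {int4:.1f} GB")
# → fp16: 26.0 GB, int4: 6.5 GB
```

The 4x reduction is what brings the 13B chat model within reach of a single consumer GPU.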

Models (4)