LLM Reference

AquilaChat2-70B-Expr

About

AquilaChat2-70B-Expr is a large language model developed by the Beijing Academy of Artificial Intelligence (BAAI) as part of its Aquila2 series. This experimental model has 70 billion parameters and targets text generation, dialogue, and reasoning in both Chinese and English. It is trained on a bilingual corpus, with the goal of native-like fluency in both languages. The model is open-source, with its weights and training code released for further research and development. As an experimental release, its performance may vary, and it remains subject to limitations common to large language models, such as biases and gaps in up-to-date knowledge.

Capabilities

Multimodal
Function Calling
Tool Use
JSON Mode

Specifications

Family: Aquila 2
Parameters: 70B
Architecture: Decoder-only
Specialization: General