LLM Reference

Aquila Chat 33B

About

AquilaChat-33B is a bilingual (Chinese and English) large language model (LLM) developed by the Beijing Academy of Artificial Intelligence (BAAI). It uses a decoder-only transformer architecture similar to GPT-3 and LLaMA. The training data is a curated, high-quality corpus, roughly 40% of which is Chinese content drawn from over 10,000 domestic internet sources and literature, assembled in compliance with domestic data regulations. AquilaChat-33B is designed for dialogue and multilingual generation tasks, and it can be extended through integration with other models, such as AltDiffusion for multimodal tasks. While still under development, it faces limitations typical of LLMs, including difficulty with idiomatic expressions, possible biases, and significant computational demands. The model is partly open source under the Apache 2.0 license, with some restrictions on commercial use.
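For context on how a decoder-only chat model like this is typically queried, the sketch below loads it through the Hugging Face transformers API. This is a minimal illustration, not the official usage path: the model ID "BAAI/AquilaChat-33B", the prompt format, and the availability of remote-code support on the Hub are assumptions and should be checked against BAAI's official Aquila release (which also ships its own FlagAI tooling).

```python
# Minimal sketch of querying AquilaChat-33B with Hugging Face transformers.
# Assumptions (not confirmed by this page): the checkpoint is published on the
# Hub under an ID like "BAAI/AquilaChat-33B" and ships custom modeling code.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BAAI/AquilaChat-33B"  # hypothetical Hub ID; verify against the official release

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,  # Aquila checkpoints may rely on custom modeling code
    device_map="auto",       # shard the 33B weights across available GPUs
)

# The model is bilingual, so prompts can be written in Chinese or English.
prompt = "Introduce Beijing in one short paragraph, in both Chinese and English."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)

# Decode only the newly generated tokens, skipping the echoed prompt.
generated = outputs[0][inputs["input_ids"].shape[-1]:]
print(tokenizer.decode(generated, skip_special_tokens=True))
```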

Capabilities

Multimodal, Function Calling, Tool Use, JSON Mode

Specifications

Family: Aquila
Parameters: 33B
Architecture: Decoder Only
Specialization: General