LLM Reference

Aquila 33B

About

Aquila 33B is a bilingual large language model created by the Beijing Academy of Artificial Intelligence (BAAI), supporting both Chinese and English. With 33 billion parameters and a transformer architecture, it is positioned as a competitor to prominent models such as GPT-3. Notably, BAAI reports nearly eight times the training efficiency of earlier models, attributed to improved operator implementations and a redesigned bilingual tokenizer. The model targets tasks such as text generation, code generation, and conversational AI, and is released under an open-source license that permits commercial use. Despite its strengths, users should consider potential biases in the training data and the computational resources required for deployment.

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution

Specifications

Family: Aquila
Released: 2023-10-12
Parameters: 33B
Architecture: Decoder-only
Specialization: General
Training: Fine-tuning
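The 33B parameter count above dominates the hardware cost of deployment. A rough sketch of the memory needed just to hold the weights, using typical bytes-per-parameter for common formats (the format sizes are general assumptions, not Aquila-specific figures):

```python
# Back-of-the-envelope memory estimate for serving a 33B-parameter model.
# The parameter count comes from the spec sheet; the bytes-per-parameter
# values are typical for common weight formats (assumption, not Aquila docs).
PARAMS = 33e9

def weight_memory_gib(params: float, bytes_per_param: float) -> float:
    """Approximate GiB needed to hold the model weights alone
    (excludes activations, KV cache, and runtime overhead)."""
    return params * bytes_per_param / 2**30

for fmt, bpp in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{fmt:>9}: ~{weight_memory_gib(PARAMS, bpp):.0f} GiB")
```

At half precision this works out to roughly 60 GiB of weights, which is why serving a model of this size generally requires multiple accelerators or aggressive quantization.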

Created by

Beijing Academy of Artificial Intelligence (BAAI)
Open-source AI fostering global collaboration

Beijing, China
Founded 2018