LLM Reference

Aquila 2 7B

About

Aquila 2 7B is a bilingual large language model developed by the Beijing Academy of Artificial Intelligence (BAAI), capable of processing both Chinese and English. It uses a decoder-only transformer architecture with design influences drawn from GPT-3 and LLaMA. Training employed the HeuriMentor framework, which supports adaptive learning and efficient data management. With a 100,000-token vocabulary and support for sequences of up to 8,192 tokens, it performs well on a range of NLP tasks such as text generation. Notably more efficient than its predecessors, Aquila 2 7B is licensed for commercial use under a dedicated agreement, and its model weights and code are openly available.

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution

Specifications

Family: Aquila 2
Released: 2023-11-02
Parameters: 7B
Architecture: Decoder Only
Specialization: General
Training: Fine-tuning

Created by

Beijing Academy of Artificial Intelligence (BAAI): open-source AI fostering global collaboration.

Beijing, China
Founded 2018