LLM Reference

Aquila Chat 2 34B

About

AquilaChat2-34B, developed by the Beijing Academy of Artificial Intelligence (BAAI), is part of the Aquila2 series and is a large language model (LLM) designed for a wide variety of language processing tasks. It offers strong language understanding and generation, comparable to or surpassing GPT-3.5 in specific evaluations across eight secondary ability dimensions. It handles diverse tasks such as chatbots, content generation, question answering, and summarization, and supports both English and Chinese. Its architecture incorporates efficient training methods, and fine-tuning on high-quality long-dialogue datasets extends its context window to up to 16K tokens for long-text handling. Early versions faced issues with data leakage, which have likely been addressed in later iterations.
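Applications that rely on the extended 16K-token window typically trim older dialogue turns so the prompt stays within the model's context. The sketch below illustrates that idea only; the 4-characters-per-token heuristic and the helper names are illustrative assumptions, not part of the Aquila2 tooling (a real deployment would count tokens with the model's own tokenizer):

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token. Replace with the
    # model's actual tokenizer in a real deployment.
    return max(1, len(text) // 4)

def trim_history(turns: list[str], max_tokens: int = 16384) -> list[str]:
    """Keep the most recent dialogue turns that fit within the
    context window (16K tokens for the long-context AquilaChat2-34B)."""
    kept: list[str] = []
    budget = max_tokens
    # Walk backwards from the newest turn, keeping turns while they fit.
    for turn in reversed(turns):
        cost = estimate_tokens(turn)
        if cost > budget:
            break
        kept.append(turn)
        budget -= cost
    # Restore chronological order for the final prompt.
    return list(reversed(kept))
```

For example, with a 25-token budget and three ~10-token turns, only the two most recent turns survive, preserving the tail of the conversation.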

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution

Specifications

Family: Aquila 2
Released: 2023-11-02
Parameters: 34B
Architecture: Decoder-only
Specialization: General
Training: Fine-tuning

Created by

Beijing Academy of Artificial Intelligence (BAAI)
Open-source AI fostering global collaboration
Beijing, China
Founded 2018