LLM Reference

Hunyuan Large

hunyuan-large

Open Source · China

About

Hunyuan-Large, also known as Hunyuan-MoE-A52B, is Tencent's open-source Transformer-based Mixture-of-Experts language model with 389 billion total parameters, of which 52 billion are active per token. The instruction-tuned checkpoints support a 128K context window, while the pretraining checkpoint supports 256K. Tencent's release includes pretrain, instruct, and FP8 instruct checkpoints, and the architecture combines Grouped-Query Attention (GQA) with Cross-Layer Attention (CLA) to compress the KV cache. Sources: https://huggingface.co/tencent/Tencent-Hunyuan-Large and https://github.com/Tencent-Hunyuan/Tencent-Hunyuan-Large
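To see why GQA and CLA matter at long context, here is a back-of-envelope KV cache sizing sketch. The layer and head counts below are hypothetical illustration values, not Hunyuan-Large's published configuration:

```python
# Back-of-envelope KV cache sizing. HYPOTHETICAL config: 64 layers,
# 64 query heads, head_dim 128, BF16 (2 bytes/element), 128K tokens.
def kv_cache_bytes(layers, kv_heads, head_dim, seq_len,
                   bytes_per_elem=2, cla_share=1):
    """Size of the K+V cache; CLA shares one cache across `cla_share` layers."""
    kv_layers = layers // cla_share
    return 2 * kv_layers * kv_heads * head_dim * seq_len * bytes_per_elem

seq = 128 * 1024
mha = kv_cache_bytes(layers=64, kv_heads=64, head_dim=128, seq_len=seq)
gqa = kv_cache_bytes(layers=64, kv_heads=8, head_dim=128, seq_len=seq)
gqa_cla = kv_cache_bytes(layers=64, kv_heads=8, head_dim=128, seq_len=seq,
                         cla_share=2)
print(f"MHA: {mha / 2**30:.0f} GiB, GQA: {gqa / 2**30:.0f} GiB, "
      f"GQA+CLA: {gqa_cla / 2**30:.0f} GiB")
# → MHA: 256 GiB, GQA: 32 GiB, GQA+CLA: 16 GiB
```

Under these assumed numbers, 8 KV heads instead of 64 gives an 8x reduction, and sharing each KV cache across two layers doubles that to 16x.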

Hunyuan Large has a 128K-token context window.

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution · Prompt Caching · Batch API · Audio · Fine-tuning
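For the function-calling capability, a typical setup is to serve the instruct checkpoint behind an OpenAI-compatible endpoint (e.g., vLLM) and send a tools-enabled chat request. The sketch below only builds the request payload; the tool name and endpoint are illustrative assumptions, not a Hunyuan-specific API:

```python
import json

# Hypothetical function-calling request for an OpenAI-compatible server
# (e.g., vLLM serving the instruct checkpoint). Field names follow the
# OpenAI chat-completions schema; "get_weather" is an illustrative tool.
payload = {
    "model": "hunyuan-a52b-instruct",
    "messages": [
        {"role": "user", "content": "What's the weather in Shenzhen?"}
    ],
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
}
print(json.dumps(payload, indent=2))
```

The server would respond with a `tool_calls` entry naming the function and its JSON arguments, which the client executes and feeds back as a `tool` message.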

API Versions

hunyuan-moe-a52b · hunyuan-a52b-instruct · hunyuan-a52b-instruct-fp8 · hunyuan-a52b-pretrain
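The FP8 instruct checkpoint mainly matters for deployment footprint. A rough weights-only estimate (excluding KV cache and activations) shows why:

```python
# Weights-only memory for 389B parameters at different precisions.
# Rough estimate; real deployments add KV cache, activations, and overhead.
params = 389e9
bf16_gb = params * 2 / 1e9  # 2 bytes per parameter -> 778 GB
fp8_gb = params * 1 / 1e9   # 1 byte per parameter  -> 389 GB
print(f"BF16: {bf16_gb:.0f} GB, FP8: {fp8_gb:.0f} GB")
# → BF16: 778 GB, FP8: 389 GB
```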

Specifications

Family: Hunyuan
Released: 2024-11-04
Parameters: 389B total (52B active)
Context: 128K (256K for the pretrain checkpoint)
Architecture: Mixture of Experts
License: Tencent Hunyuan Community License
Training: pretrained and instruction-tuned
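The 389B-total / 52B-active split can be sanity-checked with simple MoE accounting. Assuming top-1 routing over 16 routed experts (a hedged reading of the release; the exact layer composition may differ), the shared and per-expert parameter counts follow directly:

```python
# MoE parameter accounting: total = shared + n_experts * expert,
# active per token = shared + top_k * expert. Routing assumption
# (16 routed experts, top-1) is a hedged reading, not a spec quote.
n_experts, top_k = 16, 1
total_b, active_b = 389.0, 52.0

expert_b = (total_b - active_b) / (n_experts - top_k)  # ≈ 22.5B per expert
shared_b = active_b - top_k * expert_b                 # ≈ 29.5B shared
print(f"per-expert ≈ {expert_b:.1f}B, shared ≈ {shared_b:.1f}B")
```

Every token pays for the shared ~29.5B parameters plus one routed expert, which is how a 389B model runs with 52B-parameter per-token compute.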

Created by

AI innovations for societal improvement

Shenzhen, China
Founded 2016