LLM Reference

ChatGLM3 6B 32K

About

ChatGLM3-6B-32K, developed by THUDM, is an open-source large language model designed to understand and process long texts, with a context length of up to 32K tokens. Building on the architecture of its predecessors, it extends the ChatGLM series with an enhanced training strategy and a more diverse training dataset. The model handles tasks such as long-form conversation, content generation, question answering, and code generation, while remaining straightforward to deploy. Its open weights and permissive license allow both academic and commercial use, though users are advised to apply it responsibly to mitigate potential biases and ethical concerns.
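The open weights are typically loaded through Hugging Face `transformers` with the checkpoint's own chat API (`trust_remote_code=True`). The sketch below follows the usage pattern published for the `THUDM/chatglm3-6b-32k` checkpoint, plus a hypothetical `trim_history` helper (not part of the model's API) that drops the oldest turns so the dialogue stays within the 32K-token context:

```python
def trim_history(history, tokenizer, max_tokens=32768, reserve=1024):
    """Drop the oldest turns until the encoded history fits the context window.

    `history` is a list of {"role": ..., "content": ...} dicts as used by
    ChatGLM3's chat API; `reserve` leaves room for the new query and reply.
    This helper is an illustration, not part of the model's own API.
    """
    def total_tokens(turns):
        return sum(len(tokenizer.encode(t["content"])) for t in turns)

    trimmed = list(history)
    while trimmed and total_tokens(trimmed) > max_tokens - reserve:
        trimmed.pop(0)  # discard the oldest turn first
    return trimmed

def run_chat(query, history=None):
    """Answer `query` with ChatGLM3-6B-32K (downloads ~12 GB of weights on
    first use and assumes a CUDA GPU)."""
    from transformers import AutoModel, AutoTokenizer  # heavyweight; imported lazily

    tokenizer = AutoTokenizer.from_pretrained(
        "THUDM/chatglm3-6b-32k", trust_remote_code=True
    )
    model = (
        AutoModel.from_pretrained("THUDM/chatglm3-6b-32k", trust_remote_code=True)
        .half()
        .cuda()
        .eval()
    )
    history = trim_history(history or [], tokenizer)
    response, history = model.chat(tokenizer, query, history=history)
    return response, history
```

Calling `run_chat("Summarize this report: ...")` performs a single chat turn; the returned `history` can be passed back in for multi-turn dialogue.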

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution
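The Function Calling and Tool Use tags correspond to the agent-style abilities introduced with the ChatGLM3 series (tool invocation and a code interpreter). The exact prompt format is defined in the model's own demos; the sketch below only illustrates the general round trip, with a hypothetical `get_weather` tool and the assumption that the model replies with a JSON object naming the tool and its parameters:

```python
import json

# Hypothetical tool registry: maps a tool name to a Python callable.
TOOLS = {
    "get_weather": lambda city: {"city": city, "temp_c": 21},
}

# Schemas like this are shown to the model so it knows which tools exist.
TOOL_SCHEMAS = [
    {
        "name": "get_weather",
        "description": "Look up current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }
]

def dispatch_tool_call(reply: str):
    """If `reply` is a JSON tool call like {"name": ..., "parameters": ...},
    run the matching tool and return its result; otherwise return None
    (the reply was a plain-text answer)."""
    try:
        call = json.loads(reply)
    except json.JSONDecodeError:
        return None
    fn = TOOLS.get(call.get("name"))
    if fn is None:
        return None
    return fn(**call.get("parameters", {}))
```

In a real loop, the tool result would be fed back to the model as an observation so it can compose the final answer.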

Providers (1)

Provider | Input (per 1M) | Output (per 1M) | Type
Alibaba Cloud PAI-EAS | n/a | n/a | Serverless

Specifications

Family: ChatGLM3
Released: 2024-01-30
Parameters: 6B
Context: 32K
Architecture: Decoder-only
Specialization: General
Training: Fine-tuning

Created by

Leading China's LLM innovation surge

Beijing, China
Founded: 2018