LLM Reference

ChatGLM-6B

chatglm-6b

Open Source

About

ChatGLM-6B is the original open-source bilingual (Chinese-English) chat model from THUDM (Tsinghua University KEG Lab and Zhipu AI). Released March 13, 2023, it was pre-trained on approximately 1 trillion Chinese and English tokens, then aligned with supervised fine-tuning and RLHF. With a 2K-token context window and 6.2B parameters, it was one of the first widely accessible open-source LLMs with strong Chinese-language capabilities, and is the predecessor to ChatGLM2-6B and the ChatGLM3 family.
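ChatGLM-6B is distributed on Hugging Face as `THUDM/chatglm-6b` and is loaded with `trust_remote_code=True`; its `chat()` helper concatenates prior turns into a round-numbered bilingual prompt before generation. A minimal sketch (the `build_prompt` helper mirrors the round format used in the repository's modeling code, and the loading calls are illustrative only, since they download several GB of weights):

```python
# Sketch of how ChatGLM-6B's chat() assembles a prompt from history.
# The round format below mirrors THUDM/chatglm-6b's modeling code;
# treat it as illustrative rather than a stable API.

def build_prompt(history, query):
    """history: list of (user_query, model_response) pairs."""
    prompt = ""
    for i, (old_query, response) in enumerate(history):
        prompt += "[Round {}]\n问：{}\n答：{}\n".format(i, old_query, response)
    prompt += "[Round {}]\n问：{}\n答：".format(len(history), query)
    return prompt

if __name__ == "__main__":
    # Loading the model itself requires the transformers library,
    # ~13 GB of fp16 weights, and ideally a GPU; shown for context only.
    from transformers import AutoTokenizer, AutoModel
    tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
    model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()
    response, history = model.chat(tokenizer, "你好", history=[])
    print(response)
```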

ChatGLM-6B has a 2K-token context window.
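Because the window is only 2,048 tokens, multi-turn use typically drops the oldest rounds once the assembled prompt plus the expected reply would overflow it. A hedged sketch with a pluggable token counter (`trim_history` and `count_tokens` are illustrative names, not part of the model's API; a real deployment would count tokens with the model's own tokenizer):

```python
# Trim the oldest chat rounds so the assembled prompt fits a token budget.
# count_tokens is a stand-in; in practice use len(tokenizer.encode(text)).

def trim_history(history, query, count_tokens, window=2048, reserve=512):
    """Drop oldest (query, response) rounds until the prompt fits.

    reserve: tokens kept free for the model's reply.
    """
    budget = window - reserve - count_tokens(query)
    kept = list(history)
    while kept and sum(count_tokens(q) + count_tokens(a) for q, a in kept) > budget:
        kept.pop(0)  # discard the oldest round first
    return kept
```

For example, with a whitespace word count standing in for the tokenizer, a 2,000-token opening round is discarded while a short recent round survives.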

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution, Prompt Caching, Batch API, Audio, Fine-tuning

Benchmark Scores (1)

Benchmark: GAOKAO
Score: 30.8 (zero-shot, objective accuracy)
Source: https://github.com/OpenLMLab/GAOKAO-Bench


Specifications

Family: ChatGLM
Released: 2023-03-13
Parameters: 6.2B
Context: 2K tokens
Architecture: decoder-only
Specialization: general
License: Apache 2.0
Training: pretrained

Created by

Leading China's LLM innovation surge

Beijing, China
Founded 2018