LLM Reference
This model family is considered obsolete. Consider newer alternatives in Related Model Families below.
1 model · 2023 · Up to 2K context

About

ChatGLM is the original family of open-source bilingual (Chinese-English) chat models from THUDM (Tsinghua University KEG Lab and Zhipu AI). Pre-trained on approximately 1 trillion Chinese-English tokens and aligned with RLHF, ChatGLM-6B achieved strong bilingual performance at the 6B parameter scale and became a widely used baseline in Chinese-language LLM research. It was superseded by ChatGLM2 in June 2023.

Specifications (1 model)

ChatGLM model specifications comparison

Model      | Released | Context | Parameters
ChatGLM-6B | 2023-03  | 2K      | 6.2B
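The 2K context window in the table above is small by current standards, so clients typically had to trim conversation history before each turn. A minimal sketch of that bookkeeping, with a crude character-based token estimate (the helper names and the 4-characters-per-token heuristic are illustrative assumptions, not part of ChatGLM's API; a real client would count tokens with the model's own tokenizer):

```python
# Hypothetical helper: keep only the most recent conversation turns that
# fit within ChatGLM-6B's 2,048-token context window.
MAX_CTX = 2048

def estimate_tokens(text: str) -> int:
    # Crude heuristic (assumption): roughly 1 token per 4 characters
    # for mixed Chinese/English text.
    return max(1, len(text) // 4)

def trim_history(turns: list[str], budget: int = MAX_CTX) -> list[str]:
    """Drop the oldest turns until the estimated total fits the budget."""
    kept, used = [], 0
    for turn in reversed(turns):          # walk newest turn first
        cost = estimate_tokens(turn)
        if used + cost > budget:
            break                         # oldest turns no longer fit
        kept.append(turn)
        used += cost
    return list(reversed(kept))           # restore chronological order
```

Keeping the newest turns (rather than the oldest) matches how chat clients generally prioritize recent context when the window overflows.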

Frequently Asked Questions

What is ChatGLM?
ChatGLM is the original family of open-source bilingual (Chinese-English) chat models from THUDM (Tsinghua University KEG Lab and Zhipu AI). Its sole release, ChatGLM-6B, was pre-trained on approximately 1 trillion Chinese-English tokens with RLHF alignment, achieved strong bilingual performance at the 6B parameter scale, and became a widely used baseline in Chinese-language LLM research before being superseded by ChatGLM2 in June 2023.
How many models are in the ChatGLM family?
The ChatGLM family contains 1 model.
What is the latest ChatGLM model?
The latest (and only) model is ChatGLM-6B, released in March 2023.

Models (1)