ChatGLM2-6B
chatglm2-6b
Open Source
About
ChatGLM2-6B is the second-generation 6B-parameter open-source bilingual (Chinese-English) chat model from THUDM (Tsinghua University / Zhipu AI). Released June 25, 2023, it improved significantly on the original ChatGLM-6B: context extended to 32K tokens via FlashAttention, a larger 1.4T-token pretraining corpus (up from ~1T), and roughly 23% higher MMLU scores. It is widely used as a Chinese-language LLM baseline. Succeeded by ChatGLM3-6B in October 2023.
ChatGLM2-6B has a 32K-token context window.
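The weights are published on the Hugging Face Hub under the THUDM organization and load through `transformers` with `trust_remote_code=True`; the model's remote code exposes a `chat` method for single-turn and multi-turn dialogue. A minimal sketch (the `chat_once` wrapper and lazy imports are illustrative, not part of the library; running it requires a CUDA GPU and downloading ~13 GB of weights):

```python
# Hugging Face model IDs for the base model and the 32K-context variant.
MODEL_ID = "THUDM/chatglm2-6b"
MODEL_ID_32K = "THUDM/chatglm2-6b-32k"

def chat_once(prompt: str, history=None, model_id: str = MODEL_ID):
    """Run one chat turn against ChatGLM2-6B (illustrative wrapper).

    Imports are done lazily so this sketch stays importable without
    transformers installed; actual inference needs a CUDA GPU.
    """
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModel.from_pretrained(model_id, trust_remote_code=True).half().cuda().eval()
    # model.chat returns the response text and the updated dialogue history.
    response, new_history = model.chat(tokenizer, prompt, history=history or [])
    return response, new_history
```

For long-context workloads, pass `MODEL_ID_32K` to load the `chatglm2-6b-32k` variant listed under API Versions below.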
Capabilities
Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution, Prompt Caching, Batch API, Audio, Fine-tuning
Benchmark Scores (1)
| Benchmark | Score | Setting | Source |
|---|---|---|---|
| GAOKAO | 42.7 | zero-shot, objective-accuracy | https://github.com/OpenLMLab/GAOKAO-Bench |
API Versions
chatglm2-6b-32k