ChatGLM-6B
chatglm-6b
Open Source
About
ChatGLM-6B is the original open-source bilingual (Chinese-English) chat model from THUDM (Tsinghua University KEG Lab and Zhipu AI). Released March 13, 2023, it was pre-trained on approximately 1 trillion Chinese-English tokens and then aligned with supervised fine-tuning, feedback bootstrapping, and reinforcement learning from human feedback (RLHF). With a 2K context window and 6.2B parameters, it was one of the first widely accessible open-source LLMs with strong Chinese-language capabilities. It is the predecessor to ChatGLM2-6B and the ChatGLM3 family.
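The model is distributed as `THUDM/chatglm-6b` on Hugging Face and is loaded through `transformers` with `trust_remote_code=True`, since its modeling code ships with the checkpoint. A minimal sketch of the quickstart pattern (the weights are several GB and need a GPU for half precision, so the loading is wrapped in a function rather than run at import time):

```python
def load_chatglm6b(device="cuda"):
    """Sketch of loading THUDM/chatglm-6b via transformers.

    Requires the `transformers` package and the model weights; run only
    where both are available. `device` is an assumption for illustration.
    """
    from transformers import AutoModel, AutoTokenizer

    # trust_remote_code=True is required: the ChatGLM modeling code
    # lives in the checkpoint repository, not in transformers itself.
    tokenizer = AutoTokenizer.from_pretrained(
        "THUDM/chatglm-6b", trust_remote_code=True
    )
    model = AutoModel.from_pretrained(
        "THUDM/chatglm-6b", trust_remote_code=True
    )
    model = model.half().to(device).eval()
    return tokenizer, model


# Usage (not executed here; downloads the weights):
# tokenizer, model = load_chatglm6b()
# response, history = model.chat(tokenizer, "Hello", history=[])
```

The `model.chat(tokenizer, query, history=...)` call is the conversational interface exposed by the checkpoint's remote code; it returns the reply plus the updated history.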
Context window: 2K tokens.
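With only a 2K-token window, multi-turn conversations must be trimmed so the accumulated history plus the new prompt still fit. A minimal sketch of dropping the oldest turns first (the whitespace token counter is a hypothetical stand-in; a real deployment would count with the model's own tokenizer):

```python
def truncate_history(history, max_tokens=2048, count=lambda s: len(s.split())):
    """Keep the most recent (prompt, response) turns that fit the budget.

    `count` is a stand-in tokenizer for illustration only; swap in the
    model tokenizer's length function in practice.
    """
    kept, total = [], 0
    # Walk from newest to oldest so recent context survives.
    for prompt, response in reversed(history):
        cost = count(prompt) + count(response)
        if total + cost > max_tokens:
            break
        kept.append((prompt, response))
        total += cost
    return list(reversed(kept))


# Example: with a 5-token budget, only the latest turn fits.
history = [("hello there", "hi friend ok"), ("one two", "three four")]
print(truncate_history(history, max_tokens=5))  # [('one two', 'three four')]
```

Dropping whole turns (rather than cutting mid-message) keeps each surviving prompt/response pair intact, which matters for chat-formatted models.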
Capabilities
Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution · Prompt Caching · Batch API · Audio · Fine-tuning
Benchmark Scores (1)
| Benchmark | Score | Setting | Source |
|---|---|---|---|
| GAOKAO | 30.8 | zero-shot, objective-accuracy | https://github.com/OpenLMLab/GAOKAO-Bench |