LLM Reference

ChatGLM2-6B

chatglm2-6b

Open Source

About

ChatGLM2-6B is the second-generation 6B open-source bilingual (Chinese-English) chat model from THUDM (Tsinghua University / Zhipu AI). Released June 25, 2023, it improved significantly over the original ChatGLM-6B: extended context to 32K tokens via FlashAttention, trained on 1.4T tokens (up from ~1T), and achieved approximately 23% better MMLU scores. Widely used as a Chinese-language LLM baseline. Succeeded by ChatGLM3-6B in October 2023.

ChatGLM2-6B was pretrained with a 32K-token context window; the dialogue-aligned base model is tuned at 8K, and the chatglm2-6b-32k variant targets the full 32K.
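A minimal inference sketch following the usage pattern from the official THUDM/chatglm2-6b model card (the `model.chat` API is specific to that repo's `trust_remote_code` implementation). The import is deferred inside the function so the sketch can be defined without `transformers` installed; actually calling it requires a GPU with roughly 13 GB of memory.

```python
def chat_once(prompt: str, history=None):
    """One chat turn with ChatGLM2-6B, per the official model-card usage.

    Requires `transformers` and a CUDA GPU; the model weights are
    downloaded from the Hugging Face Hub on first use.
    """
    # Deferred imports: only needed when actually running inference.
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(
        "THUDM/chatglm2-6b", trust_remote_code=True
    )
    model = AutoModel.from_pretrained(
        "THUDM/chatglm2-6b", trust_remote_code=True
    ).half().cuda().eval()

    # `model.chat` returns (response_text, updated_history).
    response, history = model.chat(tokenizer, prompt, history=history or [])
    return response, history
```

For multi-turn use, pass the returned `history` back into the next call so the model sees the prior exchanges.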

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution, Prompt Caching, Batch API, Audio, Fine-tuning

Benchmark Scores (1)

Benchmark  Score  Version                        Source
GAOKAO     42.7   zero-shot, objective-accuracy  https://github.com/OpenLMLab/GAOKAO-Bench

API Versions

chatglm2-6b-32k


Specifications

Family:         ChatGLM2
Released:       2023-06-25
Parameters:     6.2B
Context:        32K
Architecture:   Decoder Only
Specialization: general
Training:       pretrained
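The 6.2B parameter count above translates directly into a back-of-envelope memory estimate. The sketch below computes the weights-only footprint at common precisions; real inference needs more (activations, KV cache), so published requirements for this model are somewhat higher than the weights-only figures.

```python
# Weights-only memory sketch for a 6.2B-parameter model.
# Assumption flagged: this ignores activation and KV-cache overhead,
# so actual GPU requirements are higher than these numbers.
PARAMS = 6.2e9  # parameter count from the Specifications table


def weights_gb(bytes_per_param: float) -> float:
    """Weights-only footprint in GB (1 GB = 1e9 bytes)."""
    return PARAMS * bytes_per_param / 1e9


fp16 = weights_gb(2)    # 16-bit floats: 2 bytes per parameter
int4 = weights_gb(0.5)  # 4-bit quantization: 0.5 bytes per parameter
print(f"FP16 weights ~{fp16:.1f} GB, INT4 weights ~{int4:.1f} GB")
```

This is why 4-bit quantization is the usual route for running 6B-class models on consumer GPUs: it cuts the weight storage to a quarter of the FP16 size.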

Created by

Leading China's LLM innovation surge

Beijing, China
Founded 2018