ChatGLM3 6B 32K
About
ChatGLM3-6B-32K, developed by THUDM, is an open-source large language model designed for understanding and processing long texts, with a context window of 32K tokens. Building on the architecture of its predecessors, it extends the ChatGLM series with an improved training strategy and a more diverse dataset. The model targets tasks such as long-form conversation, content generation, question answering, and code generation, while remaining straightforward to deploy. Its openly released weights and a license permitting both academic and commercial use encourage broad adoption, though users are advised to apply it responsibly to mitigate potential biases and ethical concerns.
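As a sketch of what local deployment looks like, the snippet below follows the inference pattern published on the THUDM model card for the ChatGLM3 series (Hugging Face `transformers` with `trust_remote_code=True`, and the `model.chat()` helper exposed by the model's remote code). It assumes a CUDA GPU with enough memory for the fp16 weights; the prompt text is illustrative only.

```python
def load_chatglm3(model_id: str = "THUDM/chatglm3-6b-32k"):
    """Load tokenizer and fp16 model; weights are downloaded on first use."""
    # Heavy import kept inside the function so the module imports cheaply.
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModel.from_pretrained(model_id, trust_remote_code=True).half().cuda()
    return tokenizer, model.eval()


if __name__ == "__main__":
    tokenizer, model = load_chatglm3()
    # model.chat() is the conversational entry point provided by ChatGLM3's
    # remote code; `history` carries prior turns for multi-turn dialogue.
    response, history = model.chat(
        tokenizer, "Summarize the key points of a long report.", history=[]
    )
    print(response)
```

The 32K context window is what distinguishes this variant: prompts of tens of thousands of tokens (long documents, extended chat histories) can be passed in a single `model.chat()` call.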
Capabilities
Providers (1)
| Provider | Input (per 1M) | Output (per 1M) | Type |
|---|---|---|---|
| Alibaba Cloud PAI-EAS | — | — | Serverless |