LLM Reference

InternLM 20B on Bitdeer AI

InternLM · Intern-AI

Serverless

Pricing

Type            Price (per 1M tokens)
Input tokens    $0.14
Output tokens   $0.42
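Per-request cost at these rates is straightforward arithmetic; a minimal sketch with the two prices hard-coded from the table above:

```python
# Per-million-token rates from the pricing table above.
INPUT_PRICE_PER_M = 0.14   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 0.42  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request at the listed rates."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# e.g. a 12k-token prompt with a 1k-token completion:
print(f"${request_cost(12_000, 1_000):.6f}")  # → $0.002100
```

A prompt that fills most of the 16k context thus still costs a fraction of a cent per call.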

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution

About InternLM 20B

InternLM-20B is a 20-billion-parameter language model developed by the Shanghai Artificial Intelligence Laboratory together with SenseTime, the Chinese University of Hong Kong, and Fudan University. Its 60-layer architecture is notably deeper than the 32- or 40-layer designs typical of models at this scale. It was trained on over 2.3 trillion tokens of curated English, Chinese, and code data, with additional datasets aimed at strengthening reasoning and comprehension. The model performs well at language understanding, reasoning, and programming, and outperforms some larger models on certain benchmarks. A 16,000-token context window lets it handle long inputs and multi-step reasoning tasks. Like any language model, it has limitations, including potential biases and probabilistic (non-deterministic) outputs. A 4-bit quantized variant is also available, trading some accuracy for greater efficiency.

Get Started
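A minimal sketch of calling a hosted serverless model, assuming an OpenAI-style chat-completions payload; the endpoint URL and model ID below are placeholders, not confirmed Bitdeer AI values — the real ones come from the provider dashboard:

```python
import json

# Placeholder values — the actual endpoint, model ID, and auth scheme
# come from the Bitdeer AI dashboard; none of these are confirmed here.
API_URL = "https://example.invalid/v1/chat/completions"  # hypothetical
MODEL_ID = "internlm-20b"                                # hypothetical

def build_chat_request(prompt: str, max_tokens: int = 512) -> dict:
    """Assemble an OpenAI-style chat-completions request body."""
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

payload = build_chat_request("Summarize InternLM-20B in one sentence.")
print(json.dumps(payload, indent=2))
# Sending it would be an HTTP POST to API_URL with an
# "Authorization: Bearer <your API key>" header.
```

Keeping the payload construction separate from the transport makes it easy to swap in whatever HTTP client and authentication the provider actually requires.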

Model Specs

Released        2023-07-06
Parameters      20B
Architecture    Decoder-only


Provider

Bitdeer AI

Bitdeer Technologies Group
