LLM Reference

InternLM2 20B

About

InternLM2 20B is a large language model developed by the Shanghai Artificial Intelligence Laboratory, with 20 billion parameters arranged in a 60-layer transformer. It performs well at natural language understanding, mathematical reasoning, and code generation, making it suitable for tasks such as conversational AI and technical problem-solving. Trained on over 2.3 trillion tokens spanning multiple languages, including English and Chinese, it supports ultra-long contexts of up to 200,000 characters and is released as open source, enabling wide-ranging application. Despite these advances, challenges such as bias and common-sense reasoning persist.
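A quick back-of-the-envelope check shows how a 60-layer decoder-only transformer can reach roughly 20 billion parameters. The layer count comes from the text above; the hidden size, feed-forward size, and vocabulary size used here are illustrative assumptions, not confirmed InternLM2 hyperparameters.

```python
def estimate_params(n_layers, d_model, d_ffn, vocab, tied_embeddings=False):
    """Approximate parameter count for a decoder-only transformer with
    gated (SwiGLU-style) feed-forward blocks; biases and normalization
    weights are ignored as negligible."""
    attn = 4 * d_model * d_model   # Q, K, V and output projections
    mlp = 3 * d_model * d_ffn      # gate, up, and down projections
    per_layer = attn + mlp
    # Separate input and output embedding matrices unless weights are tied.
    embed = vocab * d_model * (1 if tied_embeddings else 2)
    return n_layers * per_layer + embed

# Assumed widths chosen only to illustrate the arithmetic at 60 layers.
total = estimate_params(n_layers=60, d_model=5120, d_ffn=13824, vocab=103_000)
print(f"~{total / 1e9:.1f}B parameters")  # → ~20.1B parameters
```

With these assumed widths the attention and feed-forward blocks contribute about 317M parameters per layer, so depth dominates the total and the embeddings add only around one billion more.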

Capabilities

Multimodal, Function Calling, Tool Use, JSON Mode

Specifications

Family: InternLM2
Released: 2024-01-12
Parameters: 20B
Architecture: Decoder-only
Specialization: General