InternLM
About

The InternLM family is a collection of open-source large language models developed by the Shanghai AI Laboratory, available in parameter sizes of 1.8B, 7B, and 20B. These models are designed for practical use and are trained on trillions of high-quality tokens, giving them a broad knowledge base. They support long context windows, up to 1 million tokens in some versions, and offer enhanced reasoning capabilities, particularly in mathematics, along with improved tool usage. The range includes specialized models such as InternLM-XComposer, which excels at text-image comprehension and composition. The models are accessible on platforms like Hugging Face and ModelScope, with the code released under the Apache-2.0 license.

Details

Researcher: Intern-AI
Models: 2