InternLM2 Math Plus 1.8B
About
InternLM2-Math-Plus 1.8B is a language model tailored for mathematical reasoning, offering a strong balance of performance and efficiency for its size. It handles both informal and formal mathematical reasoning tasks, as reflected in its scores on benchmarks such as MATH, MATH-Python, GSM8K, and MiniF2F-test. The model is built on the InternLM2 architecture, has 1.8 billion parameters, and is trained on a large corpus of high-quality mathematical data. It is notable for its efficiency and bilingual support (English and Chinese), and it uses chain-of-thought reasoning to improve its problem solving. However, publicly documented details about its architecture and broader performance remain limited.
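The chain-of-thought behavior mentioned above is typically elicited through prompting. The sketch below builds a generic chat-style message list that asks for step-by-step reasoning; the system instruction and message layout are illustrative assumptions, not the model's actual chat template.

```python
# Sketch of a chain-of-thought prompt for an instruction-tuned math model
# such as InternLM2-Math-Plus 1.8B. The message layout below is a generic
# chat format (an assumption), not the model's exact template.

def build_cot_prompt(problem: str) -> list[dict]:
    """Build a chat-style message list requesting step-by-step reasoning."""
    return [
        {
            "role": "system",
            "content": (
                "You are a math assistant. Reason step by step, "
                "then state the final answer on its own line."
            ),
        },
        {"role": "user", "content": problem},
    ]

messages = build_cot_prompt("If 3x + 5 = 20, what is x?")
print(messages[1]["content"])
```

A message list like this would then be passed to whatever chat interface the serving framework exposes for the model.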
Capabilities
- Multimodal
- Function Calling
- Tool Use
- JSON Mode
Specifications
Family: InternLM2-Math
Released: 2024-05-24
Parameters: 1.8B
Architecture: Decoder Only
Specialization: general