LLM Reference

InternLM2 Math Plus 7B

About

InternLM2 Math Plus 7B is a large language model tailored for mathematical reasoning and problem solving. It handles both formal and informal mathematics, scoring well on benchmarks such as MATH, MATH-Python, and GSM8K. Built on the InternLM2 architecture, it uses a transformer-based design that processes long contexts efficiently. The model supports chain-of-thought reasoning and integrates with Lean for formal proof verification. Its training data includes 100 billion math-related tokens, which equips it for a wide array of mathematical tasks. Like other models of its size, however, it can produce biased outputs and may struggle with complex contextual reasoning.
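As a sketch of how a chain-of-thought math query might be assembled for this model: the snippet below builds a ChatML-style prompt string. The tag format is an assumption based on InternLM2's published chat template; in practice you would load the Hugging Face checkpoint (e.g. internlm/internlm2-math-plus-7b) with transformers and let its tokenizer apply the template for you.

```python
# Minimal sketch: render chat messages into an InternLM2-style prompt.
# The <|im_start|>/<|im_end|> tags are assumed from InternLM2's ChatML-like
# template; verify against the model's tokenizer config before relying on them.

def build_prompt(messages):
    """Render a list of {"role", "content"} dicts into one prompt string."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    parts.append("<|im_start|>assistant\n")  # cue the model to respond
    return "".join(parts)

prompt = build_prompt([
    {"role": "system", "content": "Reason step by step, then state the final answer."},
    {"role": "user", "content": "What is the sum of the first 100 positive integers?"},
])
print(prompt)
```

The resulting string would be tokenized and passed to the model's generate call; the closing assistant tag leaves the model positioned to produce its step-by-step answer.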

Capabilities

Multimodal
Function Calling
Tool Use
JSON Mode
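Function calling and JSON mode both come down to the model emitting structured text that the caller parses and acts on. The sketch below shows one way the calling side could dispatch such a reply; the tool name, registry, and reply format are illustrative assumptions, not an API defined by the model.

```python
import json

# Hypothetical tool registry. In a real setup, these schemas would be
# described in the prompt and the model asked to reply with a JSON object
# naming a tool and its arguments.
TOOLS = {
    # Toy arithmetic evaluator; eval is restricted here for the sketch only.
    "evaluate": lambda expr: eval(expr, {"__builtins__": {}}),
}

def dispatch(model_reply: str):
    """Parse a JSON tool call emitted by the model and run the named tool."""
    call = json.loads(model_reply)
    return TOOLS[call["tool"]](**call["arguments"])

# Example of a reply the model might produce in JSON mode:
result = dispatch('{"tool": "evaluate", "arguments": {"expr": "2 + 3 * 4"}}')
print(result)  # 14
```

The tool result would typically be fed back to the model in a follow-up turn so it can incorporate the value into its final answer.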

Specifications

Released: 2024-05-24
Parameters: 7B
Architecture: Decoder Only
Specialization: general