InternLM2 Math Plus 20B
About
InternLM2-Math-Plus 20B is a large language model specialized for mathematical reasoning, developed by the InternLM research group. With 20 billion parameters, it is designed to act as a solver, prover, verifier, and augmentor for mathematical problems. The model builds on the InternLM2 base architecture, with improvements in chain-of-thought reasoning, code interpretation, and formal mathematical reasoning in LEAN 4. Its training comprised pre-training on approximately 100 billion high-quality math-related tokens, followed by supervised fine-tuning (SFT) on around 2 million bilingual math samples. It achieves state-of-the-art results on benchmarks such as MiniF2F-test and MATH, and is open-sourced and accessible on Hugging Face. Quantized versions cater to different hardware requirements, though quantization can occasionally degrade problem-solving accuracy.
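Since the model is published on Hugging Face, it can be queried through the `transformers` library. The sketch below builds a ChatML-style prompt and shows the loading/generation calls; the repository id `internlm/internlm2-math-plus-20b` and the `<|im_start|>`/`<|im_end|>` chat template are assumptions based on InternLM2 release conventions, not stated in this document.

```python
# Sketch: querying InternLM2-Math-Plus 20B via Hugging Face transformers.
# ASSUMPTIONS: the repo id "internlm/internlm2-math-plus-20b" and the
# ChatML-style template below follow InternLM2 conventions and are not
# confirmed by this document.

def build_chatml_prompt(question: str) -> str:
    """Wrap a math question in the ChatML-style template used by
    InternLM2 chat models (user turn, then an open assistant turn)."""
    return (
        "<|im_start|>user\n"
        f"{question}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt("Solve for x: 2x + 3 = 11.")
print(prompt)

# Loading and generation (commented out: downloads ~40 GB of weights):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained(
#     "internlm/internlm2-math-plus-20b", trust_remote_code=True)
# model = AutoModelForCausalLM.from_pretrained(
#     "internlm/internlm2-math-plus-20b",
#     torch_dtype="auto", device_map="auto", trust_remote_code=True)
# inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# output = model.generate(**inputs, max_new_tokens=256)
# print(tokenizer.decode(output[0], skip_special_tokens=True))
```

For the quantized variants mentioned above, the same pattern applies with the corresponding quantized repository and a suitable backend (e.g. a GGUF runtime or bitsandbytes), chosen to match the available hardware.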