InternLM2 Math Plus Mixtral 8x22B
About
InternLM2 Math Plus Mixtral 8x22B is a large language model specialized in advanced mathematical reasoning, with bilingual support for English and Chinese. The model is open source and available on platforms like Hugging Face in multiple sizes (1.8B, 7B, 20B, and 8x22B parameters) to suit different computational budgets. It handles both formal and informal reasoning, applies chain-of-thought processes, and can invoke a code interpreter for enhanced problem solving. It performs strongly on benchmarks, often surpassing comparable models, though it requires significant computational resources and can still make errors on complex problems.
Capabilities
Multimodal, Function Calling, Tool Use, JSON Mode
Specifications
Family: InternLM2-Math
Released: 2024-05-24
Parameters: 8x22B
Architecture: Mixture of Experts
Specialization: general