LLM Reference

Mathstral

Mistral AI · Mathematics
1 model · 2024 · Up to 32K ctx

About

Mathstral is a family of large language models (LLMs) created by Mistral AI, focused on mathematical reasoning and scientific exploration. Built on the Mistral 7B architecture, it inherits that model's efficient design and is fine-tuned specifically for STEM (Science, Technology, Engineering, and Mathematics) fields. The model handles complex, multi-step logical reasoning tasks and is released under the Apache 2.0 license, which facilitates open-source collaboration. A notable characteristic is improved performance with increased inference-time computation, offering a scalable accuracy/compute trade-off. Mathstral provides leading reasoning abilities for its size category, as evidenced by industry-standard benchmarks. The Mathstral 7B v0.1 model supports a 32k-token context window, allowing it to tackle longer and more intricate mathematical problems, and its weights are available via Hugging Face for research and development.
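Because Mathstral builds on the Mistral 7B base and ships its weights on Hugging Face, prompts are typically wrapped in the Mistral-style `[INST] … [/INST]` instruct template. The sketch below shows that formatting as a plain string function; the exact template (and the repository name, assumed here to be `mistralai/Mathstral-7B-v0.1`) should be confirmed against the model card and tokenizer config on Hugging Face.

```python
# Minimal sketch: Mistral-style instruct prompt formatting, which
# Mathstral is assumed to inherit from its Mistral 7B base. Verify the
# exact template against the tokenizer's chat template on Hugging Face.

def format_instruct_prompt(user_message: str) -> str:
    """Wrap a user message in [INST] ... [/INST] instruct tags,
    preceded by the beginning-of-sequence token."""
    return f"<s>[INST] {user_message.strip()} [/INST]"

prompt = format_instruct_prompt(
    "Solve for x: 2x + 6 = 14. Show each step."
)
print(prompt)
```

With the weights downloaded, the resulting string would be tokenized and passed to the model for generation; libraries such as Hugging Face Transformers can also apply the model's own chat template automatically, which is the safer option in practice.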

Specifications (1 model)

Mathstral model specifications comparison

Model        | Released | Context | Parameters
Mathstral 7B | 2024-07  | 32K     | 7B

Frequently Asked Questions

What is Mathstral?
Mathstral is a family of large language models (LLMs) created by Mistral AI, focused on mathematical reasoning and scientific exploration. Built on the Mistral 7B architecture, it inherits that model's efficient design and is fine-tuned specifically for STEM (Science, Technology, Engineering, and Mathematics) fields. The model handles complex, multi-step logical reasoning tasks and is released under the Apache 2.0 license, which facilitates open-source collaboration. A notable characteristic is improved performance with increased inference-time computation, offering a scalable accuracy/compute trade-off. Mathstral provides leading reasoning abilities for its size category, as evidenced by industry-standard benchmarks. The Mathstral 7B v0.1 model supports a 32k-token context window, allowing it to tackle longer and more intricate mathematical problems, and its weights are available via Hugging Face for research and development.
How many models are in the Mathstral family?
The Mathstral family contains 1 model.
What is the latest Mathstral model?
The latest model is Mathstral 7B, released in July 2024.

Models (1)