LLM Reference

MiniMax M2.5 on Together AI

MiniMax

Serverless

Pricing

Type            Price (per 1M tokens)
Input tokens    $0.30
Output tokens   $1.20
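At the serverless rates above, the cost of a request is a simple linear function of token counts. A minimal sketch (the helper name and example token counts are illustrative, not part of the Together AI API):

```python
# Cost estimate at the listed serverless rates:
# $0.30 per 1M input tokens, $1.20 per 1M output tokens.
def cost_usd(input_tokens: int, output_tokens: int) -> float:
    return input_tokens / 1e6 * 0.30 + output_tokens / 1e6 * 1.20

# Example: a request with 50K input tokens and 10K output tokens.
print(round(cost_usd(50_000, 10_000), 4))  # -> 0.027
```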

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, JSON Mode, Code Execution

About MiniMax M2.5

MiniMax M2.5 is also available as a free variant via OpenRouter; Together AI's serverless pricing is listed above.

Get Started
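Together AI exposes an OpenAI-compatible chat completions endpoint, so a request can be built with only the standard library. A minimal sketch, assuming the model ID string `MiniMaxAI/MiniMax-M2.5` (verify the exact name in the Together AI model catalog) and a `TOGETHER_API_KEY` environment variable:

```python
# Sketch of calling MiniMax M2.5 through Together AI's OpenAI-compatible
# chat completions endpoint. The model ID is an assumption -- check the
# Together AI catalog for the exact string before sending requests.
import json
import os
import urllib.request

API_URL = "https://api.together.xyz/v1/chat/completions"
MODEL_ID = "MiniMaxAI/MiniMax-M2.5"  # assumed ID; verify in the catalog

def build_request(prompt: str, max_tokens: int = 256) -> urllib.request.Request:
    """Build the HTTP request; sending it requires a valid TOGETHER_API_KEY."""
    payload = {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }
    headers = {
        "Authorization": f"Bearer {os.environ.get('TOGETHER_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    return urllib.request.Request(
        API_URL, data=json.dumps(payload).encode(), headers=headers
    )

req = build_request("Summarize the transformer architecture in one sentence.")
print(req.full_url)
```

To actually send the request, pass it to `urllib.request.urlopen(req)` with a real API key set; the sketch only constructs the payload so it can be inspected offline.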

Model Specs

Context: 197K tokens
Architecture: Decoder-only