LLM Reference

Trinity Mini on OpenRouter

Trinity · Arcee AI

Serverless · Open Source

Get Started with Trinity Mini on OpenRouter

OpenRouter offers access to Trinity Mini with a 128K context window. OpenRouter is a multi-provider LLM aggregator that exposes a unified API to 300+ models from major labs and emerging providers, with automatic failover for reliability.
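OpenRouter exposes an OpenAI-compatible chat completions endpoint, so a request to Trinity Mini is a standard JSON POST. A minimal sketch, assuming the model slug `arcee-ai/trinity-mini` (check the actual slug on the OpenRouter model page) and an API key in the `OPENROUTER_API_KEY` environment variable:

```python
import json
import os

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"
MODEL_ID = "arcee-ai/trinity-mini"  # assumed slug; verify on the model page

def build_request(prompt: str, api_key: str) -> tuple[dict, dict]:
    """Return (headers, payload) for an OpenAI-style chat completion call."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
    }
    return headers, payload

headers, payload = build_request(
    "Hello, Trinity!", os.environ.get("OPENROUTER_API_KEY", "sk-demo")
)
print(json.dumps(payload, indent=2))
```

Send the payload with any HTTP client (e.g. `requests.post(OPENROUTER_URL, headers=headers, json=payload)`); the response follows the OpenAI chat completion shape.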

Pricing

Type            Price (per 1M tokens)
Input tokens    $0.04
Output tokens   $0.15
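At these rates, the cost of a request is a straightforward per-token calculation. A small sketch using the prices from the table above:

```python
# Prices from the table above, in USD per 1M tokens.
INPUT_PRICE_PER_M = 0.04
OUTPUT_PRICE_PER_M = 0.15

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of one request at the listed rates."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M + \
           (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# e.g. a 100K-token prompt with a 2K-token reply:
print(f"${estimate_cost(100_000, 2_000):.4f}")  # prints $0.0043
```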

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution
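Function calling and tool use work through the OpenAI-style `tools` field of the chat completions payload. A minimal sketch, assuming a hypothetical `get_weather` tool and the `arcee-ai/trinity-mini` slug:

```python
import json

# Hypothetical weather-lookup tool in the OpenAI-style schema accepted
# by OpenRouter for function-calling-capable models.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

payload = {
    "model": "arcee-ai/trinity-mini",  # assumed slug
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": tools,
}
print(json.dumps(payload, indent=2))
```

If the model decides to call the tool, the response contains a `tool_calls` entry with the function name and JSON-encoded arguments, which your code executes before sending the result back in a follow-up message.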

About Trinity Mini

A 26B-parameter sparse Mixture-of-Experts model with 3B active parameters per token and a 128K context window, trained on 10T tokens. Fully post-trained for reasoning and instruction following, and suitable for cloud or on-premises deployment. Available via the Arcee AI API and OpenRouter.

Model Specs

Released        2025-12-01
Parameters      26B
Context         128K
Architecture    Sparse Mixture of Experts (MoE)