LLM Reference

Trinity-Large-Thinking on OpenRouter

Trinity · Arcee AI

ServerlessOpen Source

Get Started with Trinity-Large-Thinking on OpenRouter

OpenRouter offers access to Trinity-Large-Thinking with a 256K context window. OpenRouter is a multi-provider LLM aggregator that provides unified API access to 300+ models from major labs and emerging providers, with automatic failover for reliability.
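Since OpenRouter exposes an OpenAI-compatible chat completions endpoint, a request to this model can be sketched as below. The model slug `arcee-ai/trinity-large-thinking` is an assumption for illustration; check the OpenRouter model page for the exact identifier.

```python
import json

# OpenRouter's OpenAI-compatible chat completions endpoint.
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_chat_request(api_key: str, prompt: str) -> tuple[str, dict, dict]:
    """Return (url, headers, body) for a chat completion request."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        # Assumed model slug; verify against the OpenRouter model page.
        "model": "arcee-ai/trinity-large-thinking",
        "messages": [{"role": "user", "content": prompt}],
    }
    return OPENROUTER_URL, headers, body

url, headers, body = build_chat_request("sk-or-...", "Explain sparse MoE routing.")
# To send: requests.post(url, headers=headers, data=json.dumps(body))
```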

Pricing

Type            Price (per 1M tokens)
Input tokens    $0.22
Output tokens   $0.85
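As a quick sanity check on the rates above, per-request cost can be estimated like this (the token counts in the example are illustrative):

```python
# Listed per-million-token rates for Trinity-Large-Thinking on OpenRouter.
INPUT_PRICE_PER_M = 0.22   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 0.85  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated cost in USD for one request."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# e.g. a 100K-token prompt with a 2K-token reply:
cost = estimate_cost(100_000, 2_000)  # ≈ 0.022 + 0.0017 = 0.0237 USD
```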

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution
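The function-calling capability follows the OpenAI-compatible tools schema that OpenRouter forwards to providers. A minimal request body can be sketched as follows; the `get_weather` tool, its parameters, and the model slug are hypothetical placeholders:

```python
def build_tool_request(prompt: str) -> dict:
    """Build an OpenAI-compatible chat body with one illustrative tool."""
    return {
        "model": "arcee-ai/trinity-large-thinking",  # assumed slug
        "messages": [{"role": "user", "content": prompt}],
        "tools": [{
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool for illustration
                "description": "Look up current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }],
        "tool_choice": "auto",  # let the model decide whether to call the tool
    }

body = build_tool_request("What's the weather in Paris?")
```

If the model elects to call the tool, the response's `tool_calls` entries carry the function name and JSON arguments for your code to execute and feed back as a `tool` role message.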

About Trinity-Large-Thinking

Arcee AI's flagship 400B sparse MoE reasoning model with 13B active parameters per token. Trained on 20T tokens with a STEM-focused curriculum. Designed for agentic workflows, chain-of-thought reasoning, and long-context tasks up to 256K tokens (BF16 API). Open-source under Apache 2.0. Available via Arcee AI API.

Model Specs

Released        2026-04-01
Parameters      400B
Context         256K
Architecture    Sparse Mixture of Experts (MoE)

Related Models on OpenRouter