LLM Reference

Trinity-Large-Preview on OpenRouter

Trinity · Arcee AI

Serverless · Open Source

Get Started with Trinity-Large-Preview on OpenRouter

OpenRouter offers access to Trinity-Large-Preview with a 128K context window. OpenRouter is a multi-provider LLM aggregator that provides unified API access to 300+ models from major labs and emerging providers, with automatic failover for reliability.
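OpenRouter exposes an OpenAI-compatible chat completions endpoint, so a request to this model is a standard JSON POST. A minimal sketch of building such a request follows; the model slug `arcee-ai/trinity-large-preview` is an assumption (check the model's OpenRouter page for the exact identifier), and the API key shown is a placeholder.

```python
import json

# OpenRouter's OpenAI-compatible chat completions endpoint.
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"
MODEL = "arcee-ai/trinity-large-preview"  # assumed slug; verify on OpenRouter


def build_request(prompt: str, api_key: str) -> tuple[dict, dict]:
    """Return (headers, payload) for a chat completion request."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return headers, payload


headers, payload = build_request("Summarize MoE routing in one sentence.", "sk-or-PLACEHOLDER")
print(json.dumps(payload, indent=2))
# To actually send it:
#   requests.post(OPENROUTER_URL, headers=headers, json=payload)
```

Because the endpoint follows the OpenAI schema, existing OpenAI-compatible client libraries can also be pointed at `https://openrouter.ai/api/v1` with the same model slug.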

Pricing

Input tokens (per 1M): Free
Output tokens (per 1M): Free

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution
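Function calling and tool use on OpenRouter follow the OpenAI-style `tools` schema. A minimal sketch of a tool-calling request payload is below; the `get_weather` function is a hypothetical example for illustration, and the model slug is an assumption.

```python
import json

# OpenAI-style tool definition, as forwarded by OpenRouter to models
# that support function calling. "get_weather" is a hypothetical tool.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

payload = {
    "model": "arcee-ai/trinity-large-preview",  # assumed slug
    "messages": [{"role": "user", "content": "What's the weather in Oslo?"}],
    "tools": tools,
    "tool_choice": "auto",  # let the model decide whether to call the tool
}
print(json.dumps(payload, indent=2))
```

When the model decides to call a tool, the response contains a `tool_calls` entry with the function name and JSON arguments; the caller executes the function and sends the result back as a `tool` role message.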

About Trinity-Large-Preview

A 400B-parameter sparse MoE instruct model with 13B active parameters per token, served with 128K context via an 8-bit quantized API. Trained on 20T tokens. Production-ready for agentic and tool-use applications; predecessor to Trinity-Large-Thinking. Available free on OpenRouter.

Model Specs

Released: 2026-01-27
Parameters: 400B
Context: 128K
Architecture: Sparse Mixture of Experts (MoE)

Related Models on OpenRouter