LLM Reference

Trinity-Large-Thinking on Arcee AI

Trinity · Arcee AI

Open Source

Get Started with Trinity-Large-Thinking on Arcee AI

Arcee AI, a custom model fine-tuning and inference API platform, offers access to Trinity-Large-Thinking with a 256K context window.

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution

About Trinity-Large-Thinking

Arcee AI's flagship 400B sparse MoE reasoning model with 13B active parameters per token. Trained on 20T tokens with a STEM-focused curriculum. Designed for agentic workflows, chain-of-thought reasoning, and long-context tasks up to 256K tokens (BF16 API). Open-source under Apache 2.0. Available via Arcee AI API.
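Since the model is served through an API, a typical call builds an OpenAI-style chat-completions request. The sketch below constructs such a request payload; the base URL and model identifier are assumptions for illustration, not confirmed values from Arcee AI's documentation.

```python
import json

# Assumed endpoint and model id -- verify against Arcee AI's API docs.
BASE_URL = "https://api.arcee.ai/v1/chat/completions"

payload = {
    "model": "trinity-large-thinking",  # hypothetical model identifier
    "messages": [
        {
            "role": "user",
            "content": "Summarize chain-of-thought reasoning in one sentence.",
        }
    ],
    # Reasoning models emit intermediate thinking tokens, so leave
    # generous headroom in the completion budget.
    "max_tokens": 1024,
}

# Serialized request body, ready to POST with an Authorization header.
body = json.dumps(payload)
```

From here, the body would be sent with any HTTP client (e.g. `requests.post(BASE_URL, data=body, headers={...})`) along with an API key.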

Model Specs

Released: 2026-04-01
Parameters: 400B
Context: 256K
Architecture: Sparse Mixture of Experts (MoE)