LLM Reference

Trinity Nano on Arcee AI

Serverless · Open Source

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution

About Trinity Nano

Trinity Nano is a 6B-parameter sparse Mixture of Experts (MoE) model with roughly 1B active parameters per token and a 128K-token context window, trained on 10T tokens. It is a personality-forward chat model designed for edge deployment and fast inference, currently in experimental/preview status, and released as open source under the Apache 2.0 license.

Get Started
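
Since the model is listed as serverless, one way to get started is through an OpenAI-compatible chat completions endpoint. The sketch below only assembles the request body; the model identifier used here is an assumption for illustration, and the actual endpoint URL, model ID, and authentication should be taken from Arcee AI's own documentation.

```python
import json

# NOTE: "arcee-ai/trinity-nano" is a hypothetical model identifier used
# for illustration; check the provider's docs for the real one.
def build_chat_request(prompt: str, model: str = "arcee-ai/trinity-nano") -> dict:
    """Assemble an OpenAI-compatible /chat/completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }

payload = build_chat_request("Summarize sparse MoE routing in one sentence.")
print(json.dumps(payload, indent=2))
```

The same payload shape works with any OpenAI-compatible client or a plain HTTP POST, so switching providers only means changing the base URL and model string.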

Model Specs

Released: 2025-02-01
Parameters: 6B
Context: 128K
Architecture: Sparse Mixture of Experts (MoE)