LLM Reference

AFM 4.5B

Open Source

About

Arcee Foundation Model (AFM) 4.5B is a dense 4.5B-parameter model trained on 8T tokens, built as a proof of concept for the Trinity MoE series. Its context window was extended from the original 32K to 64K in a subsequent update. The full open-source release came in July 2025 under Apache 2.0. Note: the model was previously mislabeled as 'Trinity 4.5B' in some databases; AFM is a separate model family from the Trinity MoE line.

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution
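Since the capability list includes function calling and tool use, a minimal host-side dispatcher can sketch how those capabilities are typically consumed. The JSON tool-call format, the `get_weather` tool, and the `dispatch_tool_call` helper below are illustrative assumptions; this page does not document AFM's actual tool-call schema.

```python
import json

# Assumed tool registry and call format for illustration only; AFM's real
# function-calling protocol is not specified on this page.
def get_weather(city: str) -> str:
    # Stub implementation standing in for a real tool.
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

def dispatch_tool_call(raw: str) -> str:
    """Parse a model-emitted JSON tool call and run the matching function."""
    call = json.loads(raw)
    fn = TOOLS[call["name"]]          # look up the requested tool by name
    return fn(**call["arguments"])    # invoke it with the model's arguments

# Example: a tool call as a model might emit it.
model_output = '{"name": "get_weather", "arguments": {"city": "Paris"}}'
print(dispatch_tool_call(model_output))  # Sunny in Paris
```

In a real loop, the tool's return value would be fed back to the model as a tool-result message so it can produce the final answer.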

Specifications

Family: AFM
Released: 2025-07-29
Parameters: 4.5B
Context: 64K
Architecture: Dense Transformer
License: Apache 2.0
Training: Pretrained

Created by

Arcee AI
Agentic AI workflows, efficient and secure

San Francisco, California, United States
Founded 2023