LLM Reference

Cogito v2 Preview Llama 109B MoE

cogito-v2-preview-llama-109b-moe

Open Source

About

Cogito v2 Preview Llama 109B MoE is a 109B-parameter Mixture-of-Experts hybrid reasoning model from Deep Cogito, fine-tuned from a Llama MoE base using Iterated Distillation and Amplification (IDA). It was released in July 2025; the license has not been confirmed from primary sources.

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution, Prompt Caching, Batch API, Audio, Fine-tuning

Specifications

Family: Cogito
Released: 2025-07-31
Parameters: 109B
Architecture: MoE
Specialization: General
Training: Fine-tuned
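
Since the card lists function calling and structured outputs among the capabilities, integrations with this model typically go through an OpenAI-compatible chat completions API. Below is a minimal sketch of such a request payload; the request schema is an assumption (standard OpenAI-style chat format), and only the model slug `cogito-v2-preview-llama-109b-moe` comes from this page.

```python
import json

# Hypothetical request payload for an OpenAI-compatible chat endpoint.
# Only the model slug is taken from this reference page; the schema,
# roles, and parameters below follow the common chat-completions format.
payload = {
    "model": "cogito-v2-preview-llama-109b-moe",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize IDA in one sentence."},
    ],
    "temperature": 0.7,
}

# Serialize for sending as the JSON body of a POST request.
print(json.dumps(payload, indent=2))
```

The actual endpoint URL and authentication depend on whichever provider hosts the open weights; the payload shape stays the same across OpenAI-compatible servers.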

Created by

Deep Cogito: "Building general superintelligence through advanced reasoning and iterative self-improvement."
San Francisco, California, United States
Founded 2024