LLM Reference

Cogito v2 Preview DeepSeek 671B MoE

cogito-v2-preview-deepseek-671b-moe

Open Source

About

Cogito v2 Preview DeepSeek 671B MoE is a 671B-parameter Mixture-of-Experts hybrid reasoning model from Deep Cogito, fine-tuned from DeepSeek V3 using Iterated Distillation and Amplification (IDA). Released in July 2025, it uses reasoning chains roughly 60% shorter than DeepSeek R1's while achieving comparable performance. Its license has not been confirmed from primary sources.
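As one illustration of how the open-weights checkpoint might be queried, here is a minimal sketch, not official usage: it assumes a self-hosted OpenAI-compatible server (for example vLLM), and the base URL, model id, and the system-prompt toggle for the hybrid reasoning mode are all assumptions.

```python
# A minimal sketch, assuming a self-hosted OpenAI-compatible endpoint
# (e.g. a local vLLM server). Base URL, model id, and the reasoning
# toggle are assumptions, not confirmed usage.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

MODEL = "deepcogito/cogito-v2-preview-deepseek-671B-MoE"  # assumed model id

# Standard (non-reasoning) request.
direct = client.chat.completions.create(
    model=MODEL,
    messages=[{"role": "user", "content": "Summarize IDA in two sentences."}],
)
print(direct.choices[0].message.content)

# Hybrid reasoning mode: Cogito models reportedly expose an opt-in
# "thinking" mode; the exact system-prompt trigger below is an assumption.
reasoned = client.chat.completions.create(
    model=MODEL,
    messages=[
        {"role": "system", "content": "Enable deep thinking subroutine."},
        {"role": "user", "content": "Prove that sqrt(2) is irrational."},
    ],
)
print(reasoned.choices[0].message.content)
```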

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution · Prompt Caching · Batch API · Audio · Fine-tuning
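When the model is self-hosted behind an OpenAI-compatible server, capabilities such as function calling and tool use are typically exercised through the standard tools request format. The sketch below assumes such a deployment; the endpoint, model id, and the get_weather tool are all illustrative, not part of any official API.

```python
# A hedged sketch of function calling via an assumed OpenAI-compatible
# server; the tool definition follows the standard OpenAI tools schema.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool, for illustration only
        "description": "Look up current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

resp = client.chat.completions.create(
    model="deepcogito/cogito-v2-preview-deepseek-671B-MoE",  # assumed model id
    messages=[{"role": "user", "content": "What's the weather in Tokyo?"}],
    tools=tools,
)
# If the model elects to call the tool, arguments arrive as a JSON string.
print(resp.choices[0].message.tool_calls)
```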

Specifications

Family: Cogito
Released: 2025-07-31
Parameters: 671B
Architecture: Mixture-of-Experts (MoE)
Specialization: General
Training: Fine-tuned
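For a rough sense of scale, weight memory can be estimated directly from the parameter count and numeric precision. The arithmetic below is back-of-the-envelope; the ~37B active-parameter figure is the one commonly cited for the DeepSeek V3 base architecture and is an assumption here, not a confirmed spec of this checkpoint.

```python
# Back-of-the-envelope weight-memory arithmetic for a 671B-parameter MoE.
# ACTIVE_PARAMS (~37B per token) is assumed from the DeepSeek V3 base
# architecture; all figures are rough estimates, not measured numbers.
TOTAL_PARAMS = 671e9
ACTIVE_PARAMS = 37e9  # MoE routes each token through a subset of experts

for name, bytes_per_param in [("FP16/BF16", 2), ("FP8", 1), ("INT4", 0.5)]:
    total_gb = TOTAL_PARAMS * bytes_per_param / 1e9
    active_gb = ACTIVE_PARAMS * bytes_per_param / 1e9
    # Active parameters drive per-token compute/bandwidth, not resident memory.
    print(f"{name}: ~{total_gb:,.0f} GB of weights "
          f"(~{active_gb:,.0f} GB of weights touched per token)")
```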

Created by

Deep Cogito
Building general superintelligence through advanced reasoning and iterative self-improvement.

San Francisco, California, United States
Founded 2024