Cogito v2 Preview DeepSeek 671B MoE
cogito-v2-preview-deepseek-671b-moe
Open Source
About
Cogito v2 Preview DeepSeek 671B MoE is a 671B-parameter Mixture-of-Experts hybrid reasoning model from Deep Cogito, fine-tuned from DeepSeek V3 using Iterated Distillation and Amplification (IDA). Released in July 2025, it produces reasoning chains roughly 60% shorter than DeepSeek R1's while achieving comparable performance. License not confirmed from primary sources.
Capabilities
Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution, Prompt Caching, Batch API, Audio, Fine-tuning
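Since the card lists Function Calling and Structured Outputs among the capabilities, a chat-style request for this model might be shaped as below. This is a minimal sketch: only the model ID comes from this card; the endpoint convention (OpenAI-compatible chat completions) and every other field are assumptions, not confirmed from Deep Cogito's documentation.

```python
import json

# Hypothetical chat-completions payload; field names follow the common
# OpenAI-compatible convention and are NOT confirmed for this provider.
payload = {
    # Model ID as listed on this card.
    "model": "cogito-v2-preview-deepseek-671b-moe",
    "messages": [
        {"role": "user", "content": "Summarize IDA in one sentence."},
    ],
    # Hypothetical tool definition illustrating the Function Calling
    # capability; the schema format is an assumption.
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "lookup_paper",
                "description": "Look up a paper by title.",
                "parameters": {
                    "type": "object",
                    "properties": {"title": {"type": "string"}},
                    "required": ["title"],
                },
            },
        },
    ],
}

# Serialize as it would be sent in an HTTP request body.
body = json.dumps(payload)
print(body[:60])
```

The payload is built and serialized locally; actually sending it would require the provider's base URL and an API key, neither of which this card specifies.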