LLM Reference

Swallow 13B

Open Source

About

Dense 13B base model built on Llama 2 by continued pretraining on large Japanese corpora, with an expanded Japanese vocabulary for enhanced Japanese/English bilingual understanding.

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, JSON Mode, Code Execution

Specifications

Family: Swallow
Released: 2024-12-05
Parameters: 13,000,000,000 (13B)
Context: 8,192 tokens
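The parameter count above translates directly into a minimum memory budget for serving the model. A minimal sketch of that arithmetic, assuming weights only (activations and KV cache are extra) and a hypothetical helper not tied to any Swallow tooling:

```python
# Rough weight-memory estimate for a dense 13B-parameter model.
# Hypothetical helper for illustration; covers weights only --
# activations and the KV cache add to this at inference time.
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

def weight_memory_gib(n_params: int, dtype: str = "fp16") -> float:
    """Approximate memory for the weights alone, in GiB."""
    return n_params * BYTES_PER_PARAM[dtype] / 1024**3

if __name__ == "__main__":
    for dtype in ("fp32", "fp16", "int8", "int4"):
        print(f"{dtype}: {weight_memory_gib(13_000_000_000, dtype):.1f} GiB")
```

At half precision this works out to roughly 24 GiB for the weights, which is why a 13B model typically needs a 32 GB-class GPU (or quantization) to serve.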