Mamba 2 1.3B
About
Mamba 2 1.3B is a language model built on the Mamba-2 state-space architecture, which handles long sequences efficiently by replacing attention with selective state-space (SSD) layers whose compute and memory scale linearly with sequence length rather than quadratically. Comprising approximately 1.3 billion parameters, the model performs well at text generation and language understanding, and is particularly suited to tasks that involve long documents or multi-turn dialogues. Its main limitations are weaker in-context learning than comparably sized Transformer models, biases inherited from its training data, and nontrivial hardware requirements for efficient inference.
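A minimal sketch of how one might run the model for text generation. It assumes the reference `mamba_ssm` package, the `state-spaces/mamba2-1.3b` checkpoint on Hugging Face, and the GPT-NeoX tokenizer used by the reference implementation's examples; none of these are confirmed by this page.

```python
# Hedged sketch: assumes the `mamba_ssm` package (pip install mamba-ssm)
# and the `state-spaces/mamba2-1.3b` checkpoint. The fused SSD kernels
# require a CUDA GPU.
import torch
from transformers import AutoTokenizer
from mamba_ssm.models.mixer_seq_simple import MambaLMHeadModel

device = "cuda"
# Assumption: Mamba-2 checkpoints reuse the GPT-NeoX tokenizer, as in the
# reference implementation's generation examples.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
model = MambaLMHeadModel.from_pretrained(
    "state-spaces/mamba2-1.3b", device=device, dtype=torch.float16
)

prompt = "State-space models handle long sequences by"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(device)

# Generation cost grows linearly with length: each SSD layer keeps a
# fixed-size recurrent state instead of a growing attention cache.
out = model.generate(
    input_ids=input_ids,
    max_length=128,
    temperature=0.7,
    top_k=50,
    top_p=0.9,
)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```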
Capabilities
Multimodal, Function Calling, Tool Use, JSON Mode