Stable LM 2 12B
About
Stable LM 2 12B, developed by Stability AI, is a large language model with 12.1 billion parameters, trained on 2 trillion tokens. Its multilingual training data covers English, Spanish, German, Italian, French, Portuguese, and Dutch. The model is released in both a base and an instruction-tuned variant; the instruction-tuned variant supports tool usage and function calling, making it well suited to pipelines such as retrieval-augmented generation (RAG). It is openly released and available for both commercial and non-commercial use. Its design balances efficiency with performance, so it can be adapted to varied hardware constraints, and while it excels at text and code generation, it also handles complex tasks traditionally reserved for larger models. Ongoing development promises further enhancements, including a planned long-context variant. As with all LLMs, users should be aware of potential biases in its outputs.