LLM Reference

SOLAR 10.7B

About

SOLAR 10.7B is a large language model with 10.7 billion parameters, created by Upstage AI in South Korea. It targets high efficiency and performance through "Depth Up-Scaling" (DUS), an approach that deepens the model by adding layers rather than widening it, enhancing capability without a large increase in computational cost. This distinguishes it from models built on more complex techniques such as Mixture of Experts. By initializing a Llama 2 architecture with pre-trained weights from Mistral 7B, SOLAR 10.7B achieves strong performance, outpacing some models with up to 30 billion parameters.

The base model is available under the Apache 2.0 license; a fine-tuned, instruction-following variant is released under CC-BY-NC-4.0 and is optimized for single-turn conversations and a range of NLP tasks, with acknowledged limitations in multi-turn dialogue and complex context. The model is built on the transformer architecture widely used in modern language models.
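The depth up-scaling described above can be sketched as simple layer arithmetic. The specific numbers below (a 32-layer Mistral 7B base, 8 layers trimmed from each copy) come from the SOLAR paper rather than this page, so treat them as assumptions:

```python
def dus_layer_count(base_layers: int, trimmed: int) -> int:
    """Depth Up-Scaling (DUS): duplicate the base model, trim `trimmed`
    final layers from one copy and `trimmed` initial layers from the other,
    then stack the two copies into one deeper model."""
    return 2 * (base_layers - trimmed)

# Assumed values from the SOLAR paper: Mistral 7B has 32 layers and
# 8 are trimmed from each copy, yielding a 48-layer model.
print(dus_layer_count(32, 8))  # → 48
```

The stacked model is then continually pre-trained so the new layer boundary learns to cooperate, which is what keeps DUS cheaper than training a deeper model from scratch.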

Capabilities

Multimodal, Function Calling, Tool Use, JSON Mode

Providers (4)

Provider           Input (per 1M)   Output (per 1M)   Type
NVIDIA NIM         —                —                 Provisioned
Together AI API    $0.30            $0.30             Serverless
Upstage Console    —                —                 Serverless
Azure OpenAI       —                —                 Provisioned
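With the per-token rates in the table, a rough serving-cost estimate is straightforward. The default rates below are Together AI's listed $0.30 per 1M tokens for both input and output; the other providers do not publish per-token prices here:

```python
def cost_usd(input_tokens: int, output_tokens: int,
             input_rate: float = 0.30, output_rate: float = 0.30) -> float:
    """Estimate serving cost in USD given per-1M-token rates
    (defaults are Together AI's listed $0.30/1M for SOLAR 10.7B)."""
    return (input_tokens * input_rate + output_tokens * output_rate) / 1_000_000

# e.g. a workload of 1M input tokens and 500K output tokens
print(round(cost_usd(1_000_000, 500_000), 2))  # → 0.45
```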

Specifications

Parameters      10.7B
Architecture    Decoder Only
Specialization  General
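The parameter count above also gives a quick estimate of the memory needed to hold the weights for inference. The bytes-per-parameter figures below are standard for the listed precisions, not values from this page, and the estimate excludes the KV cache and activations:

```python
def weight_memory_gb(params_billions: float, bytes_per_param: int) -> float:
    """Approximate memory for model weights alone:
    (params in billions) * (bytes per parameter) = gigabytes."""
    return params_billions * bytes_per_param

# SOLAR 10.7B at common inference precisions
for label, nbytes in [("fp16/bf16", 2), ("int8", 1)]:
    print(f"{label}: ~{weight_memory_gb(10.7, nbytes):.1f} GB")
```

At fp16 this works out to roughly 21 GB of weights, which is why the model is commonly served on a single 24 GB or larger GPU, with quantization used on smaller cards.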