
SOLAR 10.7B Japanese on Upstage Console

Solar Mini · Upstage

Serverless

Pricing

Type            Price (per 1M tokens)
Input tokens    $0.30
Output tokens   $0.30
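At a flat $0.30 per million tokens for both input and output, the cost of a call is easy to estimate. A minimal sketch (the helper name is ours, not part of any Upstage SDK):

```python
# Cost estimate at the listed rate: $0.30 per 1M tokens,
# applied equally to input and output tokens.
PRICE_PER_M = 0.30

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request at the listed rates."""
    return (input_tokens + output_tokens) / 1_000_000 * PRICE_PER_M

# e.g. a request with 10,000 input and 2,000 output tokens
cost = estimate_cost(10_000, 2_000)  # 0.0036 USD
```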

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution

About SOLAR 10.7B Japanese

The SOLAR-10.7B Japanese language model has 10.7 billion parameters and is built on the Llama 2 architecture, initialized with weights from Mistral 7B. It uses a technique called Depth Up-Scaling (DUS) to increase its depth in a straightforward way that stays compatible with standard transformer frameworks. Tailored for single-turn conversations, the model excels at generating text, and potentially code, rivaling models with up to 30 billion parameters on certain benchmarks. Its performance varies across tasks and details of its training data are limited, but it remains a capable model for its size.
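The DUS idea can be illustrated with plain lists standing in for transformer layers: duplicate the base model, drop the last few layers from one copy and the first few from the other, then stack the copies. The 32-layer base and 8-layer cut below follow the published SOLAR recipe (32 × 2 − 16 = 48 layers); the function itself is only an illustration, not training code:

```python
def depth_up_scale(layers, m=8):
    """Depth Up-Scaling sketch: stack two copies of the base model,
    removing the final m layers from the first copy and the initial
    m layers from the second, so the seam layers stay contiguous."""
    top = layers[:-m]     # copy 1 without its last m layers
    bottom = layers[m:]   # copy 2 without its first m layers
    return top + bottom

base = list(range(32))          # 32 base layers, as in SOLAR
scaled = depth_up_scale(base)   # 48 layers after up-scaling
```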

Get Started
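A minimal sketch of a single-turn call through an OpenAI-compatible chat-completions endpoint, using only the standard library. The base URL and model id below are assumptions, not confirmed values; check the Upstage Console for the ones to use. Because the model is tuned for single-turn conversations, the request carries exactly one user message rather than a running history:

```python
import json
import urllib.request

API_BASE = "https://api.upstage.ai/v1/solar"  # assumed endpoint
MODEL = "solar-mini-ja"                       # hypothetical model id

def build_chat_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Build a single-turn chat-completion request for the model."""
    body = json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{API_BASE}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("自己紹介をしてください。", "YOUR_API_KEY")
# send with: urllib.request.urlopen(req)
```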

Model Specs

Released        2024-06-24
Parameters      10.7B
Architecture    Decoder Only

Related Models on Upstage Console