LLM Reference
Fireworks AI

Japanese StableLM 70B on Fireworks AI

Japanese StableLM · Stability AI

Provisioned

Pricing

Type            Price (per 1M tokens)
Input tokens    $0.90
Output tokens   $0.90
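Since input and output tokens are billed at the same flat rate, the cost of a request is easy to estimate. A minimal sketch (the $0.90-per-million figures come from the table above; the helper name is illustrative):

```python
# Flat per-token pricing from the table above.
INPUT_PRICE_PER_M = 0.90   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 0.90  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated cost in USD for a single request."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# e.g. a 2,000-token prompt with a 500-token completion:
print(f"${estimate_cost(2_000, 500):.6f}")  # → $0.002250
```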

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution

About Japanese StableLM 70B

Japanese StableLM 70B is a large language model developed by Stability AI Japan, built on the Llama 2 architecture. It is purpose-built for Japanese-language tasks and, at 70 billion parameters, was one of the largest open-source Japanese language models as of November 2023. The model generates fluent Japanese text, answers questions, and follows instructions in Japanese. Its training covered roughly 100 billion tokens from a wide range of Japanese and English sources. Users should still be mindful of its limitations, such as potential biases and contextual errors inherited from the training data. The model is complemented by smaller, faster variants and a "JA-Vocab Beta" version for enhanced performance.

Get Started
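A minimal sketch of calling the model through Fireworks AI's OpenAI-compatible chat completions endpoint, using only the Python standard library. The model identifier below is an assumption; check the Fireworks AI model page for the exact name, and export your key as `FIREWORKS_API_KEY` before running.

```python
import json
import os
import urllib.request

# ASSUMPTION: the exact model identifier may differ; verify it on the
# Fireworks AI model page before use.
MODEL = "accounts/fireworks/models/japanese-stablelm-70b"
API_URL = "https://api.fireworks.ai/inference/v1/chat/completions"

def build_request(prompt: str, max_tokens: int = 256) -> dict:
    """Assemble the JSON body for a chat completions request."""
    return {
        "model": MODEL,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }

def complete(prompt: str) -> str:
    """Send the request and return the assistant's reply text."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_request(prompt)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['FIREWORKS_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Example (requires network access and a valid API key):
# print(complete("日本の首都はどこですか？"))
```

The request/response shape follows the OpenAI chat completions convention that Fireworks exposes; swapping in an SDK such as the `openai` client with a custom base URL works the same way.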

Model Specs

Released        2023-04-10
Parameters      70B
Architecture    Decoder Only

Related Models on Fireworks AI