LLM Reference

Japanese StableLM 70B

About

Japanese StableLM 70B is a large language model developed by Stability AI Japan, built on the Llama 2 architecture. It is purpose-built for Japanese-language tasks and, at 70 billion parameters, was one of the largest open-source Japanese language models as of November 2023. The model generates fluent Japanese text, answers questions, and follows instructions in Japanese. It was trained on approximately 100 billion tokens drawn from a wide range of Japanese and English sources. Users should still be mindful of its limitations, such as potential biases and contextual errors inherited from its training data. The model is complemented by smaller, faster versions and a special "JA-Vocab Beta" version for enhanced performance.
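Because the model derives from Llama 2, its instruct variants are commonly prompted with a Llama-2-style chat template. The sketch below shows one way to assemble such a prompt for a Japanese instruction; the exact template string and the default system message are assumptions for illustration, not details taken from this page.

```python
def build_prompt(user_message: str,
                 system_message: str = "あなたは役立つアシスタントです。") -> str:
    """Build a Llama-2-style chat prompt for an instruct model.

    The [INST]/<<SYS>> template is assumed from the Llama 2 lineage;
    check the model card of the specific variant before relying on it.
    """
    return (
        f"[INST] <<SYS>>\n{system_message}\n<</SYS>>\n\n"
        f"{user_message} [/INST]"
    )

prompt = build_prompt("日本の首都はどこですか？")
print(prompt)
```

The resulting string would be passed to the tokenizer as-is; base (non-instruct) variants take plain text without any template.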

Capabilities

Multimodal
Function Calling
Tool Use
JSON Mode

Providers (1)

Provider | Input (per 1M) | Output (per 1M) | Type
Fireworks AI Platform | | | Provisioned
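Fireworks AI exposes an OpenAI-compatible chat-completions endpoint, so a request to a hosted model is a small JSON body. The sketch below assembles such a body; the endpoint URL is the standard Fireworks inference URL, but the model identifier is a hypothetical placeholder, not a value from this page.

```python
import json

# OpenAI-compatible chat-completions endpoint exposed by Fireworks AI.
API_URL = "https://api.fireworks.ai/inference/v1/chat/completions"
# Hypothetical model identifier, for illustration only.
MODEL_ID = "accounts/fireworks/models/japanese-stablelm-70b"

def build_request(user_message: str, max_tokens: int = 256) -> dict:
    """Assemble the JSON body for a single-turn chat completion."""
    return {
        "model": MODEL_ID,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": user_message}],
    }

body = build_request("日本語で自己紹介してください。")
print(json.dumps(body, ensure_ascii=False, indent=2))
```

The body would be POSTed to API_URL with an `Authorization: Bearer <api key>` header; the response follows the OpenAI chat-completions schema.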

Specifications

Parameters: 70B
Architecture: Decoder-only
Specialization: General