
Japanese StableLM Gamma 7B on Fireworks AI

Japanese StableLM · Stability AI

Provisioned

Pricing

Type          | Price (per 1M tokens)
Input tokens  | $0.20
Output tokens | $0.20
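As an illustration of the flat per-token pricing above, here is a minimal sketch (the helper name is ours, not part of any Fireworks SDK) for estimating the cost of a request:

```python
# Per-1M-token prices from the pricing table above (USD).
INPUT_PRICE_PER_M = 0.20
OUTPUT_PRICE_PER_M = 0.20

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one request at the listed rates."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# e.g. 5,000 prompt tokens and 1,000 completion tokens:
print(f"${estimate_cost(5_000, 1_000):.6f}")  # → $0.001200
```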

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution

About Japanese StableLM Gamma 7B

Japanese StableLM Base Gamma 7B is a language model specialized for Japanese text processing, using a decoder-only transformer architecture to predict the next token in a sequence. Trained on roughly 100 billion Japanese tokens drawn from diverse sources such as Wikipedia and OSCAR, it covers a wide range of Japanese language styles. Beyond generating coherent text and general language modeling, it is a suitable base for fine-tuning on downstream applications such as text classification and machine translation. It should nevertheless be deployed with care, given potential biases in the training data and weaker contextual comprehension compared to encoder-decoder models.

Get Started
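To get started, Fireworks AI serves models through an OpenAI-compatible REST API. The sketch below builds a completions request payload; the model identifier and endpoint path are assumptions based on Fireworks' usual naming conventions, so verify both against the model page before use.

```python
import json
import urllib.request

# Assumed model id -- check the Fireworks model page for the exact value.
MODEL_ID = "accounts/stability/models/japanese-stablelm-gamma-7b"

def build_completion_request(prompt: str, max_tokens: int = 128) -> dict:
    """Build a payload for an OpenAI-compatible /v1/completions endpoint."""
    return {
        "model": MODEL_ID,
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": 0.7,
    }

def post_completion(api_key: str, payload: dict) -> dict:
    """POST the payload with a bearer token (live network call; needs a real key)."""
    req = urllib.request.Request(
        "https://api.fireworks.ai/inference/v1/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_completion_request("日本の首都は")
print(json.dumps(payload, ensure_ascii=False, indent=2))
```

Because the API is OpenAI-compatible, the same payload also works with OpenAI client libraries pointed at the Fireworks base URL.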

Model Specs

Released: 2023-04-10
Parameters: 7B
Architecture: Decoder Only
