LLM Reference

Japanese StableLM Alpha 7B

About

Japanese StableLM Alpha 7B is a large language model built for Japanese text generation and comprehension. Created by Stability AI, this 7-billion-parameter model is based on the GPT-NeoX transformer architecture and produces high-quality, fluent Japanese text. Trained on a large dataset of predominantly Japanese and English text, along with a small portion of source code, the model handles code-switching between the two languages well. It is geared toward tasks such as creative writing, dialogue generation, and summarization, though it may reflect biases present in its training data. A fine-tuned variant, Japanese StableLM Instruct Alpha 7B, is better at following instructions and generating context-relevant responses.

Capabilities

Multimodal, Function Calling, Tool Use, JSON Mode

Specifications

Parameters: 7B
Architecture: Decoder Only
Specialization: General