LLM Reference
NVIDIA NIM

SEA-LION 7B on NVIDIA NIM

SEA-LION · AI Singapore

Provisioned

Pricing

Type            Price (per 1M tokens)
Input tokens    Free
Output tokens   Free

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution

About SEA-LION 7B

SEA-LION 7B is a large language model built for the Southeast Asian region, part of AI Singapore's SEA-LION family. It is a 7-billion-parameter decoder-only transformer based on the MPT architecture, with a custom SEABPETokenizer (256,000-token vocabulary) tuned for Southeast Asian languages. Supporting languages including English, Chinese, and Indonesian, it handles tasks such as question answering, sentiment analysis, machine translation, and text summarization. Trained on 980 billion tokens, the model captures regional linguistic and cultural nuances and is available as an open-source release.

Get Started
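NVIDIA NIM endpoints expose an OpenAI-compatible chat-completions API. The sketch below builds a request body for such an endpoint; the model identifier (`aisingapore/sea-lion-7b-instruct`), the endpoint URL, and the `NVIDIA_API_KEY` environment variable are assumptions for illustration — check the catalog entry for the exact values.

```python
import json
import os

# Assumed endpoint and model id; verify against the NIM catalog entry.
NIM_URL = "https://integrate.api.nvidia.com/v1/chat/completions"
MODEL_ID = "aisingapore/sea-lion-7b-instruct"

def build_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build an OpenAI-compatible chat-completion request body."""
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": 0.7,
    }

if __name__ == "__main__":
    import urllib.request

    body = build_request("Terjemahkan ke bahasa Inggris: Selamat pagi.")
    req = urllib.request.Request(
        NIM_URL,
        data=json.dumps(body).encode(),
        headers={
            # Assumes an API key is set in the environment.
            "Authorization": f"Bearer {os.environ['NVIDIA_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

Since both input and output tokens are free on this endpoint, the `max_tokens` cap is mainly a latency control rather than a cost one.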

Model Specs

Released        2024-09-01
Parameters      7B
Architecture    Decoder-only