LLM Reference

SEA-LION 7B

About

SEA-LION 7B is a large language model built for the Southeast Asian region and part of AI Singapore's SEA-LION family. It is a decoder-only transformer with 7 billion parameters based on the MPT architecture, and it uses a custom SEABPETokenizer with a 256,000-token vocabulary optimized for Southeast Asian languages. The model supports multiple languages, including English, Chinese, and Indonesian, and performs well on tasks such as question answering, sentiment analysis, machine translation, and text summarization. Trained on 980 billion tokens, it captures regional linguistic and cultural nuances and is released with open-source resources.
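The model described above can be used with the Hugging Face `transformers` library; a minimal sketch follows. The model id `aisingapore/sea-lion-7b` and the `trust_remote_code=True` flag (needed because the MPT-based model code and SEABPETokenizer are custom) reflect AI Singapore's published usage, but verify both against the current model card. The prompt template is illustrative, not an official format.

```python
# Sketch: running SEA-LION 7B (a base, non-instruct model) via transformers.
# Assumptions: model id "aisingapore/sea-lion-7b" and trust_remote_code=True,
# per AI Singapore's model card; verify before use.

MODEL_ID = "aisingapore/sea-lion-7b"


def build_prompt(text: str, task: str = "Translate to Indonesian") -> str:
    """Simple zero-shot prompt for a base model.

    The template here is illustrative only; the base model has no
    official chat or instruction format.
    """
    return f"{task}:\n{text}\n"


def generate(prompt: str, max_new_tokens: int = 40) -> str:
    """Load the model and generate a continuation.

    Note: this downloads the full 7B checkpoint on first call.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, trust_remote_code=True)
    inputs = tokenizer(prompt, return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

Calling `generate(build_prompt("Good morning"))` would then produce a freeform continuation; since this is a base model, few-shot examples in the prompt generally improve task adherence.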

Capabilities

Multimodal · Function Calling · Tool Use · JSON Mode

Providers (1)

Provider      Input (per 1M)   Output (per 1M)   Type
NVIDIA NIM    (not listed)     (not listed)      Provisioned

Specifications

Family          SEA-LION
Parameters      7B
Architecture    Decoder Only
Specialization  General