LLM Reference

SeaLLM 7B V2

About

SeaLLM-7B-v2 is a state-of-the-art, open-source multilingual large language model built to support at least ten Southeast Asian languages, covering both Latin and non-Latin scripts, and so addresses their underrepresentation in existing models. It excels at reasoning, surpasses GPT-3.5 on certain benchmarks, and performs strongly in machine translation and instruction following. The model is trained efficiently, using language-specific neuron training to reduce cost while maintaining high performance. SeaLLM-7B-v2 also includes safety measures for trustworthy outputs, though developers are encouraged to run their own security checks to address any residual risks.
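As a rough illustration of how an open-weights chat model like this is typically queried, here is a minimal sketch using Hugging Face `transformers`. The model id `SeaLLMs/SeaLLM-7B-v2` and the ChatML-style prompt format are assumptions not confirmed by this page; consult the official model card before relying on either.

```python
# Minimal sketch of querying SeaLLM-7B-v2 via Hugging Face transformers.
# The model id and the ChatML-style chat format are ASSUMPTIONS --
# verify both against the official model card.

def build_prompt(messages):
    """Flatten a list of {"role": ..., "content": ...} messages into a
    ChatML-style prompt string (assumed format)."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    parts.append("<|im_start|>assistant\n")  # cue the model to respond
    return "\n".join(parts)


def generate_reply(messages, model_id="SeaLLMs/SeaLLM-7B-v2"):
    """Load the model and generate one reply. Requires a GPU with
    roughly 16 GB of memory for fp16 7B weights; not called here."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tok(build_prompt(messages), return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=128)
    return tok.decode(out[0], skip_special_tokens=True)
```

In practice, prefer the tokenizer's built-in `apply_chat_template` when the model card ships a chat template, since that removes any guesswork about the prompt format.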

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution

Specifications

Family: SeaLLM 2
Released: 2024-07-01
Parameters: 7B
Architecture: Decoder Only
Specialization: General
Training: Fine-tuning

Created by

The AI research institute of Alibaba Group.

Hangzhou, Zhejiang, China
Founded 2017