LLM Reference

BGE Multilingual Gemma2

bge-multilingual-gemma2

About

BGE Multilingual Gemma2 is BAAI's large multilingual embedding model built on Google's Gemma 2 9B decoder architecture (42 layers, hidden dimension 3584). It supports instruction-based encoding for retrieval tasks and achieves state-of-the-art results on the MIRACL, MTEB-pl, and MTEB-fr benchmarks, with strong performance across MTEB, C-MTEB, and AIR-Bench. It covers diverse languages including English, Chinese, Japanese, Korean, and French, and supports a 4,096-token context window.
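As a minimal sketch of what instruction-based encoding looks like in practice: BGE-style models prepend a task instruction to the query (but not to the passages) before embedding, and retrieval is scored by cosine similarity. The `<instruct>{task}\n<query>{query}` prompt format below and the `embed` function are assumptions for illustration — `embed` is a toy stand-in for the real model call (e.g. via FlagEmbedding or sentence-transformers), not the actual 9B model.

```python
import math

def format_query(task: str, query: str) -> str:
    # Queries carry a task instruction; passages are embedded as-is.
    # Prompt template is an assumption modeled on BGE conventions.
    return f"<instruct>{task}\n<query>{query}"

def embed(text: str, dim: int = 8) -> list[float]:
    # Toy placeholder embedding: a deterministic, L2-normalized
    # pseudo-vector built from character codes. The real model maps
    # text to a dense vector of much higher dimension.
    vec = [0.0] * dim
    for i, ch in enumerate(text):
        vec[i % dim] += ord(ch)
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

def cosine(a: list[float], b: list[float]) -> float:
    # Vectors are unit-normalized, so the dot product is the cosine.
    return sum(x * y for x, y in zip(a, b))

task = "Given a web search query, retrieve relevant passages that answer the query"
q = embed(format_query(task, "what is a multilingual embedding model?"))
passages = [
    "BGE Multilingual Gemma2 maps text to dense vectors.",
    "Gemma is a family of decoder-only language models.",
]
scores = [cosine(q, embed(p)) for p in passages]
```

With the real model, `embed` would be replaced by the model's encode call and `scores` would rank passages by relevance to the instructed query.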

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution

Specifications

Family: BGE
Released: 2024-06-29
Parameters: 9B
Context: 4K tokens
Architecture: decoder
Specialization: embedding
License: Gemma
Training: pretrained

Created by

BAAI (Beijing Academy of Artificial Intelligence)
Open-source AI fostering global collaboration

Beijing, China
Founded 2018
Website