LLM Reference

Breeze 7B

About

Breeze-7B is an open-source large language model from MediaTek Research, built on the Mistral-7B architecture. It excels at processing Traditional Chinese while also offering strong performance in English. Its 62,000-token vocabulary, extended with dedicated Traditional Chinese tokens, improves comprehension and generation in that language and yields roughly twice the inference speed on Traditional Chinese text compared to similar models such as Mistral-7B and Llama-7B. Breeze-7B ships in multiple variants, including a base model and instruction-tuned versions for tasks like question answering and summarization. A variant with a 64k-token context length was also created but later withdrawn due to performance issues. The model is competitive in benchmarks, notably those emphasizing Traditional Chinese.
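The speedup claim above follows from the vocabulary extension: a tokenizer without dedicated Chinese entries falls back to several byte-level tokens per character, so the same sentence takes more decode steps. A minimal sketch of that arithmetic (a toy illustration with assumed per-character token counts, not the real Breeze or Mistral tokenizers):

```python
# Toy illustration of why an extended Traditional Chinese vocabulary
# speeds up inference. Assumption (not from the source): a byte-fallback
# tokenizer spends ~3 tokens per Chinese character (its UTF-8 bytes),
# while a vocabulary with whole-character entries spends ~1.

text = "繁體中文測試"  # 6 Traditional Chinese characters

# Byte-fallback baseline: one token per UTF-8 byte (3 bytes/char here).
baseline_tokens = len(text.encode("utf-8"))

# Extended vocabulary: one token per character.
extended_tokens = len(text)

# Autoregressive generation cost scales with token count, so the
# relative speedup is the ratio of sequence lengths.
speedup = baseline_tokens / extended_tokens
print(baseline_tokens, extended_tokens, speedup)  # 18 6 3.0
```

Real tokenizers mix byte fallback with learned merges, so the observed gain is smaller than this idealized ratio; the roughly 2x figure reported for Breeze-7B is consistent with this mechanism.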

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution

Providers(1)

Provider      Input (per 1M)   Output (per 1M)   Type
NVIDIA NIM    —                —                 Provisioned

Specifications

Family: Breeze
Released: 2023-11-10
Parameters: 7B
Architecture: Decoder Only
Specialization: general
Training: fine-tuning

Created by

MediaTek Research
Pioneering edge AI advancements in research

Hsinchu, Taiwan
Founded 1997
Website