LLM Reference
NVIDIA NIM

Breeze 7B on NVIDIA NIM

Breeze · MediaTek-Research

Provisioned

Pricing

Type          | Price (per 1M)
Input tokens  | Free
Output tokens | Free

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution

About Breeze 7B

Breeze-7B is an open-source large language model from MediaTek Research, built on the Mistral-7B architecture. It excels at processing Traditional Chinese while also offering strong performance in English. Its expanded 62,000-token vocabulary improves comprehension and generation in Traditional Chinese, yielding roughly double the inference speed on Traditional Chinese text compared to similar models such as Mistral-7B and Llama-7B. Breeze-7B ships in multiple variants, including a base model and instruction-tuned versions for tasks like question answering and summarization. A variant with a 64k-token context length was also released but later withdrawn due to performance issues. The model is competitive on benchmarks, particularly those emphasizing Traditional Chinese.

Get Started
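NIM-hosted models are typically reachable through an OpenAI-compatible chat-completions endpoint. The sketch below shows one way to call Breeze 7B over that interface using only the Python standard library; the endpoint URL and model ID are assumptions for illustration, so check the NIM catalog entry for the exact values before use.

```python
import json
import urllib.request

# Assumed values -- verify against the NIM catalog entry for Breeze 7B.
NIM_URL = "https://integrate.api.nvidia.com/v1/chat/completions"
MODEL_ID = "mediatek/breeze-7b-instruct"


def build_chat_payload(prompt: str, max_tokens: int = 256) -> dict:
    """Assemble an OpenAI-style chat-completions request body."""
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": 0.7,
    }


def ask_breeze(prompt: str, api_key: str) -> str:
    """POST the payload to the NIM endpoint and return the reply text."""
    req = urllib.request.Request(
        NIM_URL,
        data=json.dumps(build_chat_payload(prompt)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the endpoint is OpenAI-compatible, the official `openai` Python client can be pointed at the same base URL instead of hand-rolling the request, which is usually the more convenient option in application code.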

Model Specs

Released:     2023-11-10
Parameters:   7B
Architecture: Decoder-only