LLM Reference

Breeze 8x7B

About

No model named Breeze 8x7B appears in the available information. MediaTek Research does, however, publish two closely related models: Breeze-7B and Breexe-8x7B. Breeze-7B is an open-source model built on Mistral-7B and optimized for Traditional Chinese comprehension and chatbot use, showing strong benchmark results after extensive pretraining and fine-tuning. Breexe-8x7B, derived from Mixtral-8x7B, is likewise tailored to Traditional Chinese applications; its Breexe-8x7B-Instruct variant matches OpenAI's GPT-3.5-turbo-1106 on benchmarks while running inference at roughly twice the speed of Mixtral-8x7B. Both models improve performance through techniques such as vocabulary expansion. A description of a distinct Breeze 8x7B model would require additional details.

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution

Specifications

Family: Breeze
Released: 2023-11-10
Parameters: 7B
Architecture: Decoder-only
Specialization: General
Training: Fine-tuning

Created by

MediaTek Research
Pioneering edge AI advancements in research

Hsinchu, Taiwan
Founded 1997