
Yi
About
The Yi family of large language models (LLMs), developed by 01.AI, is a series of language and multimodal models known for strong performance across a broad range of tasks. Built on foundation models with 6 billion and 34 billion parameters, the series includes chat-tuned variants, models with 200K-token context windows, depth-upscaled models, and vision-language models. The Yi models are bilingual in English and Chinese and have achieved top rankings among open-source models for language understanding, commonsense reasoning, reading comprehension, and code generation. Available on platforms such as Hugging Face and ModelScope, they support both academic research and commercial applications, aided by their open-source release and 01.AI's emphasis on high-quality data engineering. The series also includes the multimodal Yi-VL models, which excel at tasks involving both text and images, and the Yi-Coder models, which are optimized for code generation, editing, and long-context comprehension.