Yi 9B
About
Yi-9B is a bilingual (English and Chinese) large language model developed by 01.AI. It has roughly 8.8 billion parameters and a context length of 4,096 tokens. Built on Yi-6B, it was further trained on an additional 0.8 trillion tokens, bringing its total training corpus to 3.9 trillion tokens, with a data cutoff of June 2023. This continued training notably improved its performance on coding and mathematical tasks while preserving its bilingual strength. Yi-9B is available for both personal and commercial use and is published on Hugging Face.
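Since the model is published on Hugging Face, it can typically be loaded with the `transformers` library. The sketch below is a minimal, hedged example: the repository id "01-ai/Yi-9B" and the dtype/device settings are assumptions drawn from common Hugging Face conventions, not from this card, so verify them against the model page before use.

```python
# Minimal sketch of loading Yi-9B via the `transformers` library.
# Assumption: the Hugging Face repo id is "01-ai/Yi-9B" (check the model page).

MODEL_ID = "01-ai/Yi-9B"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    # Imports live inside the function so this module can be imported and
    # inspected even where torch/transformers are not installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # assumed dtype; fp16 also common
        device_map="auto",           # place weights on available devices
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Downloads ~18 GB of weights on first run.
    print(generate("Write a Python function that reverses a string."))
```

The heavy model download only happens when the script is run directly, which keeps the sketch cheap to import or adapt.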
Capabilities
Multimodal, Function Calling, Tool Use, JSON Mode
Specifications
Family: Yi (2023/11)
Released: 2024-03-06
Parameters: 9B
Architecture: Decoder Only
Specialization: General