MT0 Small
About
MT0 Small is the 300-million-parameter member of the BLOOMZ & mT0 model family released by the BigScience workshop. It was produced by multitask finetuning of mT5 on xP3, a cross-lingual mixture of tasks and prompts, which gives it zero-shot cross-lingual generalization: it can follow human instructions in many languages, including on tasks it was never explicitly trained on. Although it performs best with English prompts, MT0 Small generates text robustly across languages, making it suitable for diverse multilingual tasks, and its compact size keeps it efficient in resource-limited settings.
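The instruction-following behavior described above can be tried with a short script. This is a minimal sketch using the Hugging Face transformers seq2seq API and the `bigscience/mt0-small` checkpoint; it is an illustration, not an official usage snippet, and downloading the weights requires network access.

```python
# Minimal sketch: zero-shot instruction following with MT0 Small.
# Assumes the Hugging Face "bigscience/mt0-small" checkpoint is reachable.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_NAME = "bigscience/mt0-small"

def generate(prompt: str, max_new_tokens: int = 50) -> str:
    """Encode an instruction prompt, run the seq2seq model, decode the reply."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # English prompts tend to work best, but the instruction itself
    # can target another language.
    print(generate("Translate to French: The weather is nice today."))
```

Because the model is finetuned for instructions rather than plain continuation, phrasing the input as a task ("Translate to French: …", "Summarize: …") generally yields better results than a bare sentence.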
Capabilities
Multimodal
Function Calling
Tool Use
JSON Mode