LLM Reference

MT0 XL

About

MT0 XL is a 3.7-billion-parameter multilingual large language model developed by the BigScience workshop. It belongs to the BLOOMZ and mT0 model families, which are designed to follow task instructions in many languages zero-shot. This capability comes from multitask finetuning: the pretrained multilingual models BLOOM and mT5 were finetuned on xP3, a collection of cross-lingual tasks and prompts. MT0 XL shares its architecture with mT5-xl (an encoder-decoder Transformer) and was finetuned for 10,000 steps on 1.85 billion tokens using TPUv4-128 hardware. It handles tasks such as translation, question answering, and text generation, making it suitable for applications that require linguistic versatility across many languages.
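As a minimal sketch of zero-shot prompting with this model, the snippet below loads the bigscience/mt0-xl checkpoint through the Hugging Face Transformers library (an assumed setup; it requires the transformers, torch, and sentencepiece packages, and downloads several gigabytes of weights on first run):

```python
# Minimal sketch: zero-shot prompting with mt0-xl via Hugging Face Transformers.
# Assumes transformers, torch, and sentencepiece are installed.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "bigscience/mt0-xl"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
# mt0-xl is an encoder-decoder (seq2seq) model, so it loads with
# AutoModelForSeq2SeqLM rather than a causal-LM class.
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# The model follows natural-language task descriptions zero-shot,
# e.g. a translation instruction written directly in the prompt.
prompt = "Translate to English: Je t'aime."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(inputs.input_ids, max_new_tokens=20)
result = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(result)
```

The same pattern covers the other task types mentioned above (question answering, text generation): only the instruction text in the prompt changes.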

Capabilities

Multimodal: No
Function Calling: No
Tool Use: No
JSON Mode: No

Specifications

Family: MT0
Parameters: 3.7B
Architecture: Encoder-Decoder
Specialization: General