LLM Reference

MT0 XL

About

MT0 XL is a multilingual large language model with 3.7 billion parameters, developed by the BigScience workshop as part of the BLOOMZ and mT0 model families. It performs diverse tasks across many languages zero-shot, a capability gained through multitask finetuning: pretrained multilingual models (BLOOM and mT5) are finetuned on xP3, a collection of cross-lingual tasks with English prompts. MT0 XL is initialized from mT5-xl and was finetuned for 10,000 steps on 1.85 billion tokens using TPUv4-128 hardware. It handles tasks such as translation, question answering, and text generation, making it well suited to applications that require linguistic versatility across languages.
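The zero-shot usage described above can be sketched with the Hugging Face transformers API. This is a minimal sketch, not the vendor's official snippet: the Hub checkpoint ID `bigscience/mt0-xl` and the seq2seq classes match the public model card, the `make_prompt` helper is hypothetical, and running `main()` downloads several gigabytes of weights.

```python
# Sketch of zero-shot use of MT0 XL via Hugging Face transformers.
# The `make_prompt` helper is a hypothetical convenience; mT0-style
# models take the task as a plain natural-language instruction.

def make_prompt(instruction: str, text: str) -> str:
    """Compose the instruction-style input the model executes directly."""
    return f"{instruction}: {text}"

def main() -> None:
    # Assumes `transformers` and `torch` are installed; downloads weights.
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    checkpoint = "bigscience/mt0-xl"
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

    prompt = make_prompt("Translate to English", "Je t'aime.")
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=20)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

if __name__ == "__main__":
    main()
```

Because the task is stated in the input text itself, the same loaded model covers translation, question answering, and generation with no task-specific heads.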

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · Structured Outputs · Code Execution

Specifications

Family: MT0
Released: 2024-01-01
Parameters: 3.7B
Architecture: Encoder-Decoder
Specialization: General
Training: Finetuning

Created by

BigScience, a pioneering open-source AI collaboration founded in 2021.