## Pricing
| Type | Price (per 1M tokens) |
|---|---|
| Input tokens | $1.80 |
| Output tokens | $1.80 |
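With both rates taken from the table above, per-request cost is a simple linear function of token counts. A minimal sketch (the function name and token counts are illustrative, not part of any official SDK):

```python
# Estimate request cost from the MT0 XXL price table above.
# Rates are USD per 1M tokens, as listed in the Pricing section.
INPUT_PRICE_PER_M = 1.80   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 1.80  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost for a single request."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# A 1,000-token prompt with a 500-token completion:
print(f"${estimate_cost(1000, 500):.4f}")  # → $0.0027
```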
## Capabilities
Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution
## About MT0 XXL
The MT0 XXL model, part of the BLOOMZ and mT0 family developed by the BigScience workshop, is a large language model designed for multilingual applications with zero-shot capabilities. It has 13 billion parameters and uses a transformer architecture with an encoder-decoder structure. Trained with multitask finetuning on the xP3 dataset, MT0 XXL performs well on tasks such as translation, question answering, text generation, and summarization. Although powerful, the model's performance hinges on prompt specificity and the diversity of its training data, and running it is resource-intensive, which may limit its accessibility for smaller organizations.
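The zero-shot translation use described above can be sketched with the Hugging Face `transformers` library and the published `bigscience/mt0-xxl` checkpoint. Note this is a minimal illustration, not an official recipe; loading the ~13B-parameter weights requires substantial GPU memory (or CPU offloading via `accelerate`):

```python
# Minimal zero-shot translation sketch for MT0 XXL.
# Assumes the bigscience/mt0-xxl checkpoint on the Hugging Face Hub;
# the ~13B encoder-decoder weights need significant memory to load.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "bigscience/mt0-xxl"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# mT0 models follow natural-language instructions directly,
# so the task is stated in the prompt rather than via a task prefix.
inputs = tokenizer("Translate to English: Je t'aime.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```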
## Model Specs
| Spec | Value |
|---|---|
| Released | 2024-01-01 |
| Parameters | 13B |
| Architecture | Encoder-Decoder |