LLM Reference

Qwen1.5-110B on Together AI

Qwen1.5 · Alibaba

Serverless

Pricing

Type: Price (per 1M tokens)
Input tokens: $1.80
Output tokens: $1.80
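Since input and output tokens are billed at the same per-million rate, the cost of a request is a simple weighted sum. A minimal sketch (the function name and token counts are illustrative, not part of the Together AI API):

```python
# Listed rates for Qwen1.5-110B on Together AI: $1.80 per 1M input
# tokens and $1.80 per 1M output tokens.
INPUT_PRICE_PER_M = 1.80
OUTPUT_PRICE_PER_M = 1.80

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the dollar cost of a request at the listed rates."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Example: a call with 10,000 input tokens and 2,000 output tokens
# costs roughly $0.0216.
```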

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution

About Qwen1.5-110B

Qwen1.5-110B is a large language model created by Alibaba Cloud and the largest in the Qwen1.5 series. It is a transformer-based, decoder-only model with 110 billion parameters, optimized for efficiency with SwiGLU activation and Grouped Query Attention (GQA). Pretrained on an extensive dataset, it supports a 32K-token context length and is multilingual, handling languages including English and Chinese. The model performs well on tasks such as text generation and dialogue, features an advanced tokenizer, and delivers competitive results across a wide range of NLP tasks. Quantized versions are available to accommodate different hardware specifications.

Get Started
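Since the model runs serverless on Together AI, it can be reached through Together's OpenAI-compatible chat completions endpoint. A minimal sketch, assuming the model identifier `Qwen/Qwen1.5-110B-Chat` (check the Together AI model list for the exact name):

```python
API_URL = "https://api.together.xyz/v1/chat/completions"

def build_request(prompt: str, max_tokens: int = 256) -> dict:
    """Assemble the JSON payload for a chat completion call.

    The model identifier below is an assumption based on the model's
    published name; verify it against the Together AI model catalog.
    """
    return {
        "model": "Qwen/Qwen1.5-110B-Chat",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

# Sending the request (requires a TOGETHER_API_KEY and network access):
# import json, os, urllib.request
# req = urllib.request.Request(
#     API_URL,
#     data=json.dumps(build_request("Hello")).encode(),
#     headers={
#         "Authorization": f"Bearer {os.environ['TOGETHER_API_KEY']}",
#         "Content-Type": "application/json",
#     },
# )
# print(urllib.request.urlopen(req).read().decode())
```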

Model Specs

Released: 2024-04-25
Parameters: 110B
Architecture: Decoder-only