LLM Reference

Qwen3.5-235B-A22B

About

Largest MoE model in the Qwen3.5 series, with 235B total parameters (22B active per token). Delivers frontier-level performance on reasoning, coding, and long-context tasks while maintaining inference efficiency.

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution
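The Function Calling and Structured Outputs capabilities are typically exercised through an OpenAI-compatible chat API. The request body below is a minimal sketch assuming that convention; the model identifier and the `get_weather` tool are illustrative assumptions, not names confirmed by this reference.

```python
import json

# Hedged sketch of a function-calling request in the OpenAI-compatible
# format many Qwen deployments accept. The model id and the get_weather
# tool schema are hypothetical placeholders.
request = {
    "model": "qwen3.5-235b-a22b",  # assumed identifier
    "messages": [
        {"role": "user", "content": "What's the weather in Hangzhou?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool
                "description": "Look up current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    "tool_choice": "auto",
}

# This JSON string is what would be POSTed to the chat completions endpoint.
body = json.dumps(request)
```

The model decides whether to emit a tool call; the caller then executes the named function and returns its result in a follow-up message.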

Benchmark Scores

Benchmark        Score   Version   Source
SWE-bench Pro    35.2              DAT-1778

Specifications

Family: Qwen3.5
Released: 2026-02-24
Parameters: 235B
Context: 512k
Architecture: MoE
Specialization: general
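The MoE architecture noted above is what lets a 235B-parameter model run only 22B parameters per token: a gating network scores all experts and routes each token to a small top-k subset. The sketch below illustrates that routing step; the expert count and top-k value are illustrative assumptions, not Qwen3.5's actual configuration.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of logits."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def route(gate_logits, top_k=2):
    """Pick the top_k experts for one token and renormalize their weights.

    Only these experts run for this token, which is why the active
    parameter count (22B of 235B above) is a fraction of the total.
    """
    probs = softmax(gate_logits)
    top = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:top_k]
    total = sum(probs[i] for i in top)
    return [(i, probs[i] / total) for i in top]

# One token's gate logits over 8 hypothetical experts.
logits = [0.1, 2.0, -1.0, 0.5, 1.5, -0.3, 0.0, 0.2]
chosen = route(logits, top_k=2)  # experts 1 and 4 win here
```

Each expert's output is then combined using the renormalized weights, so compute per token scales with top-k rather than with the total expert count.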

Created by

AI research institute of Alibaba Group.

Hangzhou, Zhejiang, China
Founded 2017