Tongyi Qianwen open-sources Qwen3.6-27B: the 27B dense model's coding ability surpasses the previous-generation 397B flagship.

According to Dongcha Beating monitoring, Alibaba's Tongyi Qianwen team has open-sourced Qwen3.6-27B, a 27-billion-parameter dense multimodal model focused primarily on coding-agent capabilities. It is the third member of the Qwen3.6 series, following the API-only Qwen3.6-Plus and the low-activation MoE variant Qwen3.6-35B-A3B, and the weights are already available on Hugging Face and ModelScope.

The core selling point is that the 27B dense architecture fully surpasses the previous-generation open-source flagship Qwen3.5-397B-A17B (an MoE model with 397B total parameters, 17B activated). On coding-agent benchmarks it leads across the board: SWE-bench Verified 77.2 vs. 76.2, SWE-bench Pro 53.5 vs. 50.9, Terminal-Bench 2.0 59.3 vs. 52.5, and SkillsBench 48.2 vs. 30.0. On reasoning tasks it scores 87.8 on GPQA Diamond, close to models several times its size. For visual agents, it reaches 70.3 on AndroidWorld, up from Qwen3.5-27B's 64.2.

The model natively supports image and video input, and its thinking and non-thinking modes share a single set of weights. Because the architecture is dense, there is no MoE routing, which makes deployment simpler than the 397B MoE. Official documentation says it connects directly to three terminal coding tools: OpenClaw, Claude Code, and Qwen Code. The API will launch on Alibaba Cloud's Bailian platform.
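Since the article says the API will be served through Alibaba Cloud's Bailian platform, which exposes an OpenAI-compatible chat-completions endpoint, a minimal request could be sketched as follows. This is an illustration only: the model id "qwen3.6-27b" and the `enable_thinking` field are assumptions, not confirmed by the announcement; check the platform's model list and docs for the actual identifiers.

```python
# Hedged sketch: building an OpenAI-compatible chat request for the
# model via Alibaba Cloud's compatible-mode endpoint. Uses only the
# standard library; the request itself is not sent here.
import json
import urllib.request

BASE_URL = "https://dashscope.aliyuncs.com/compatible-mode/v1"

def build_chat_request(prompt: str, thinking: bool = True) -> dict:
    """Build an OpenAI-style chat payload.

    "qwen3.6-27b" is an assumed model id, and "enable_thinking" is an
    assumed switch mirroring the thinking/non-thinking modes that share
    one set of weights -- verify both against the platform docs.
    """
    return {
        "model": "qwen3.6-27b",  # assumed model id
        "messages": [{"role": "user", "content": prompt}],
        "enable_thinking": thinking,  # assumed parameter name
    }

payload = build_chat_request("Write a binary search in Python.")
req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer $DASHSCOPE_API_KEY",  # placeholder key
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req)  # requires a valid API key; not run here
```

Because the endpoint follows the OpenAI wire format, the same payload shape should also work with the terminal tools mentioned above once they are pointed at a compatible base URL.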
