Tongyi Laboratory announces open-sourcing of Qwen3.6-35B-A3B


ME News Report, April 16 (UTC+8): Tongyi Laboratory has announced the open-sourcing of Qwen3.6-35B-A3B, an efficient model built on a sparse mixture-of-experts (MoE) architecture: 35 billion total parameters, of which only 3 billion are activated per inference step. Despite the small number of activated parameters, its performance remains strong. In agentic coding, it significantly surpasses the previous-generation Qwen3.5-35B-A3B and competes with larger dense models such as Qwen3.5-27B and Gemma-31B. (Source: BlockBeats)
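For readers unfamiliar with why only 3 of 35 billion parameters run per step, the sketch below shows the general idea of sparse MoE routing: a router scores a set of expert networks per token and only the top few are computed. All sizes and names here are toy illustrative assumptions, not Qwen's actual configuration or implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, n_experts, top_k = 8, 16, 2  # toy sizes, not the real Qwen config

# One tiny feed-forward "expert" per slot; only top_k of them run per token.
experts_w = rng.standard_normal((n_experts, d_model, d_model)) * 0.1
router_w = rng.standard_normal((d_model, n_experts)) * 0.1

def moe_forward(x):
    """Route a single token vector x through the top_k highest-scoring experts."""
    logits = x @ router_w                 # router score for each expert
    top = np.argsort(logits)[-top_k:]     # indices of the chosen experts
    gates = np.exp(logits[top])
    gates /= gates.sum()                  # softmax over the chosen experts only
    # Weighted sum of the activated experts' outputs; the remaining
    # n_experts - top_k experts contribute no compute at all this step.
    return sum(g * (x @ experts_w[i]) for g, i in zip(gates, top))

token = rng.standard_normal(d_model)
out = moe_forward(token)
print(out.shape)  # (8,)
```

With these toy numbers, each token touches only 2 of 16 experts, which mirrors how a 35B-parameter MoE model can activate roughly 3B parameters per inference step.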
