Ant open-sources flagship Ling-2.6-1T: 1 trillion parameters, 63B activated per token, MIT license


CryptoWorld News: Ant Bailing (Inclusion AI) has officially open-sourced its flagship model, Ling-2.6-1T. The model totals 1 trillion parameters, of which roughly 63 billion (63B) are activated per token. It uses a Mixture-of-Experts (MoE) architecture, supports a 256K context length, and is released under the MIT license. The Flash version (104 billion total parameters / 7.4 billion activated) targets lightweight, high-speed use, while the 1T version targets complex task scenarios. The release also adds a new “Fast Thinking” training strategy: a “Context Process Redundancy Suppression” reward compresses verbose chain-of-thought output and reduces token consumption.
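To illustrate why a 1T-parameter MoE model activates only a small fraction of its weights per token, here is a minimal sketch of top-k expert routing in PyTorch. The expert count, top-k value, and layer sizes are illustrative assumptions, not Ling's actual configuration:

```python
# Minimal sketch of top-k MoE routing (illustrative; not Ling's actual code).
# Only the k experts the router selects run for each token, so the activated
# parameter count is roughly (k / num_experts) of the total expert weights.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, num_experts=64, top_k=4):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts)  # scores each expert per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):  # x: (num_tokens, d_model)
        scores = self.router(x)                         # (num_tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # keep only the k best experts
        weights = F.softmax(weights, dim=-1)            # normalize their mixing weights
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e in idx[:, slot].unique():
                mask = idx[:, slot] == e                # tokens routed to expert e
                out[mask] += weights[mask, slot, None] * self.experts[int(e)](x[mask])
        return out

moe = TopKMoE()
print(moe(torch.randn(8, 512)).shape)  # torch.Size([8, 512])
```

With the assumed 64 experts and top-4 routing, about 6% of the expert parameters run per token, the same order of magnitude as Ling-2.6-1T's 63B-of-1T activation ratio.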

On evaluations, the model scores 72.2% on SWE-Bench Verified (the Flash version scores 61.2%) and performs strongly on benchmarks such as AIME 2026, BFCL-V4, TAU2-Bench, and IFBench. It is compatible with mainstream agent frameworks including Claude Code, OpenClaw, and OpenCode. OpenRouter has launched a free API; self-hosted deployment requires at least 8 GPUs and supports both SGLang and vLLM.
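For a quick test without local hardware, the model can be called through OpenRouter's OpenAI-compatible endpoint. The sketch below uses the standard chat-completions API; the model slug `inclusionai/ling-2.6-1t` is a guess based on this article's naming, so check OpenRouter's model list for the actual identifier:

```python
# Calling Ling-2.6-1T through OpenRouter's OpenAI-compatible API.
# NOTE: the model slug below is assumed from this article's naming;
# verify the real identifier at https://openrouter.ai/models before use.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",   # OpenRouter's standard endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],  # your OpenRouter API key
)

response = client.chat.completions.create(
    model="inclusionai/ling-2.6-1t",           # assumed slug; verify on OpenRouter
    messages=[
        {"role": "user", "content": "Summarize the MoE architecture in two sentences."}
    ],
)
print(response.choices[0].message.content)
```

For local serving on the 8-GPU setup the article mentions, running the model under vLLM or SGLang with tensor parallelism across all eight devices would be the usual route.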
