Agent Capability Close Behind Opus 4.6 at Just 4% of the Price: Arcee Open-Sources Trinity Large Thinking
According to 1M AI News monitoring, the U.S. AI model company Arcee has released Trinity-Large-Thinking, an open-source reasoning model built for long-running Agent tasks. The model uses a sparse Mixture-of-Experts (MoE) architecture with 400B total parameters, of which only 13B are activated per token. The weights are available for download on Hugging Face under the Apache 2.0 license.
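The "400B total, 13B activated" split reflects how sparse MoE layers work: a router picks a small top-k subset of experts for each token, so most parameters sit idle on any given forward pass. A toy sketch (illustrative only; Arcee has not published Trinity's internals at this level of detail, and all sizes below are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes: 16 experts total, 2 activated per token.
d_model, n_experts, top_k = 8, 16, 2
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]
router_w = rng.standard_normal((d_model, n_experts))

def moe_forward(x):
    """Route one token vector through the top-k experts, weighted by softmax gates."""
    logits = x @ router_w
    top = np.argsort(logits)[-top_k:]      # indices of the k highest-scoring experts
    gates = np.exp(logits[top] - logits[top].max())
    gates /= gates.sum()                   # softmax over the selected experts only
    return sum(g * (x @ experts[i]) for g, i in zip(gates, top))

token = rng.standard_normal(d_model)
out = moe_forward(token)
print(out.shape)  # (8,) -- only 2 of the 16 experts ran for this token
```

The payoff is the same as advertised for Trinity: per-token compute and memory bandwidth scale with the activated parameters (here 2 of 16 experts), not the total parameter count.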
Unlike its predecessor Trinity-Large-Preview (pure instruction fine-tuning), Trinity-Large-Thinking produces an explicit reasoning trace before answering. It also improves on multi-turn tool calling, long-context coherence, and instruction following. The core design goal is stable output across long-running Agent loops.
On PinchBench, an Agent capability benchmark developed by Kilo, it scores 91.9, ranking second only to Opus 4.6's 93.3. On the Agent task benchmark Tau2-Airline, it scores 88.0, the highest among all compared models. Its performance on general reasoning benchmarks is more middling: 76.3 on GPQA-D, below Kimi-K2.5 (86.9) and Opus 4.6 (89.2), and 83.4 on MMLU-Pro, again at the bottom of the comparison. Per Arcee's official description, the model is "the strongest open-source model outside China in many dimensions."
Arcee's API pricing is $0.90 per million output tokens, which Arcee says is about 96% cheaper than Opus 4.6. The model also launched simultaneously on the AI model routing platform OpenRouter, where it is free to use via OpenClaw for the first 5 days. Since the end of January, the predecessor Preview has already served more than 3.37 trillion tokens on OpenRouter; it ranks #1 among open-source models by U.S. usage recorded on OpenClaw and #4 worldwide. Preview will continue to be offered free on OpenRouter.
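The "about 96% cheaper" claim matches the headline's "4% of the price" framing, and together with the $0.90 figure it implies a comparison price for Opus 4.6. A quick sanity check of that arithmetic (the implied Opus 4.6 price is inferred from the article's two numbers, not an official figure):

```python
# Article's figures: $0.90 per 1M output tokens, "about 96% cheaper" than Opus 4.6.
arcee_price = 0.90          # USD per 1M output tokens
discount = 0.96             # "96% cheaper" == paying 4% of the reference price

# Implied reference price: arcee_price / (1 - discount)
implied_opus_price = arcee_price / (1 - discount)
print(round(implied_opus_price, 2))  # 22.5 -- i.e. ~$22.50 per 1M output tokens
```

So the two claims are internally consistent: $0.90 is 4% of roughly $22.50 per million output tokens.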