GPT-5.5 was released just three weeks ago, yet GPT-5.6 has already surfaced in Codex, with its context length soaring to 1.5 million tokens — and multiple sources are leaking a June release.
According to Beating Monitoring, just three weeks after the release of GPT-5.5, its successor GPT-5.6 has already been run by external developers. Several developers, using ChatGPT Pro's OAuth authentication, successfully invoked the unreleased gpt-5.6 model inside the Codex environment. Probe tests show the context window has reached 1.5 million tokens, roughly a 43% increase over the GPT-5.5 API's 1.05 million tokens.
The earliest trace of GPT-5.6 appeared on April 28, when developer Haider, reviewing Codex routing logs, found that most calls pointed to gpt-5.5 but one entry explicitly mapped to gpt-5.6. Because the entry quickly disappeared, he initially dismissed it as a canary test or a bug.
Noticeable changes began this week, however. Some developers found that specifying gpt-5.6 last week still returned a "model is not supported" error, but this week the request went through directly using Pro's OAuth. More important is the jump in context window size: the GPT-5.5 API allows 1.05M tokens (only 400K via Codex OAuth), while probes of GPT-5.6 reached 1.5M tokens directly, roughly 1.4 times the API limit. Developer tests in OpenCode also confirmed that the model still responds normally above 900K tokens, and that requests over 1.05M tokens are accepted. In conversation the model reports running on openai/gpt-5.6, with the inference level set to xhigh and a very fast "fast" mode available.
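The probing described above boils down to finding the largest prompt the endpoint will accept before it rejects the request for exceeding the context window. A minimal sketch of that idea, assuming an `accepts(n)` predicate that sends a roughly n-token prompt and reports whether it was accepted (the endpoint, limits, and predicate here are illustrative assumptions, not confirmed details of the developers' actual tooling):

```python
def probe_context_limit(accepts, lo=1, hi=2_000_000):
    """Binary-search the largest token count n for which accepts(n) is True.

    `accepts(n)` is a hypothetical callable that would send a prompt of
    roughly n tokens to the API and return True unless the request is
    rejected for exceeding the model's context window.
    """
    if not accepts(lo):
        return 0  # even the smallest probe is rejected
    while lo < hi:
        mid = (lo + hi + 1) // 2  # bias upward so the loop terminates
        if accepts(mid):
            lo = mid  # mid tokens fit; the limit is at least mid
        else:
            hi = mid - 1  # mid tokens rejected; the limit is below mid
    return lo

# Mock predicate simulating a model that rejects prompts above 1.5M tokens.
LIMIT = 1_500_000
print(probe_context_limit(lambda n: n <= LIMIT))  # 1500000
```

With ~21 probes this pins down a limit anywhere under 2M tokens, which is why a handful of test requests is enough to distinguish a 1.05M window from a 1.5M one.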
Blogger Leo stated today that development of GPT-5.6 is fully under way: the first batch of checkpoints entered internal testing in recent days, release is expected next month, and two internal codenames, ember-alpha and beacon-alpha, have surfaced.
Haider analyzed OpenAI's iteration pace: from annual updates to every six months, then three months, then two, and now 30-45 days. On that basis he expects GPT-5.6 in early June. He also predicts that GPT-5.6 will surpass Mythos on several benchmarks where GPT-5.5 still lags, reasoning that GPT-5.5 is already very close and that further reinforcement learning can widen the gap, since OpenAI's RL cycles in coding, mathematics, and scientific research are stronger.
Polymarket currently estimates an approximately 85% probability that GPT-5.6 will be released before June 30.