The AI pie is growing so fast it’s easy to miss trends within the trend
One worth watching:
Serious products are stacking proprietary RL on top of open-weight bases instead of just routing to frontier APIs
They're bringing post-training in-house and getting better results than an off-the-shelf frontier model gives them
Cursor’s Composer 2 is the best example
25% of the compute is the Kimi base, 75% is Cursor's own RL on their coding trajectories
Many others are doing this quietly, and many more will go this route
The base model is the commodity and the post-training is the moat
The catch is that post-training is expensive, so few companies can afford it
Companies that make this process more affordable and accessible will have an insanely valuable offering