Musk Claims Colossus 2 is Training 7 Models Simultaneously, with a Maximum of 10 Trillion Parameters


According to monitoring by 1M AI News, Elon Musk, founder of SpaceX and xAI, posted on X that the SpaceXAI supercomputing cluster Colossus 2 is currently training seven models simultaneously:

1. Imagine V2, the next-generation image and video generation model;
2. Two variants of a 1 trillion parameter model;
3. Two variants of a 1.5 trillion parameter model;
4. A 6 trillion parameter model;
5. A 10 trillion parameter model.

He added, “Some catching up to do.” Several media outlets had previously reported that xAI’s next-generation flagship model Grok has a parameter scale of about 6 trillion, which aligns with the 6 trillion parameter model listed above; the 10 trillion parameter model had not been publicly reported before. In his post, Musk referred to the merged entity as “SpaceXAI.” The two companies completed their merger in February of this year, with a post-merger valuation of approximately $1.25 trillion. Colossus 2 came online on January 17 and is the world’s first gigawatt-level AI training cluster, with plans to upgrade it to 1.5 GW this month.
