Elon Musk says Colossus 2 is training 7 models simultaneously, the largest at 10 trillion parameters.

AIMPACT News, April 8 (UTC+8) — SpaceX and xAI founder Elon Musk posted on X that the SpaceXAI supercomputing cluster Colossus 2 is currently training 7 models simultaneously:


  1. Imagine V2, the next-generation image and video generation model

  2. Two variants of a 1-trillion-parameter model

  3. Two variants of a 1.5-trillion-parameter model

  4. A 6-trillion-parameter model

  5. A 10-trillion-parameter model


He added, “There is still some catching up to do.” Multiple media outlets had previously reported that xAI’s next-generation flagship model, Grok 5, has approximately 6 trillion parameters, which matches the 6T entry on the list. The 10-trillion-parameter model had not been publicly reported before.


In the post, Musk referred to the entity formed by the merger of SpaceX and xAI as “SpaceXAI.” The two companies completed their merger in February this year, at a combined valuation of about $1.25 trillion. Colossus 2, which went online on January 17, is the world’s first gigawatt-level AI training cluster and is planned to be upgraded to 1.5 GW this month. (Source: X platform)
