xAI is reportedly training seven large language models simultaneously, with parameter scales spanning a wide range.

ME News Report, April 9 (UTC+8) — according to social media posts, xAI is currently training seven different large language models simultaneously, with parameter scales covering a wide range. These seven models are: Imagine V2, two variants with 1 trillion parameters, two variants with 1.5 trillion parameters, a 6-trillion-parameter model, and a 10-trillion-parameter model. The information was posted by Twitter user WesRoth and references the progress of xAI's Colossus 2 project. The report provides no further technical details, release dates, or performance metrics. (Source: InfoQ)
