vkhosla introduces the 1-bit Bonsai 8B model, with memory usage of only 1.15GB.


ME News, April 1 (UTC+8) — the author vkhosla recently introduced a 1-bit weighted model called 1-bit Bonsai 8B on social media. According to the post, the model's memory footprint is only 1.15 GB, and its intelligence density is claimed to be more than 10 times that of its full-precision counterpart. The post states that on edge hardware the model is 14 times smaller, 8 times faster, and 5 times more energy-efficient, while remaining competitive with other models. It also claims better compression, speed, size, and energy efficiency than the model Google announced the previous day. The original post does not provide specific performance benchmarks, the publisher, technical implementation details, or complete evaluation results. (Source: InfoQ)
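The 1.15 GB figure is roughly consistent with storing one bit per weight for an 8-billion-parameter model. A back-of-envelope sketch (assuming "8B" means about 8×10⁹ weights; the breakdown of the remaining overhead is an assumption, not detailed in the original post):

```python
# Back-of-envelope memory estimate for a 1-bit 8B-parameter model.
# Assumption: "8B" = ~8e9 weights; overhead breakdown is not from the source.
params = 8e9

one_bit_gb = params / 8 / 1e9       # 1 bit per weight, in decimal GB
print(one_bit_gb)                   # -> 1.0 GB for the weights alone
# The reported 1.15 GB footprint would leave ~0.15 GB for
# higher-precision components (e.g. embeddings or scaling factors),
# which 1-bit quantization schemes typically keep at full precision.

fp16_gb = params * 2 / 1e9          # a full-precision fp16 baseline
print(fp16_gb / one_bit_gb)         # -> 16.0x smaller than fp16 in theory
```

The theoretical 16× reduction versus an fp16 baseline is close to, but not identical to, the 14× size reduction quoted in the post; the gap is plausibly that same higher-precision overhead.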
