Bittensor Subnet Completes Largest-Scale LLM Pretraining in History, DeAI Narrative Returns

Crypto World News reports that, according to official sources, the Bittensor subnet Templar (SN3) completed pretraining of Covenant-72B, the largest decentralized LLM training run in history, on March 10. Community supporters believe this event proves that Bittensor is not just a “concept coin,” but decentralized infrastructure capable of producing top-tier AI models.

Covenant-72B is a 72-billion-parameter language model pretrained by the Templar team on Bittensor Subnet 3, trained entirely over the public internet without centralized data centers. The model scored 67.1 on MMLU (zero-shot), surpassing centralized baseline models such as LLaMA-2-70B and LLM360 K2 under the same evaluation conditions. It is the largest fully permissionless, collaboratively trained language model to date, with over 70 different nodes contributing compute throughout the training run. The team has released all weights and checkpoints under the Apache license.

Following this news, Bittensor (TAO) and its subnet tokens surged: TAO has risen 54.8% over the past two weeks, while the Templar subnet token has surged 194% over the past 7 days and currently trades at $19.3.
