CT slept on the biggest decentralized AI paper of 2025


Jensen goes on All-In, CT apes into TAO
but the real work was already done:
the breakthrough in decentralized AI training happened 9 months ago
In June 2025, @0G_labs published a paper on arXiv called DiLoCoX
a framework that trained a 107B parameter model across 20 decentralized nodes on standard 1 Gbps internet
result:
3,728 tokens/sec vs 10.4 for AllReduce
that's a 357x improvement in communication efficiency
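That efficiency gap comes from how the DiLoCo family (which DiLoCoX extends) trains: each node runs many local optimizer steps and only syncs occasionally, instead of exchanging gradients every step like AllReduce. A toy sketch of that idea — hypothetical objective, learning rate, and node count, not the paper's actual setup:

```python
import random

# Toy sketch of DiLoCo-style low-communication training.
# Each worker minimizes f(w) = (w - 3)^2 locally and workers only
# exchange parameters every `local_steps` steps, instead of every step.

def local_sgd_round(w, lr=0.1, local_steps=50):
    """Run local gradient steps on f(w) = (w - 3)^2 with zero communication."""
    for _ in range(local_steps):
        grad = 2 * (w - 3.0)
        w -= lr * grad
    return w

def train(num_workers=4, rounds=5, local_steps=50):
    workers = [random.uniform(-10, 10) for _ in range(num_workers)]
    comms = 0
    for _ in range(rounds):
        # Every worker trains independently for `local_steps` steps...
        workers = [local_sgd_round(w, local_steps=local_steps) for w in workers]
        # ...then one parameter sync counts as a single communication round.
        avg = sum(workers) / len(workers)
        workers = [avg] * num_workers
        comms += 1
    return workers[0], comms

w, comms = train()
total_steps = 5 * 50
print(f"final w={w:.4f}, sync rounds={comms} vs {total_steps} for per-step AllReduce")
```

Here 250 optimizer steps cost only 5 communication rounds; per-step AllReduce would cost 250. That 50x reduction in sync frequency is the same lever DiLoCoX pulls (plus pipeline parallelism and gradient compression) to make 1 Gbps links viable.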
that isn't just a framework validation, it's the only public benchmark at this efficiency level
and basically no one covered it
- Bittensor's Covenant-72B is one trained model
- DiLoCoX is a framework for training any model on decentralized infrastructure, a completely different category
behind it is a full stack: Compute, Storage, DA, and Chain. No other project ships all four layers
Jensen just validated the thesis 0G Labs already proved nine months ago
April 1st they're speaking at EthCC Cannes. The next chapter is coming