Vitalik Buterin warns about the risks of advanced AI, arguing that AI could overtake humans as a "superior species."
"One of the ways AI could make the world worse is the worst possible: it could literally cause the extinction of humanity."
Vitalik cites a 2022 survey of more than 4,270 researchers, which put the risk of AI wiping out humanity at 5-10%.
"Even Mars may not be safe if superintelligent AI turns against humanity."
For Vitalik, AI is "fundamentally different" from other recent inventions because it can create a new kind of "mind" that may turn against human interests.
"AI is a new type of mind that is rapidly gaining intelligence and has a serious chance of surpassing humans' mental faculties and becoming the new umbrella species of the planet."
Vitalik also called for "active human intent" to steer AI in a direction beneficial to humanity.