Have you ever stopped to think about how profitable the apocalypse narrative has become? I've been watching Sam Altman, and I see a pattern I can't ignore.



In 2016, the guy was 31 years old, president of Y Combinator, and already prepared for the end of the world: go-bags, guns, gold, potassium iodide, even a plot of land in Big Sur he could fly to. It seemed paranoid. Ten years later, it looks like a business model.

What intrigues me most is how he turned collective anxiety into power. While warning that AI could destroy humanity, he accelerated the very process he warned about. He claimed not to care about money, yet built a personal investment empire. Sam Altman's net worth reached roughly 2 billion dollars, not from direct shares in OpenAI but from a network of strategic bets: Stripe, Reddit, nuclear fusion. Every sermon about humanity's future injected value into that empire.

The tactic is brilliant, actually. He sells a complete package: the fear of extinction by AI plus the promise of redemption. His testimony before Congress, telling lawmakers that people "should be glad" to be afraid? Free advertising for OpenAI. Then comes Worldcoin with its iris-scanning Orb, the perfect solution once panic is already installed in people's minds.

But the most fascinating part is how he uses regulation as a weapon. When OpenAI was technically ahead, he called for heavy regulation that would block competitors. Once Google and Anthropic caught up, strict regulation suddenly became "disastrous" for innovation. Regulation isn't a conviction for him: it's a shield when you're winning, an obstacle when you're losing.

In November 2023, the board fired him for not being "consistently candid." Within five days, some 95% of employees had signed a letter demanding his return. That's not rational loyalty; it's religious faith. Sam Altman's personal net worth probably grew even more after that, because the market rewards those who can keep followers in a trance.

And here's what really makes me think: he's not an isolated case. Musk does the same, warning that AI is "summoning the demon" while building large-scale robotics. Zuckerberg spent 90 billion on the metaverse, failed, and pivoted to AGI with the same salvific narrative. Peter Thiel built surveillance tools for governments while preparing for the apocalypse in New Zealand.

They're all prophets off Silicon Valley's production line. And the trick works because it hits three vulnerable points: first, it creates a fear you can't ignore; second, it monopolizes the explanation of that fear; third, it turns the "mission for humanity" into a religion that disables critical thinking.

Now look at the irony: in February this year, he publicly declared he wouldn't support using AI in war. Weeks later, he signed a contract with the Pentagon. It's not hypocrisy; it's the business model working perfectly. Moral postures are part of the product. Military contracts are the real profit.

Sam Altman's net worth of 2 billion didn't come from a symbolic salary; it came from understanding that the best way to profit from the future is to convince people you're the only one who sees it. Survival kit, apocalyptic shelter, investment empire: all consistent with the same logic of securing the winner's position in an uncertain future you yourself are accelerating.

And the most disturbing part? This model doesn't depend on lies. It depends on a very precise understanding of how the human mind works under fear: create a panic people can't ignore, monopolize the explanation, turn it into a sacred mission. Perfect.

The truly dangerous thing has never been the technology. It’s always been those who claim the right to define humanity’s destiny — and manage to profit from doing so.