Recently, a discussion has been flooding the feed: within 5-10 years, humans may no longer need phones and apps, as AI directly takes over all interactions. At first glance, it sounds a bit sci-fi, but upon reflection, it’s not entirely impossible.
This reminded me of a casual chat I had with a few friends some time ago. One said AI is developing at an almost surreal pace: what used to be a high-barrier, scarce skill is becoming so accessible that "learning AI" itself matters less and less. His conclusion was blunt: "No need to learn it deliberately, just wait. Once it matures, AI will do everything itself."
Another friend took the completely opposite view. He said we must keep a close eye on AI because most of the capital is pouring into this area. "This isn’t just a trend," he emphasized, "this is the main quest."
What’s more interesting is that not everyone is excited about AI’s arrival; many people are genuinely afraid. The fear is quite specific: AI has already begun to replace certain jobs, and the anxiety among programmers is palpable. Now you only need to describe what you want, or even just "vibe code," and AI can write the kind of scripts that once only professional engineers could produce. The barrier posed by tools is being flattened fast.
Thus, two logics begin to compete: to "wait for it to mature," or to "evolve with it."
You often hear that "AI won’t replace humans." A more accurate version is: AI won’t replace you, but people who use AI will replace those who don’t. When tools no longer form a barrier and everyone stands at the same starting line, the deciding factor is no longer whether you know how to use the tools, but what lies beyond them: your judgment, aesthetic sense, expression, worldview, and the way you break down problems.
AI doesn’t create from nothing; it only amplifies, without limit, what you already possess.
So the real question might not be "whether to learn AI," but: when everyone can use AI, who are you? What do you want to amplify? What is your irreplaceable "core"? This is the most genuine question that the AI era leaves for everyone.
ConfusedWhale
· 4h ago
The core point is that AI ultimately depends on the person using it: no matter how skilled you are with the tools, it’s useless if you don’t know what you want.
WhaleWatcher
· 4h ago
Really, everyone is now competing with AI, but I find myself wondering when I'll get tired of the competition.
What's the point of panicking if you can't use AI? After all, most people don't really have much of a "core" anyway haha.
The key is to have your own stuff; AI just amplifies what you already have.
Whether to wait for it to mature or to use it now, ultimately there's no escape.
By the way, if phones really disappear, would we who are glued to screens be liberated or unemployed?
GasGuzzler
· 4h ago
Basically, it's about competition. If you don't compete now, you'll be overwhelmed by AI in the future.
MoneyBurner
· 4h ago
In simple terms, a leveled playing field is the biggest opportunity. Build positions in AI-related plays now and wait for the day when working with AI is a job requirement for everyone; that’s when the profits get realized. I bet this wave can run five times over.
Tools created out of thin air are nonsense. Honestly, it all comes down to who holds a "core" they can arbitrage with, and whether judgment can be converted into returns.
Instead of worrying about whether to learn or not, ask yourself: are you willing to go all-in on this direction?
No need to wait for maturity. Now is the time to buy the dip; if you wait too long, the bottom price will be gone.
This wave of rhetoric is a bit hollow. The real divide isn’t in AI itself, but in who can buy the dip fastest and whose capital chain is the most solid.
I'm just worried that those who think "waiting is fine" will, when artificial intelligence truly becomes omnipotent, become the easiest group to replace.
It's interesting, but I still stick to my logic: the more people say there's no need to learn, the more you should secretly build positions. When the tokens surge, they will regret it.
The rhetoric sounds good, but it lacks on-chain data support. Airdrop opportunities have been played out by these people. Do you still believe in an "indispensable core"?
It sounds like a motivational speech, but business thinking tells me that the real answer is hidden on the other side of risk hedging.