Deepfake videos with artificial backgrounds: a new weapon of North Korean cybercriminals against the crypto industry
Cyber threats have reached a new level: the North Korea-linked Lazarus Group, specifically its BlueNoroff subgroup, is actively employing advanced artificial intelligence to stage fake video calls with realistic backgrounds. These attacks target crypto industry professionals and demonstrate how dangerous the combination of deepfake technology and social engineering has become.
Deception Tactics: Fake Background Videos as Manipulation Tools
According to research firm Odaily, the hackers initiate video calls through compromised Telegram accounts, using AI-generated videos with fabricated backgrounds to impersonate people the victim trusts. BTC Prague co-founder Martin Kuhar shared alarming details: the attackers persuade users to install malicious software disguised as a Zoom plugin, supposedly needed to fix audio issues. Realistic video with a plausible background significantly increases the victim's trust, making the social engineering far more effective.
Technical Arsenal: What Happens After Installing the Malware
Once the victim installs the disguised malware, the attackers gain full control over the device. Malicious scripts carry out a multi-stage infection on macOS, deploying backdoors and keyloggers and intercepting clipboard contents. Particularly dangerous is that the malware can access encrypted wallets and users' private keys.
Security company Huntress has identified that this methodology is closely linked to previous operations by the same group targeting cryptocurrency developers. Researchers from SlowMist confirmed that the attacks demonstrate a clear pattern of reusing tactics with adaptations for specific crypto wallets and targeted professionals.
Why Deepfake Video Is a New Threat to Identity Verification
The proliferation of technology for creating fake videos and voice cloning fundamentally changes the digital security landscape. Traditional methods of verifying identity via video calls are now unreliable, as backgrounds, facial expressions, and voices can be perfectly simulated by AI. This presents a fundamental challenge for the industry.
How to Protect Crypto Assets: Practical Recommendations
Crypto professionals must immediately strengthen their cybersecurity measures. It is essential to implement multi-factor authentication, use security keys instead of SMS codes, and most importantly, conduct out-of-band verification of any video call requests through alternative communication channels. Additionally, avoid installing software recommended by strangers, even if the video call looks convincing with a proper background. Vigilance and awareness remain the best defenses against the growing threat of deepfake attacks.
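One concrete way to act on the "don't install software recommended by strangers" advice is to download tools only from the vendor's official site and compare the file's cryptographic hash against the value the vendor publishes. The sketch below illustrates that check; the function name and the sample values in the usage note are illustrative assumptions, not part of any specific vendor's process.

```python
import hashlib

def verify_download(path: str, expected_sha256: str) -> bool:
    """Return True only if the file's SHA-256 digest matches the
    value published on the vendor's official download page.
    Both the path and the expected digest are supplied by the user;
    nothing here is specific to any real installer."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large installers don't load fully into memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest() == expected_sha256.lower()
```

If the digest does not match the published value, the file should be deleted rather than installed, no matter how convincing the video call that recommended it was.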