399 Yuan “AI Resurrection” of Loved Ones: What Legal Risks Are Hidden?
Experts warn that merchants who pass off rough, templated videos as offering “permanent companionship,” or who conceal data-leakage risks, may be committing fraud.
With the popularization of generative artificial intelligence technology, the mourning method of “digital resurrection” has become trendy online, but behind it lie multiple legal and ethical red lines such as infringement of portrait rights, data security risks, and consumer fraud.
On the eve of Qingming Festival, a Workers’ Daily reporter found that “digital resurrection” is shifting from pure emotional expression to commercialization: netizens create AI memorial videos out of nostalgia, while e-commerce merchants sell customized services at clearly marked prices. Amid this rapid growth, problems such as rights infringement, commercial fraud, and data breaches occur frequently, infringing on the personality rights of the deceased and the legitimate rights of consumers, and even opening the door to telecom and internet scams.
Legal Protection for the Personality Rights of the Deceased
Recently, well-known education blogger Zhang Xuefeng passed away. Some netizens, out of remembrance, created AI memorial videos using his live broadcast footage, which were circulated online. There are also malicious actors who stole his portrait for false advertising. On March 24, Zhang Xuefeng’s affiliated company, Suzhou Fengyue Wanjuan Cultural Books Co., Ltd., issued an announcement to revoke all previously issued “portrait usage authorization letters” and demanded that partners remove all promotional materials and short videos using Zhang Xuefeng’s portrait, video clips, name, and related images within 24 hours.
In fact, controversy over “digital resurrection” of deceased public figures is not new. In October 2025, a well-known tea scholar who had passed away was “resurrected” using AI technology by a tea company to produce a commercial endorsement video. Although the company claimed to have obtained authorization from his son for promoting tea culture, the widow of the deceased explicitly opposed it, believing that the video used her late husband’s image for commercial promotion, which was a “distortion” and “insult.”
Earlier, some deceased celebrities also faced similar situations. Netizens, without family consent, used their pre-death stage performances, interview clips, private footage, and other materials to “resurrect” them via AI and produce related videos, which were widely circulated on short video platforms, sparking strong opposition from their families. Some family members publicly stated that such AI videos without authorization are disrespectful to the deceased and demanded the platforms remove all related content immediately.
Professor Shi Jiayou from Renmin University Law School pointed out that the personality rights of the deceased are protected by law, and infringement is not limited to commercial use. Even if AI resurrection videos are created and publicly shared for mourning purposes without close relatives’ consent, they may still infringe on the portrait rights and reputation rights of the deceased.
Merchants Targeting “Longing Marketing” for the Elderly
Ms. Li Huairong (a pseudonym) once commissioned an e-commerce merchant to produce a “digital resurrection” video of her late idol Leslie Cheung. “It was just because I missed him and wanted to hear him talk.” The merchant offered a customized service called “Photo Speaking” priced at 49 yuan: she only needed to provide one recognizable photo and a 15-second reference audio clip, and the merchant could make the person in the photo “speak” up to 50 words. Throughout the transaction, the merchant never mentioned any legal risks or explained how her data would be handled afterward.
The reporter noticed that on multiple e-commerce platforms, products labeled “AI resurrection,” “making photos speak,” or “AI digital person” ranged in price from 10 yuan to several thousand yuan, with quality varying greatly. Some merchants exploit emotional longing to market their products, turning sincere feelings into commodities, sometimes inducing overconsumption or even committing outright fraud.
Li Huairong later purchased another service from a different shop for 399 yuan to “resurrect” her mother. The merchant claimed it could make her mother appear “like she was in front of me, able to interact and video call.” But the final video was little different from the 49-yuan one: the person in the photo simply moved her mouth and spoke for longer, one minute instead of 15 seconds, and it did not resemble her mother at all, leaving her feeling the money was wasted.
Some merchants also deliberately target elderly people with weaker discernment, using cheap, low-end generative AI to turn a profit.
Professor Shi Jiayou stated that such “longing marketing” aimed at the elderly can, under certain circumstances, constitute consumer fraud. If merchants exaggerate rough templated videos as delivering “permanent companionship,” conceal the risks of data leakage, or provide seriously flawed emotional-comfort services, leading the elderly to purchase based on false perceptions, that constitutes fraud. He recommends that market regulators establish algorithm-ethics and data-source filing systems for providers of “digital emotional services,” conduct special inspections in venues where the elderly gather, such as senior communities and funeral-service providers, and crack down on scripted sales inducement and unfair contract terms.
Weak Regulation of Small-Workshop “AI Resurrection”
Beyond infringement disputes, the biological information such as photos and voices uploaded by users for “resurrecting” loved ones faces high risks of leakage and misuse. If merchants illegally buy and sell this sensitive data, it can easily flow into black and gray markets, potentially being exploited by criminals for AI face-swapping, voice-changing, and scams. According to reports and typical cases from anti-fraud departments in various regions, criminals have illegally obtained the deceased’s voice and photos to fabricate stories like debts left behind or “dream messages” for transfers, or tricked the elderly into trust through virtual images, leading to emotional and financial harm.
Ms. Li Huairong now recalls the process of using the “resurrection” service with fear: “The merchant didn’t tell me whether they would delete these photos and audio afterward, and I don’t know how my data was finally handled.”
Professor Shi Jiayou emphasized that the data users upload, such as photos and recordings of the deceased, contains sensitive personal information, and the consequences of misuse are severe. Under personal information protection law, platforms must obtain consent from the deceased’s close relatives before processing such information, and relatives have the right to access, copy, correct, or delete it. Platforms should adopt encryption and anonymization measures to prevent leaks, and after an “AI resurrection” service ends they must promptly delete the data and must not retain it for training other models. While existing rules such as the Provisions on the Administration of Deep Synthesis of Internet Information Services impose strong constraints on large platforms, there is still no effective supervision or traceability mechanism for the many individual developers and small workshops offering “AI resurrection” services on e-commerce platforms.
For systematic governance of “digital resurrection” services, Professor Shi suggests building a multi-layered system combining a legal bottom line, technical standards, industry self-discipline, and ethical review. At the private-law level, citizens could arrange their “digital persona” in advance through legally binding digital wills. At the public-law level, models providing “AI resurrection” should undergo safety assessments before deployment, and all AI-generated content must be clearly marked as “Generated by Artificial Intelligence.” Within the industry, ethical conventions should establish “non-commercial use” and “non-public dissemination” as bottom-line red lines, and digital traceability technology should ensure that infringing videos can be tracked.
Source: Workers’ Daily